If Ax=λx for some vector x, then λ is an eigenvalue of A.
False. The equation Ax = λx must have a nontrivial solution.
A matrix A is not invertible if and only if 0 is an eigenvalue of A.
True. See the paragraph after Example 5.
A number c is an eigenvalue of A if and only if the equation (A-cI)x=0 has a nontrivial solution.
True. See the discussion of equation (3).
Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy.
True. See Example 2 and the paragraph preceding it.
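As a quick sketch of why checking is easy (the matrix and vector here are made up, not from the text): compute Av and see whether it is a scalar multiple of v.

```python
import numpy as np

# Hypothetical example matrix and candidate vector.
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])
v = np.array([2.0, 1.0])

Av = A @ v
lam = Av[0] / v[0]                       # candidate eigenvalue from the first entry
is_eigenvector = np.allclose(Av, lam * v)  # true iff Av is a multiple of v
```

Here Av = (6, 3) = 3v, so v is an eigenvector for the eigenvalue 3.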
To find the eigenvalues of A, reduce A to echelon form.
False. See the warning after Example 3.
If Ax=λx for some scalar λ, then x is an eigenvector of A.
False. The vector x in Ax = λx must be nonzero.
If v_{1} and v_{2} are linearly independent eigenvectors, then they correspond to distinct eigenvalues.
False. See Example 4 for a two-dimensional eigenspace, which contains two linearly independent eigenvectors corresponding to the same eigenvalue. The statement given is not at all the same as Theorem 2. In fact, it is the converse of Theorem 2 (for the case r = 2).
A steady-state vector for a stochastic matrix is actually an eigenvector.
True. See the paragraph after Example 1.
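To illustrate (with a made-up 2x2 stochastic matrix, not one from the text): a steady-state vector q satisfies Pq = q, which is exactly the eigenvector equation for the eigenvalue 1.

```python
import numpy as np

# Hypothetical 2x2 stochastic matrix (each column sums to 1).
P = np.array([[0.9, 0.3],
              [0.1, 0.7]])

# Solve (P - I)q = 0 directly for this 2x2 case.
M = P - np.eye(2)
q = np.array([M[0, 1], -M[0, 0]])  # a nonzero solution of the homogeneous system
q = q / q.sum()                    # normalize so the entries sum to 1

assert np.allclose(P @ q, q)       # Pq = q: q is an eigenvector for eigenvalue 1
```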
The eigenvalues of a matrix are on its main diagonal.
False. Theorem 1 concerns a triangular matrix. See Examples 3 and 4 for counterexamples.
An eigenspace of A is a null space of a certain matrix.
True. See the paragraph following Example 3. The eigenspace of A corresponding to λ is the null space of the matrix A − λI.
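As a sketch (matrix and eigenvalue are made up for illustration): a vector in Nul(A − λI) can be read off from the SVD of A − λI, since right-singular vectors for zero singular values span the null space, and any such vector is an eigenvector for λ.

```python
import numpy as np

# Hypothetical triangular matrix with eigenvalue 2 on its diagonal.
A = np.array([[2.0, 3.0],
              [0.0, 5.0]])
lam = 2.0

M = A - lam * np.eye(2)
_, s, Vt = np.linalg.svd(M)
x = Vt[-1]                           # right-singular vector for the smallest singular value
assert np.allclose(M @ x, 0.0)       # x is in Nul(A - lam*I)
assert np.allclose(A @ x, lam * x)   # so x is an eigenvector for lam
```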
The determinant of A is the product of the diagonal entries in A.
False. See Example 1.
An elementary row operation on A does not change the determinant.
False. See Theorem 3.
(det A)(det B) = det AB
True. See Theorem 3.
If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A.
False. λ + 5 = 0 when λ = −5, so −5 (not 5) is an eigenvalue of A. See the solution of Example 4.
If A is 3x3, with columns a_{1}, a_{2}, a_{3}, then det A equals the volume of the parallelepiped determined by a_{1}, a_{2}, a_{3}.
False. See the paragraph before Theorem 3.
det A^{T} = (-1)det A.
False. See Theorem 3.
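Both determinant facts above (det AB = (det A)(det B), and det Aᵀ = det A rather than −det A) are easy to spot-check numerically; the random matrices here are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Multiplicativity: det(AB) = det(A) det(B).
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
# Transpose leaves the determinant unchanged: det(A^T) = det(A).
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))
```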
The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A.
True. See the paragraph before Example 4.
A row replacement operation of A does not change the eigenvalues.
False. See the warning after Theorem 4.
Inner Product
u·v = u^{T}v
aka dot product
Length
||v|| = √(v·v) = √(v_{1}^{2} + v_{2}^{2} + ... + v_{n}^{2})
||v||^{2} = v·v
orthogonal
u and v are orthogonal if u·v = 0
iff ||u+v||^{2} = ||u||^{2} + ||v||^{2}
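The Pythagorean characterization above can be sketched with two hand-picked orthogonal vectors (not taken from the text):

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])

assert np.isclose(u @ v, 0.0)             # orthogonal: dot product is zero
lhs = np.linalg.norm(u + v) ** 2          # ||u + v||^2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
assert np.isclose(lhs, rhs)               # Pythagorean theorem holds
```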
Not every orthogonal set in R^{n} is linearly independent.
True. But every orthogonal set of nonzero vectors is linearly independent. See Theorem 4.
If a set S = {u_{1}...u_{p}} has the property that u_{i}·u_{j} = 0 whenever i != j, then S is an orthonormal set.
False. To be orthonormal, the vectors in S must be unit vectors as well as being orthogonal to each other.
If the columns of an m x n matrix A are orthonormal, then the linear mapping x --> Ax preserves lengths.
True. See Theorem 7(a).
The orthogonal projection of y onto v is the same as the orthogonal projection of y onto cv whenever c != 0.
True. See the paragraph before Example 3.
An orthogonal matrix is invertible.
True. See the paragraph before Example 7.
Suppose W is a subspace of R^{n} spanned by n nonzero orthogonal vectors. Explain why W = R^{n}.
A set of n nonzero orthogonal vectors must be linearly independent by Theorem 4, so if such a set spans W it is a basis for W. Thus W is an n-dimensional subspace of R^{n}, and W = R^{n}.
Orthogonal Basis
For a subspace W of R^{n}: a basis for W that is also an orthogonal set.
c_{1} = (y·u_{1})/(u_{1}·u_{1})
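The weight formula above recovers the coordinates of y relative to an orthogonal basis; a small sketch with made-up basis vectors:

```python
import numpy as np

# Orthogonal basis for a plane in R^3 (hypothetical vectors).
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert np.isclose(u1 @ u2, 0.0)

y = np.array([3.0, 1.0, 0.0])             # y lies in Span{u1, u2}
c1 = (y @ u1) / (u1 @ u1)                 # weight on u1
c2 = (y @ u2) / (u2 @ u2)                 # weight on u2
assert np.allclose(c1 * u1 + c2 * u2, y)  # the weights reconstruct y
```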
If z is orthogonal to u_{1} and to u_{2} and if W = Span{u_{1}, u_{2}}, then z must be in W^{⊥}.
True. See the calculations for z in Example 1 or the box after Example 6 in Section 6.1.
For each y and each subspace W, the vector y - proj_{W}y is orthogonal to W.
True. See the Orthogonal Decomposition Theorem.
The orthogonal projection ŷ of y onto a subspace W can sometimes depend on the orthogonal basis for W used to compute ŷ.
False. See the last paragraph in the proof of Theorem 8, or see the second paragraph after the statement of Theorem 9.
If y is in a subspace W, then the orthogonal projection of y onto W is y itself.
True. See the box before the Best Approximation Theorem.
If the columns of an n x p matrix U are orthonormal, then UU^{T}y is the orthogonal projection of y onto the column space of U.
True. Theorem 10 applies to the column space W of U because the columns of U are linearly independent and hence form a basis for W.
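A sketch of the UUᵀy projection formula, using a hand-built U with orthonormal columns (the numbers are assumptions for illustration):

```python
import numpy as np

# 3x2 matrix with orthonormal columns.
U = np.column_stack((np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0])))
y = np.array([2.0, 4.0, 5.0])

proj = U @ (U.T @ y)                       # orthogonal projection of y onto Col U
assert np.allclose(U.T @ (y - proj), 0.0)  # residual is orthogonal to Col U
```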
Orthogonal Projection
ŷ = ((y·u) / (u·u))*u
The distance from y to L
is the length of the perpendicular line segment from y to the orthogonal projection ŷ.
||y-ŷ|| = √((y-ŷ)·(y-ŷ))
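Putting the projection and distance formulas together in a small numeric sketch (y and u are made-up values):

```python
import numpy as np

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])                 # direction of the line L

y_hat = ((y @ u) / (u @ u)) * u          # orthogonal projection of y onto L
dist = np.linalg.norm(y - y_hat)         # distance from y to L

assert np.isclose((y - y_hat) @ u, 0.0)  # residual is perpendicular to L
```

Here ŷ = (8, 4) and the distance is √5.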
An m x n matrix U has orthonormal columns if and only if
U^{T}U = I
Let U be an m x n matrix with orthonormal columns, and let x and y be in R^{n}
Then:
a. ||Ux|| = ||x||
b. (Ux)·(Uy) = x·y
c. (Ux)·(Uy) = 0 if and only if x·y = 0
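Properties (a) and (b) can be spot-checked with a hand-built U whose columns are orthonormal (all values here are assumptions for illustration):

```python
import numpy as np

U = np.column_stack((np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
                     np.array([0.0, 0.0, 1.0])))
assert np.allclose(U.T @ U, np.eye(2))   # orthonormal columns: U^T U = I

x = np.array([3.0, -1.0])
y = np.array([2.0, 5.0])
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))  # (a) lengths preserved
assert np.isclose((U @ x) @ (U @ y), x @ y)                  # (b) dot products preserved
```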
The general least-squares problem is to find an x that makes Ax as close as possible to b.
True. See the beginning of the section. The distance from Ax to b is ||Ax - b||.
A least-squares solution of Ax=b is a vector x̂ that satisfies Ax̂ = b̂, where b̂ is the orthogonal projection of b onto Col A.
True. See the comments about equation (1).
A least-squares solution of Ax=b is a vector x̂ such that ||b-Ax|| <= ||b-Ax̂|| for all x in R^{n}.
False. The inequality points in the wrong direction. See the definition of a least-squares solution.
Any solution of A^{T}Ax=A^{T}b is a least-squares solution of Ax=b.
True. See Theorem 13.
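A sketch of solving the normal equations for a small made-up overdetermined system, cross-checked against NumPy's own least-squares routine:

```python
import numpy as np

# Hypothetical overdetermined system: A is 4x2, b in R^4.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 1.0, 3.0])

# Any solution of the normal equations A^T A x = A^T b is a least-squares solution.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against numpy's least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_lstsq)

# The residual b - A x_hat is orthogonal to Col A.
assert np.allclose(A.T @ (b - A @ x_hat), 0.0)
```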
If the columns of A are linearly independent, then the equation Ax=b has exactly one least-squares solution.