Linear Algebra

  1. In order for a matrix B to be the inverse of A, both equations AB=I and BA=I must be true.
    True, by definition of invertible.
  2. If A and B are n x n and invertible, then A^-1 B^-1 is the inverse of AB.
    • False.
    • (AB)^-1 = B^-1 A^-1
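    A quick NumPy sketch (the particular matrices are arbitrary examples) checking that B^-1 A^-1, not A^-1 B^-1, inverts AB:

      import numpy as np

      A = np.array([[2.0, 1.0], [1.0, 1.0]])   # an invertible 2x2 matrix
      B = np.array([[1.0, 3.0], [0.0, 2.0]])   # another invertible 2x2 matrix
      AB = A @ B

      # B^-1 A^-1 applied to AB gives the identity ...
      print(np.allclose(np.linalg.inv(B) @ np.linalg.inv(A) @ AB, np.eye(2)))  # True
      # ... while A^-1 B^-1 generally does not (it would only if A and B commuted).
      print(np.allclose(np.linalg.inv(A) @ np.linalg.inv(B) @ AB, np.eye(2)))  # False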
  3. If A = [a b / c d] and ab-cd != 0, then A is invertible.
    False. If A = [1 1 / 0 0], then ab - cd = 1 - 0 ≠ 0, but Theorem 4 shows that this matrix is not invertible, because ad - bc = 0.
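    A small NumPy check of that counterexample, showing that ad - bc (not ab - cd) is the quantity that decides invertibility:

      import numpy as np

      A = np.array([[1.0, 1.0], [0.0, 0.0]])   # the counterexample from the card
      a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]

      print(a * b - c * d)             # 1.0 (nonzero, but irrelevant)
      print(a * d - b * c)             # 0.0 (the real test: det A = ad - bc)
      print(np.linalg.matrix_rank(A))  # 1, so A is not invertible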
  4. Each elementary matrix is invertible.
    True
  5. If the equation Ax=0 has only the trivial solution, then A is row equivalent to the n x n identity matrix.
    True, by the IMT. If statement (d) of the IMT is true, then so is statement (b).
  6. If the columns of A span Rn, then the columns are linearly independent.
    True. If statement (h) of the IMT is true, then so is statement (e).
  7. If A is an n x n matrix, then the equation Ax=b has at least one solution for each b in Rn.
    False. Statement (g) of the IMT is true only for invertible matrices.
  8. If AT is not invertible, then A is not invertible.
    True, by the IMT. If AT is not invertible, then statement (l) of the IMT is false, and hence statement (a) must also be false.
  9. If the equation Ax=0 has a nontrivial solution, then A has fewer than n pivot positions.
    True, by the IMT. If the equation Ax = 0 has a nontrivial solution, then statement (d) of the IMT is false. In this case, all the lettered statements in the IMT are false, including statement (c), which means that A must have fewer than n pivot positions.
  10. If there is an n x n matrix D such that AD=I, then there is also an nxn matrix C such that CA=I.
    True. If statement (k) of the IMT is true, then so is statement (j).
  11. If the columns of A are linearly independent, then the columns of A span Rn.
    True. If statement (e) of the IMT is true, then so is statement (h).
  12. If the equation Ax=b has at least one solution for each b in Rn, then the solution is unique for each b.
    True. See the remark immediately following the proof of the IMT.
  13. If the linear transformation x-->Ax maps Rn into Rn, then A has n pivot positions.
    False. The first part of the statement is not part (i) of the IMT. In fact, if A is any n x n matrix, the linear transformation x-->Ax maps Rn into Rn, yet not every such matrix has n pivot positions.
  14. If C is 6x6 and the equation Cx=v is consistent for every v in R6, is it possible that for some v, the equation Cx=v has more than one solution?
    By (g) of the IMT, C is invertible. Hence, each equation Cx = v has a unique solution.
  15. If nxn matrices E and F have the property that EF=I, then E and F commute.
    By the box following the IMT, E and F are invertible and are inverses. So FE = I = EF, and so E and F commute.
  16. If the equation Gx=y has more than one solution for some y in Rn, can the columns of G span Rn?
    The matrix G cannot be invertible, so (h) of the IMT is false and the columns of G do not span Rn.
  17. If L is n x n and the equation Lx=0 has the trivial solution, do the columns of L span Rn?
    No conclusion about the columns of L may be drawn, because no information about L has been given. The equation Lx = 0 always has the trivial solution.
  18. The Invertible Matrix Theorem
    Let A be an n x n matrix. Then the following statements are either all true or all false; a small numerical check follows the list.

    • a. A is an invertible matrix.
    • b. A is row equivalent to the n x n identity matrix.
    • c. A has n pivot positions.
    • d. The equation Ax=0 has only the trivial solution.
    • e. The columns of A form a linearly independent set.
    • f. The linear transformation x--> Ax is one-to-one.
    • g. The equation Ax=b has at least one solution for each b in Rn.
    • h. The columns of A span Rn.
    • i. The linear transformation x--> Ax maps Rn onto Rn.
    • j. There is an nxn matrix C such that CA=I.
    • k. There is an nxn matrix D such that AD=I.
    • l. AT is an invertible matrix.
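    A rough NumPy illustration (example matrices chosen arbitrarily) of how the statements stand or fall together: an invertible matrix satisfies (a), (c), and (l) at once, while a singular one satisfies none of them.

      import numpy as np

      n = 3
      A = np.array([[2.0, 0.0, 1.0],
                    [1.0, 1.0, 0.0],
                    [0.0, 1.0, 3.0]])            # invertible: rank 3

      print(np.linalg.matrix_rank(A) == n)          # (c) n pivot positions   -> True
      print(not np.isclose(np.linalg.det(A), 0))    # (a) A is invertible     -> True
      print(np.linalg.matrix_rank(A.T) == n)        # (l) A^T is invertible   -> True

      S = np.array([[1.0, 2.0, 3.0],
                    [2.0, 4.0, 6.0],
                    [0.0, 1.0, 1.0]])            # singular: row 2 = 2 * row 1
      print(np.linalg.matrix_rank(S) == n)          # False -> every IMT statement fails for S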
  19. A subspace of Rn is any set H such that (i) the zero vector is in H, (ii) u, v, and u+v are in H, and (iii) c is a scalar and cu is in H.
    False. See the definition at the beginning of the section. The critical phrases “for each” are missing.
  20. If v1, ..., vp are in Rn, then Span{v1, ..., vp} is the same as the column space of the matrix [v1 ... vp].
    True. See the paragraph before Example 4.
  21. The set of all solutions of a system of m homogeneous equations in n unknowns is a subspace of Rm.
    False. See Theorem 12. The null space is a subspace of Rn, not Rm.
  22. Row operations do not affect linear dependence relations among the columns of a matrix.
    True. See the first part of the solution of Example 8.
  23. A subset H of Rn is a subspace if the zero vector is in H.
    False. See the definition at the beginning of the section. The condition about the zero vector is only one of the conditions for a subspace.
  24. Given vectors v1...,vp in Rn, the set of all linear combinations of these vectors is a subspace of Rn.
    True. See Example 3.
  25. The null space of an m x n matrix is a subspace of Rn.
    True. See Theorem 12.
  26. The column space of a matrix A is the set of solutions of Ax=b.
    False. See the paragraph after Example 4.
  27. If B is an echelon form of a matrix A, then the pivot columns of B form a basis for Col A.
    False. See the Warning that follows Theorem 13.
  28. Suppose a 3x5 matrix A has three pivot columns. Is Col A=R3? Is Nul A=R2?
    Col A = R3, because A has a pivot in each row and so the columns of A span R3. Nul A cannot equal R2, because Nul A is a subspace of R5. It is true, however, that Nul A is two-dimensional. Reason: the equation Ax = 0 has two free variables, because A has five columns and only three of them are pivot columns.
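    A sketch of this situation in NumPy/SciPy (assuming SciPy's null_space is available); the 3x5 matrix below is one arbitrary example with three pivot columns:

      import numpy as np
      from scipy.linalg import null_space

      A = np.array([[1.0, 0.0, 2.0, 0.0, 1.0],
                    [0.0, 1.0, 3.0, 0.0, 4.0],
                    [0.0, 0.0, 0.0, 1.0, 5.0]])   # 3x5 with three pivot columns

      print(np.linalg.matrix_rank(A))    # 3 -> the columns span R^3, so Col A = R^3
      print(null_space(A).shape[1])      # 2 -> Nul A is a 2-dimensional subspace of R^5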
  29. The columns of an invertible nxn matrix form a basis for Rn.
    True. See Example 5.
  30. An n x n determinant is defined by determinants of (n-1)x(n-1) submatrices.
    True. See the paragraph preceding the definition of the determinant.
  31. The (i,j)-cofactor of a matrix A is the matrix Aij obtained by deleting from A its ith row and jth column.
    False. See the definition of cofactor, which precedes Theorem 1.
  32. The cofactor expansion of det A down a column is the negative of the cofactor expansion along a row.
    False. See Theorem 1.
  33. The determinant of a triangular matrix is the sum of the entries on the main diagonal.
    False. See Theorem 2.
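    A quick NumPy check on an arbitrary upper triangular example, contrasting the product of the diagonal (the determinant) with the sum (the trace):

      import numpy as np

      T = np.array([[2.0, 5.0, 1.0],
                    [0.0, 3.0, 4.0],
                    [0.0, 0.0, 7.0]])   # upper triangular

      print(np.linalg.det(T))   # approximately 42.0 = 2*3*7, the product of the diagonal
      print(np.trace(T))        # 12.0 = 2+3+7, the sum, which is NOT the determinant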
  34. If B = {v1, ..., vp} is a basis for a subspace H and if x = c1v1 + ... + cpvp, then c1, ..., cp are the coordinates of x relative to the basis B.
    True. This is the definition of a B-coordinate vector.
  35. Each line in Rn is a one-dimensional subspace of Rn.
    False. Dimension is defined only for a subspace. A line must be through the origin in Rn to be a subspace of Rn.
  36. The dimension of Col A is the number of pivot columns of A.
    True. The sentence before Example 1 concludes that the number of pivot columns of A is the rank of A, which is the dimension of Col A by definition.
  37. The dimensions of Col A and Nul A add up to the number of columns of A.
    True. This is equivalent to the Rank Theorem because rank A is the dimension of Col A.
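    A small NumPy illustration of the Rank Theorem on an arbitrary 3x4 example:

      import numpy as np

      A = np.array([[1.0, 2.0, 3.0, 4.0],
                    [2.0, 4.0, 6.0, 8.0],
                    [1.0, 0.0, 1.0, 0.0]])   # 3x4, rank 2 (row 2 = 2 * row 1)

      rank = np.linalg.matrix_rank(A)        # dim Col A
      dim_nul = A.shape[1] - rank            # dim Nul A, by the Rank Theorem
      print(rank, dim_nul, rank + dim_nul)   # 2 2 4 -> adds up to the number of columns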
  38. If a set of p vectors spans a p-dimensional subspace H of Rn, then these vectors form a basis for H.
    True, by the Basis Theorem. In this case, the spanning set is automatically a linearly independent set.
  39. If B is a basis for a subspace H, then each vector in H can be written in only one way as a linear combination of the vectors in B.
    True. This fact is justified in the second paragraph of this section.
  40. If B = {v1, ..., vp} is a basis for a subspace H of Rn, then the correspondence x-->[x]B makes H look and act the same as Rp.
    True. See the second paragraph after Fig. 1.
  41. The dimension of Nul A is the number of variables in the equation Ax=0.
    False. The dimension of Nul A is the number of free variables in the equation Ax = 0.
  42. The dimension of the column space of A is rank A.
    True, by the definition of rank.
  43. If H is a p-dimensional subspace of Rn, then a linearly independent set of p vectors in H is a basis for H.
    True, by the Basis Theorem. In this case, the linearly independent set is automatically a spanning set.
  44. A row replacement operation does not affect the determinant of a matrix.
    True. See Theorem 3.
  45. The determinant of A is the product of the pivots in any echelon form U of A, multiplied by (-1)^r, where r is the number of row interchanges made during row reduction from A to U.
    True. See the paragraph following Example 2.
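    A rough sketch of this formula using SciPy's LU factorization (assuming SciPy is available); the permutation matrix P records the row interchanges, so det(P) = ±1 plays the role of (-1)^r, and the diagonal of U holds the pivots:

      import numpy as np
      from scipy.linalg import lu

      A = np.array([[0.0, 2.0, 1.0],
                    [1.0, 1.0, 0.0],
                    [2.0, 0.0, 3.0]])

      P, L, U = lu(A)                       # A = P L U, with L unit lower triangular
      sign = np.linalg.det(P)               # +1 or -1, the (-1)^r factor
      pivot_product = np.prod(np.diag(U))   # product of the pivots
      print(sign * pivot_product)           # approximately -8.0 ...
      print(np.linalg.det(A))               # ... matching det A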
  46. If the columns of A are linearly dependent, then det A=0.
    True. See the paragraph following Theorem 4.
  47. det(A+B)=det A+det B
    False. See the warning following Example 5.
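    A one-line counterexample in NumPy, taking A = B = I:

      import numpy as np

      A = np.eye(2)   # det A = 1
      B = np.eye(2)   # det B = 1

      print(np.linalg.det(A + B))                  # 4.0, since A + B = 2I
      print(np.linalg.det(A) + np.linalg.det(B))   # 2.0, so the two sides differ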
  48. If two row interchanges are made in succession, then the new determinant equals the old determinant.
    True. See Theorem 3; each interchange reverses the sign of the determinant, so two interchanges in succession cancel.
  49. The determinant of A is the product of the diagonal entries in A.
    False. This is true only if A is triangular; see Theorem 2.
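    A quick NumPy counterexample on a non-triangular matrix:

      import numpy as np

      A = np.array([[0.0, 1.0],
                    [1.0, 0.0]])    # not triangular

      print(np.prod(np.diag(A)))    # 0.0, the product of the diagonal entries
      print(np.linalg.det(A))       # -1.0, the actual determinant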
  50. If det A is zero, then two rows or two columns are the same, or a row or a column is zero.
    False. See Example 3.
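    A quick NumPy counterexample: the rows below are proportional but not equal, and no row or column is zero, yet the determinant vanishes:

      import numpy as np

      A = np.array([[1.0, 2.0],
                    [2.0, 4.0]])    # no repeated or zero rows/columns

      print(np.linalg.det(A))       # approximately 0.0, because the rows are proportional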
  51. det AT = (-1)det A
    False. See Theorem 5: det AT = det A.
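    A small NumPy check on an arbitrary 3x3 example that det AT equals det A, with no sign change:

      import numpy as np

      A = np.array([[1.0, 4.0, 2.0],
                    [0.0, 3.0, 5.0],
                    [2.0, 1.0, 1.0]])

      print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))   # True: det AT = det A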