-
Solutions of a System of Linear Equations
Every system of linear equations has either NO solutions, exactly ONE solution, or INFINITELY many solutions
-
Consistent
A system of equations (linear or not - over any space) is CONSISTENT if its solution set is NON-EMPTY (it has at least one solution)
-
Inconsistent
A system of equations (linear or not - over any space) is INCONSISTENT if its solution set is EMPTY (it has no solutions)
-
Methods for solving systems of equations
1. Algebraically eliminate variables
2. Simultaneous elimination
3. Gaussian elimination using matrices
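*A minimal sketch of method 3 in Python (numpy assumed; example coefficients chosen for illustration only):
    import numpy as np

    # Example system: x + 2y = 5, 3x + 4y = 6
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    b = np.array([5.0, 6.0])

    # np.linalg.solve uses a Gaussian-elimination-style (LU) factorization internally
    x = np.linalg.solve(A, b)
    print(x)  # the solution vector [x, y]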
-
Row-Echelon Form (properties 1-3)
Reduced Row-Echelon Form (properties 1-4)
1. If a row is not entirely zeros, its first NONzero entry is a 1 (leading 1)
- 2. Any rows consisting entirely of zeros are grouped together at the bottom of the matrix
- 3. In any two successive nonzero rows, the leading 1 of the lower row occurs farther to the right than the leading 1 of the higher row
*4. Each column containing a leading 1 has zeros everywhere else
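*A hedged sketch using Python's sympy (an assumed tool) to compute the reduced row-echelon form of an example augmented matrix:
    from sympy import Matrix

    # Augmented matrix of an example system (entries chosen for illustration)
    M = Matrix([[1, 2, -1, 3],
                [2, 4,  1, 7],
                [1, 2,  2, 4]])

    R, pivots = M.rref()   # R satisfies properties 1-4 above
    print(R)
    print(pivots)          # the columns that contain a leading 1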
-
Homogeneous
A system of linear equations is said to be homogeneous if the constant terms are all zero; the system has the form:
a11x1 + a12x2 + ... + a1nxn = 0
a21x1 + a22x2 + ... + a2nxn = 0
...
am1x1 + am2x2 + ... + amnxn = 0
-
Trivial Solution
Every HOMOGENEOUS system of linear equations is CONSISTENT (since every such system has x1 = 0, x2 = 0, ..., xn = 0 as a solution). This is called the TRIVIAL solution
-
NON-trivial solution
If a homogeneous system of linear equations has any solutions other than the trivial solution, these are called NON-trivial solutions
-
Theorem:
- Homogeneous Systems
A homogeneous system of linear equations with more unknowns than equations has infinitely many solutions
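*A quick illustration in Python with sympy (assumed tool): 2 homogeneous equations in 3 unknowns always leave a free variable, so the null space basis below is non-empty:
    from sympy import Matrix

    # 2 homogeneous equations in 3 unknowns (example coefficients)
    A = Matrix([[1, 2, 3],
                [4, 5, 6]])

    # nullspace() returns basis vectors for all solutions of Ax = 0;
    # a non-empty basis means infinitely many (non-trivial) solutions
    print(A.nullspace())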
-
Definition:
- Matrix
- Entries
A matrix is a rectangular array of numbers
The numbers in the array are called entries
-
Column Matrix aka Column Vector
Row Matrix aka Row Vector
A column matrix is a matrix with only one column
A row matrix is a matrix with only one row
-
Definition:
- Equal Matrices
Two matrices are defined to be equal if they have the same size and their corresponding entries are equal
-
Theorem:
- Properties of a Zero Matrix
A matrix, all of whose entries are zero, is called a zero matrix
1. A + 0 = 0 + A = A
2. A - A = 0
3. 0 - A = -A
4. A0 = 0; 0A = 0
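*A quick numerical check of properties 1-4 in Python with numpy (assumed tool) on an example matrix:
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    Z = np.zeros((2, 2))           # the 2x2 zero matrix

    print(np.allclose(A + Z, A))   # property 1
    print(np.allclose(A - A, Z))   # property 2
    print(np.allclose(Z - A, -A))  # property 3
    print(np.allclose(A @ Z, Z) and np.allclose(Z @ A, Z))  # property 4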
-
Square matrix
A matrix with n rows and n columns is called a square matrix
-
Diagonal Matrix
A square matrix in which the entries outside the main diagonal are all zero
*tridiagonal matrices are also square matrices
-
Upper Triangular Matrix
A square matrix A such that aij = 0 if i > j (all entries below the main diagonal are zero)
-
Lower Triangular Matrix
A square matrix such that aij = 0 if i < j (all entries above the main diagonal are zero)
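*Illustrative constructions of these special square matrices in Python with numpy (assumed tool):
    import numpy as np

    D = np.diag([1, 2, 3])                  # diagonal matrix: zeros off the main diagonal
    A = np.arange(1, 10).reshape(3, 3)      # example 3x3 matrix
    U = np.triu(A)                          # upper triangular: aij = 0 whenever i > j
    L = np.tril(A)                          # lower triangular: aij = 0 whenever i < j
    print(D); print(U); print(L)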
-
Definition:
- Matrix Addition
Matrix Addition is the operation of adding two matrices by adding the corresponding entries together.
*matrices of different sizes cannot be added (or subtracted)
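*Entrywise addition in Python with numpy (assumed tool); adding matrices of different sizes raises an error:
    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])
    print(A + B)   # each entry is the sum of the corresponding entries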
-
Definition:
- Matrix Multiplication
Matrix multiplication is a binary operation that takes a pair of matrices and produces another matrix. The product C of matrices A and B is defined entrywise as:
cik = Σj (aij * bjk), summing over j = 1, ..., n
*the dimensions of the matrices must satisfy (mxn)(nxp) = (mxp)
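*A minimal sketch of the definition as an explicit loop in plain Python (hypothetical helper name matmul), matching cik = Σj aij*bjk:
    def matmul(A, B):
        """Multiply an (m x n) list-of-lists A by an (n x p) list-of-lists B."""
        m, n, p = len(A), len(B), len(B[0])
        C = [[0] * p for _ in range(m)]
        for i in range(m):
            for k in range(p):
                # cik is the sum over j of aij * bjk
                C[i][k] = sum(A[i][j] * B[j][k] for j in range(n))
        return C

    print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]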
-
Definition:
- Matrix Transpose
The transpose of the (mxn) matrix A is the (nxm) matrix formed by interchanging the rows and columns, so that row i of A becomes column i of the transposed matrix, denoted by AT
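*In Python with numpy (assumed tool), the transpose is the .T attribute; row i becomes column i:
    import numpy as np

    A = np.array([[1, 2, 3],
                  [4, 5, 6]])   # 2x3
    print(A.T)                  # 3x2; rows and columns interchanged
    print(A.T.shape)            # (3, 2)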
-
Definition:
- Matrix Trace
The trace of an n-by-n square matrix A - denoted by tr(A) - is defined to be the sum of the elements on the main diagonal (the diagonal from the upper left to the lower right) of A
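*A short numpy sketch (assumed tool) of the trace as the sum of the main-diagonal entries:
    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    print(np.trace(A))          # 1 + 4 = 5
    print(np.sum(np.diag(A)))   # same value, computed from the diagonal directly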
-
Theorem:
Properties of Matrix Arithmetic
1. A + B = B + A (commutative law of addition)
2. A + (B + C) = (A + B) + C (associative law of addition)
3. A(BC) = (AB)C (associative law of multiplication)
4. A(B + C) = AB + AC / A(B - C) = AB - AC (left distributive law)
5. (B + C)A = BA + CA / (B - C)A = BA - CA (right distributive law)
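*A numerical spot-check of laws 1-5 on random example matrices in Python with numpy (assumed tool); this illustrates the laws, it does not prove them:
    import numpy as np

    rng = np.random.default_rng(0)
    A, B, C = (rng.random((3, 3)) for _ in range(3))

    print(np.allclose(A + B, B + A))                # 1. commutative law of addition
    print(np.allclose(A + (B + C), (A + B) + C))    # 2. associative law of addition
    print(np.allclose(A @ (B @ C), (A @ B) @ C))    # 3. associative law of multiplication
    print(np.allclose(A @ (B + C), A @ B + A @ C))  # 4. left distributive law
    print(np.allclose((B + C) @ A, B @ A + C @ A))  # 5. right distributive law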
-
Theorem:
- Identity Matrices
If R is the rref of an nxn matrix A, then either R has a row of zeros or R is the identity matrix I
*Definition - An identity matrix has the property that if A is a square matrix, then IA = AI = A
Inxn = [aij] where aij = 1 if i = j, aij = 0 if i does not equal j
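*np.eye(n) builds Inxn in Python with numpy (assumed tool); a quick check of IA = AI = A:
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    I = np.eye(2)               # 1s on the main diagonal, 0s elsewhere
    print(np.allclose(I @ A, A) and np.allclose(A @ I, A))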
-
Definition:
- Invertible
- Inverse
- Singular
If A is a square matrix, and if a matrix B of the same size can be found such that AB = BA = I, then A is said to be invertible and B is called an inverse of A, denoted by A-1.
If no such matrix B can be found, then A is said to be singular (not invertible).
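*A Python sketch with numpy (assumed tool): np.linalg.inv returns the inverse of an invertible matrix and raises an error for a singular one:
    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])          # det = -2, so A is invertible
    B = np.linalg.inv(A)
    print(np.allclose(A @ B, np.eye(2)) and np.allclose(B @ A, np.eye(2)))

    S = np.array([[1.0, 2.0],
                  [2.0, 4.0]])          # det = 0, so S is singular
    try:
        np.linalg.inv(S)
    except np.linalg.LinAlgError:
        print("S is singular (not invertible)")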
-
Theorem:
Properties of Inverses
If B and C are both inverses of the matrix A, then B = C.
- Proof:
- Since B is an inverse of A, we have BA = I.
- Multiplying both sides on the right by C gives (BA)C = IC = C.
- But (BA)C = B(AC) = BI = B, so that C = B.
-
Theorem:
- The matrix
- A = [ a b
-       c d ]
- is invertible if and only if ad - bc ≠ 0
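*A small plain-Python helper (hypothetical name inverse2x2) implementing the standard formula A-1 = (1/(ad - bc)) [ d -b ; -c a ] for the 2x2 case:
    def inverse2x2(a, b, c, d):
        """Return the inverse of [[a, b], [c, d]], or None if ad - bc = 0."""
        det = a * d - b * c
        if det == 0:
            return None                      # singular: no inverse exists
        return [[ d / det, -b / det],
                [-c / det,  a / det]]

    print(inverse2x2(1, 2, 3, 4))  # [[-2.0, 1.0], [1.5, -0.5]]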
-
Theorem:
- Invertible Matrices
- If A and B are invertible matrices of the same size, then AB is invertible and
- (AB)-1 = B-1 A-1
- Proof:
- If we can show that (AB)(B-1A-1) = (B-1A-1)(AB) = I,
- then we will have simultaneously shown that the matrix AB is invertible and
- that (AB)-1 = B-1 A-1.
- But (AB)(B-1A-1) = A(BB-1)A-1 = AIA-1 = AA-1 = I.
*A similar argument shows that (B-1A-1)(AB) = I
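*A numerical check of (AB)-1 = B-1 A-1 on example matrices in Python with numpy (assumed tool):
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])
    B = np.array([[1.0, 3.0],
                  [0.0, 1.0]])

    lhs = np.linalg.inv(A @ B)
    rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # inverses in reverse order
    print(np.allclose(lhs, rhs))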
-
Product of any # of Invertible Matrices
A product of any number of invertible matrices is invertible, and the inverse of the product is the product of the inverses in the reverse order.