
IE172 Algorithms in Systems Engineering, Lecture 35

Pietro Belotti
Dept. of Industrial and Systems Engineering, Lehigh University
April 21, 2009

Some definitions you already knew

Vectors A_1, A_2, ..., A_n are linearly dependent if the zero vector can be written as a non-zero linear combination of the vectors. That is, there exist λ_1, λ_2, ..., λ_n, not all equal to zero, such that

    \sum_{j=1}^{n} λ_j A_j = 0.

Alternatively, if the A_j are the columns of a matrix A, then the A_j are linearly dependent if Az = 0 for some z ≠ 0.

More definitions you already knew

Vectors A_1, A_2, ..., A_n are linearly independent if the zero vector cannot be written as a non-zero linear combination of the vectors:

    \sum_{j=1}^{n} λ_j A_j = 0  ⟹  λ_1 = λ_2 = ... = λ_n = 0.

Alternatively, if the A_j are the columns of a matrix A, then the A_j are linearly independent if Az = 0 ⟹ z = 0, i.e., z = 0 is the only solution to Az = 0.

More definitions

The range of a matrix A ∈ R^{m×n}, denoted R(A), is the set of all linear combinations of the columns of A. Thus R(A) ⊆ R^m, and

    b ∈ R(A)  ⟺  ∃ x ∈ R^n with Ax = b.

The range is sometimes called the column space of A; R(A^T) ⊆ R^n is sometimes called the row space of A.

Two vectors x, y are orthogonal if x^T y = 0. The set of all m-dimensional vectors orthogonal to the vectors in R(A) is the null space of A^T:

    z ∈ N(A^T)  ⟺  A^T z = 0.

Indeed, if b ∈ R(A) and z ∈ N(A^T), then z^T b = z^T (Ax) = (A^T z)^T x = 0. Likewise, the set of n-dimensional vectors orthogonal to the vectors in R(A^T) is the null space of A.

Matrix identities

A square matrix A ∈ R^{n×n} whose columns are linearly independent is called nonsingular; equivalently, A is nonsingular if Ax = 0 ⟹ x = 0. If A is nonsingular, then Ax = b has a unique solution.

Proof: if Ax = b and Ay = b, then A(x − y) = 0, and therefore x − y = 0, since otherwise A would not be nonsingular.

For a nonsingular matrix A, its inverse A^{-1} ("A inverse") is such that A^{-1}(Ax) = x for all x ∈ R^n, or equivalently AA^{-1} = I = A^{-1}A.

Linear equations

A linear equation in n variables x_1, ..., x_n is an equation of the form

    a_1 x_1 + a_2 x_2 + ... + a_n x_n = b,

where a_1, a_2, ..., a_n and b are constants. A solution to the equation is an assignment of values to the variables such that the equation is satisfied.

Suppose we interpret the constants a_1, a_2, ..., a_n as the entries of an n-dimensional vector a, and let us also make a vector x out of the variables x_1, x_2, ..., x_n. Then we can rewrite the above equation simply as a^T x = b.
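The definitions above can be checked numerically. A minimal sketch using NumPy (an illustration, not part of the lecture): columns are linearly independent exactly when Az = 0 forces z = 0, which for a finite matrix is equivalent to the rank equaling the number of columns.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # columns are linearly independent
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second column = 2 * first column: dependent

# Columns are linearly independent iff rank equals the number of columns.
assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(B) == 1

# For dependent columns we can exhibit a nonzero z with Bz = 0:
# 2 * (column 1) - 1 * (column 2) = 0.
z = np.array([2.0, -1.0])
assert np.allclose(B @ z, 0)
```

Since A is square with independent columns, it is nonsingular, and Ax = b has exactly one solution for any b.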
Uniqueness of the inverse

There is only one inverse. Proof: if B and C were both inverses of A, then

    B = BI = B(AC) = (BA)C = IC = C.

A square matrix whose columns all have length (norm) 1 and are pairwise orthogonal is called orthogonal. If Q ∈ R^{n×n} is orthogonal, then by definition Q^T Q = I, and so Q^T = Q^{-1}.

Systems of linear equations

Suppose we are given a set of n variables whose values must satisfy more than one equation. In this case we have a system of equations such as

    a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1
    a_21 x_1 + a_22 x_2 + ... + a_2n x_n = b_2
    ...
    a_m1 x_1 + a_m2 x_2 + ... + a_mn x_n = b_m,

where a_ij is a constant for all 1 ≤ i ≤ m and 1 ≤ j ≤ n, and b_1, ..., b_m are constants. A solution to this system of equations is an assignment of values to the variables such that all equations are satisfied.

Matrix notation

Now we can interpret the constants a_ij as the entries of a matrix A, and the constants b_1, ..., b_m as the entries of a vector b. Interpreting the variables x_1, ..., x_n as a vector x, we can again write the system of equations simply as

    Ax = b.

We know that the system of equations Ax = b has a unique solution if and only if the matrix A is square and invertible, i.e., if the columns A_j are linearly independent. From now on we will consider only such systems. How do we solve a system of equations?

Special matrices

A square matrix D is diagonal if d_ij = 0 for i ≠ j.
A square matrix L is lower triangular if l_ij = 0 for j > i.
A square matrix U is upper triangular if u_ij = 0 for j < i.
A square matrix P is a permutation matrix if there is a single 1 in each row and in each column; all other entries are 0.
The identity matrix I is both diagonal and a permutation matrix.

[The slides show 4×4 examples of D, U, L, and P here; the entries are not recoverable from this extraction.]
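The two matrix classes above with a simple multiplicative structure can be demonstrated in a few lines. A minimal NumPy sketch (ours, not the lecture's): a rotation matrix as an orthogonal Q, and a 3×3 permutation matrix shuffling rows and columns.

```python
import numpy as np

theta = 0.7
# A 2x2 rotation matrix is orthogonal: its columns have norm 1 and are
# pairwise orthogonal, so Q^T Q = I and hence Q^T = Q^{-1}.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))

# A permutation matrix: a single 1 in each row and each column.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
A = np.arange(9.0).reshape(3, 3)

# Left-multiplying shuffles rows: P[i, j] = 1 sends row j of A to row i of PA.
assert np.allclose((P @ A)[0], A[1])
# Right-multiplying shuffles columns: column 0 of A becomes column 1 of AP.
assert np.allclose((A @ P)[:, 1], A[:, 0])
```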
Fun with permutations

What happens if we left-multiply A by a permutation matrix P? In PA the rows of A are shuffled: p_ij = 1 means that row j of A moves to row i of PA. What happens if we right-multiply A by P? In AP the columns of A are shuffled instead.

[The slides work this out on a 4×4 numeric example; the entries are not recoverable from this extraction.]

The LUP decomposition

Let us suppose that we are able to find three n×n matrices L, U, and P such that

    PA = LU,

where L is lower triangular with ones on the diagonal, U is upper triangular, and P is a permutation matrix. This is called an LUP decomposition of A. We can use this decomposition to solve the system Ax = b.

How do we use it to solve Ax = b?

Note that the system PAx = Pb is equivalent to the original system; since PA = LU, it is in turn equivalent to

    LUx = Pb.

We can solve the system in two steps:
First solve the system Ly = Pb (forward substitution).
Then solve the system Ux = y (backward substitution).
A solution (x, y) to both gives L(Ux) = Pb.

Forward substitution: solving Ly = Pb

How does this help us? First, remember that Pb is really nothing more than a permuted version of b. Since L has ones on the diagonal, the first equation gives y_1 = (Pb)_1 directly; the second, l_21 y_1 + y_2 = (Pb)_2, then gives y_2, and so on down the rows. A permutation matrix P is compactly represented …
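The two-step solve can be sketched in a few lines. A minimal illustration (the helper names forward_sub and backward_sub are ours, and the 2×2 factorization is hand-built rather than computed):

```python
import numpy as np

def forward_sub(L, b):
    """Solve L y = b for lower-triangular L (nonzero diagonal), top row first."""
    n = len(b)
    y = np.zeros(n)
    for i in range(n):
        y[i] = (b[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backward_sub(U, y):
    """Solve U x = y for upper-triangular U (nonzero diagonal), bottom row first."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# A hand-built LUP factorization satisfying PA = LU,
# with L unit lower triangular and U upper triangular.
P = np.array([[0.0, 1.0], [1.0, 0.0]])
L = np.array([[1.0, 0.0], [0.5, 1.0]])
U = np.array([[4.0, 6.0], [0.0, -2.0]])
A = P.T @ L @ U              # constructed so that P A = L U
b = np.array([5.0, 3.0])

y = forward_sub(L, P @ b)    # step 1: solve L y = P b
x = backward_sub(U, y)       # step 2: solve U x = y
assert np.allclose(A @ x, b) # x solves the original system
```

Both substitutions cost O(n^2) once the decomposition is known, which is why the same factorization can be reused cheaply for many right-hand sides b.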

