Textbook Readings

Unit Two: Graphs and Networks (8.2)

- The incidence matrix of a graph tells how the n nodes are connected by the m edges. Normally there are more edges than nodes: m > n. Every entry of an incidence matrix is 0, 1, or -1.
- The incidence matrix takes differences in voltage across the edges of a graph. The voltages are x1, x2, x3, .... Equal voltages produce no current. Row numbers are edge numbers; column numbers are node numbers.
- A graph is complete when every pair of nodes is connected by an edge. A tree is a graph that has no closed loops. Elimination reduces every graph to a tree. The rows of B match the nonzero rows of U.
- The maximum number of edges is n(n - 1)/2 (the complete graph). The minimum number of edges is m = n - 1 (a tree).
- Rows are dependent when edges form a loop; independent rows come from trees.
- A vector v is in the row space if and only if it is perpendicular to the null space. In this case the null space contains the vector (1, 1, 1, 1).
- The small loops give a basis for the left null space.
- Voltage Law: the components of Ax add to zero around every loop.
- Current Law: A^T y = 0 is solved by loop currents. The left null space has dimension m - r. This means that there are m - r independent loops in the graph.
- Euler's formula: (number of nodes) - (number of edges) + (number of small loops) = 1.

Orthogonality (4.1)

- Two vectors are orthogonal when their dot product is zero.
- The row space is perpendicular to the null space: every row of A is perpendicular to every solution of Ax = 0.

Review of the Key Ideas

- Subspaces V and W are orthogonal if every v in V is orthogonal to every w in W.
- V and W are orthogonal complements if W contains all vectors perpendicular to V. Inside n-dimensional space, the dimensions of the complements V and W add to n.
- The null space is the orthogonal complement of the row space. The left null space is the orthogonal complement of the column space.

Projections and Subspaces (4.2)

- What are the projections of b = (2, 3, 4) onto the z-axis and the xy-plane? When b is projected onto a line, its projection p is the part of b along that line. When b is projected onto a plane, p is the part of b in that plane.
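Returning to Unit Two for a moment: the incidence-matrix facts above (null space of equal voltages, rank n - 1, m - r loops, Euler's formula) can be checked numerically. This is a minimal NumPy sketch; the 4-node, 5-edge graph is a made-up example, not one from the reading.

```python
import numpy as np

# Incidence matrix for a made-up graph with n = 4 nodes and m = 5 edges.
# Each row is one edge: -1 at the edge's start node, +1 at its end node.
A = np.array([
    [-1,  1,  0,  0],  # edge 1: node 1 -> node 2
    [-1,  0,  1,  0],  # edge 2: node 1 -> node 3
    [ 0, -1,  1,  0],  # edge 3: node 2 -> node 3
    [ 0, -1,  0,  1],  # edge 4: node 2 -> node 4
    [ 0,  0, -1,  1],  # edge 5: node 3 -> node 4
])
m, n = A.shape

# Equal voltages x = (1, 1, 1, 1) give zero differences across every edge,
# so (1, 1, 1, 1) is in the null space.
x = np.ones(n)
print(A @ x)               # [0. 0. 0. 0. 0.]

# For a connected graph the rank is n - 1, and the left null space
# has dimension m - r: one basis vector per small loop.
r = np.linalg.matrix_rank(A)
loops = m - r
print(r, loops)            # 3 2

# Euler's formula: nodes - edges + small loops = 1.
print(n - m + loops)       # 1
```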
- The projection is p = Pb. The line and the plane are orthogonal complements: their dimensions add to the dimension of the whole space.
- Every vector b in the whole space is the sum of its parts in the two subspaces. The vectors give p1 + p2 = b; the matrices give P1 + P2 = I.

Projection Onto a Line

- The key to projection is orthogonality: the line from b to p is perpendicular to the vector a. The error e = b - p equals b - x̂a.
- Calculating x̂: because a is perpendicular to the error, a^T(b - x̂a) = 0, so x̂ = a^T b / a^T a.
- Special case one: if b = a, then x̂ = 1. This means that the projection of a onto a is itself: Pa = a.
- Special case two: if b is perpendicular to a, then a^T b = 0. This means the projection is p = 0.
- Later, b is projected onto a subspace S equal to the column space of A. The column space is perpendicular to the left null space.
- Definition: Two subspaces V and W of a vector space are orthogonal if every vector v in V is perpendicular to every vector w in W.
- Definition: The orthogonal complement of a subspace V contains every vector that is perpendicular to V. The null space is the orthogonal complement of the row space in R^n. The left null space is the orthogonal complement of the column space in R^m.
- Calculating the projection matrix: P = aa^T / a^T a. If the vector a is doubled, the matrix P stays the same. If the matrix P is squared, it still equals P (P^2 = P). This is because projecting a second time doesn't change anything.
- Note: (I - P)b equals b - p, which is the error e in the left null space. When P projects onto one subspace, I - P projects onto the perpendicular subspace.
- Computing a projection: find x̂, find the vector p, find the matrix P.

Projecting Onto a Subspace

- We compute the projection using the same steps as before. The error vector is b - Ax̂; the projection of b onto the subspace is p = Ax̂.
- Calculating x̂: linear algebra gives us the normal equation A^T(b - Ax̂) = 0, that is, A^T A x̂ = A^T b.
- Our subspace is the column space of A. The error vector b - Ax̂ is perpendicular to the column space; therefore b - Ax̂ is in the left null space.
- A^T A is invertible if and only if A has linearly independent columns.
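The complementary projections above (p1 + p2 = b and P1 + P2 = I) can be checked on the vector b = (2, 3, 4) from the reading. A minimal NumPy sketch, with the two diagonal projection matrices written out by hand:

```python
import numpy as np

b = np.array([2.0, 3.0, 4.0])

# Projection matrices onto the z-axis and onto the xy-plane.
P1 = np.diag([0.0, 0.0, 1.0])   # projects onto the z-axis
P2 = np.diag([1.0, 1.0, 0.0])   # projects onto the xy-plane

p1 = P1 @ b                     # part of b along the z-axis
p2 = P2 @ b                     # part of b in the xy-plane
print(p1, p2)                   # [0. 0. 4.] [2. 3. 0.]

# The two subspaces are orthogonal complements:
print(p1 + p2)                  # [2. 3. 4.]  (p1 + p2 = b)
print(P1 + P2)                  # the 3x3 identity (P1 + P2 = I)
```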
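The line-projection formulas above (x̂ = a^T b / a^T a and P = aa^T / a^T a) can be sketched the same way. The vectors a and b here are made-up examples, not from the reading:

```python
import numpy as np

# Made-up vectors for illustration.
a = np.array([1.0, 2.0, 2.0])
b = np.array([1.0, 1.0, 1.0])

# x_hat = a.b / a.a, projection p = x_hat * a, error e = b - p.
x_hat = (a @ b) / (a @ a)
p = x_hat * a
e = b - p
print(a @ e)                    # 0.0 (up to roundoff): e is perpendicular to a

# Projection matrix P = a a^T / a^T a.
P = np.outer(a, a) / (a @ a)
print(np.allclose(P @ b, p))    # True
print(np.allclose(P @ P, P))    # True: projecting twice changes nothing

# Doubling a leaves P unchanged.
a2 = 2 * a
print(np.allclose(np.outer(a2, a2) / (a2 @ a2), P))  # True
```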
Review of the Key Ideas

- The projection of b onto the line through a is p = x̂a = (a^T b / a^T a) a.
- The rank-one projection matrix P = aa^T / a^T a multiplies b to produce p: p = Pb.
- Projecting b onto a subspace leaves e = b - p perpendicular to the subspace.
- When A has full rank n, the equation A^T A x̂ = A^T b leads to x̂ = (A^T A)^{-1} A^T b and p = Ax̂.
- The projection matrix P = A(A^T A)^{-1} A^T has P^T = P and P^2 = P.

Least Squares Approximations (4.3)

- The least squares solution is x̂. When Ax = b has no solution, multiply by A^T and solve A^T A x̂ = A^T b.
- Example: Find the closest line to the points (0, 6), (1, 0), and (2, 0). Use the equation C + Dt = b for each value of t: for t = 0, C + D(0) = 6; for t = 1, C + D(1) = 0; for t = 2, C + D(2) = 0.
- This means that A^T(b - Ax̂) = 0. The left null space is important in projections because it contains the error vector b - Ax̂.
- Create the matrix A using the coefficients of C for the first column and the coefficients of D for the second column. Solve using A^T A x̂ = A^T b.
- Minimizing the error by geometry: every Ax lies in the plane of the columns of A. In that plane we look for the point closest to b; this is the projection p. The best choice for Ax is p, and the smallest possible error is e = b - p. x̂ gives the best choice for C, D.
- Minimizing the error by algebra: every vector b splits into two parts. The part of b in the column space is p; the part of b in the left null space is e. We solve Ax̂ = p, and e is the least possible error.
- The smallest possible error means that the squared length of Ax - b is minimized: the least squares solution x̂ makes E = ||Ax - b||^2 as small as possible.
- The least squares line minimizes E = e1^2 + e2^2 + e3^2. The error e = (e1, e2, e3) is perpendicular to the first column (1, 1, 1) of A; the dot product gives e1 + e2 + e3 = 0.
- Minimizing the error by calculus: the error function E to be minimized is a sum of squares, E = e1^2 + e2^2 + e3^2. The unknowns are C and D; there are two partial derivatives (with respect to C and D), both of which are zero at the minimum.

Fitting a Straight Line …
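A sketch of the subspace-projection review above: solve the normal equation, then confirm that e lies in the left null space and that P = A(A^T A)^{-1}A^T is symmetric and idempotent. The matrix A and vector b here are arbitrary examples with independent columns, not from the reading:

```python
import numpy as np

# Arbitrary example: A has independent columns, so A^T A is invertible.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

# Normal equation A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat                   # projection of b onto the column space
e = b - p                       # error, in the left null space
print(np.allclose(A.T @ e, 0))  # True: e is perpendicular to every column

# P = A (A^T A)^{-1} A^T is symmetric and idempotent.
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(P, P.T))      # True: P^T = P
print(np.allclose(P @ P, P))    # True: P^2 = P
print(np.allclose(P @ b, p))    # True
```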
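The straight-line example above can be solved numerically. The points (0, 6), (1, 0), (2, 0) are from the reading; NumPy and the variable names are my own. Working the normal equation by hand gives C = 5, D = -3, so the best line is b = 5 - 3t:

```python
import numpy as np

# Points from the reading: (t, b) = (0, 6), (1, 0), (2, 0).
t = np.array([0.0, 1.0, 2.0])
b = np.array([6.0, 0.0, 0.0])

# First column holds the coefficients of C, second column those of D.
A = np.column_stack([np.ones_like(t), t])

# Solve the normal equation A^T A x_hat = A^T b.
C, D = np.linalg.solve(A.T @ A, A.T @ b)
print(C, D)                     # ~ 5.0 -3.0, i.e. best line b = 5 - 3t

# The error is perpendicular to both columns; its components sum to zero.
e = b - (C + D * t)
print(e)                        # ~ [ 1. -2.  1.]
print(round(e.sum(), 10))       # 0.0
```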