TAMU MATH 304 - Lect3-02web

MATH 304 Linear Algebra
Lecture 24: Orthogonal complement. Orthogonal projection.

Euclidean structure

Euclidean structure in R^n includes:
• length of a vector: |x|,
• angle between vectors: θ,
• dot product: x · y = |x| |y| cos θ.

Length and distance

Definition. The length of a vector v = (v1, v2, ..., vn) ∈ R^n is
‖v‖ = √(v1² + v2² + ··· + vn²).
The distance between vectors/points x and y is ‖y − x‖.

Properties of length:
‖x‖ ≥ 0; ‖x‖ = 0 only if x = 0 (positivity)
‖rx‖ = |r| ‖x‖ (homogeneity)
‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)

Scalar product

Definition. The scalar product of vectors x = (x1, x2, ..., xn) and y = (y1, y2, ..., yn) is
x · y = x1y1 + x2y2 + ··· + xnyn.

Properties of scalar product:
x · x ≥ 0; x · x = 0 only if x = 0 (positivity)
x · y = y · x (symmetry)
(x + y) · z = x · z + y · z (distributive law)
(rx) · y = r(x · y) (homogeneity)

In particular, x · y is a bilinear function, i.e., it is both a linear function of x and a linear function of y.

Angle

Cauchy-Schwarz inequality: |x · y| ≤ ‖x‖ ‖y‖.

By the Cauchy-Schwarz inequality, for any nonzero vectors x, y ∈ R^n we have
cos θ = (x · y) / (‖x‖ ‖y‖)
for a unique θ with 0 ≤ θ ≤ π. This θ is called the angle between the vectors x and y. The vectors x and y are said to be orthogonal (denoted x ⊥ y) if x · y = 0, i.e., if θ = 90°.

Orthogonality

Definition 1. Vectors x, y ∈ R^n are said to be orthogonal (denoted x ⊥ y) if x · y = 0.

Definition 2. A vector x ∈ R^n is said to be orthogonal to a nonempty set Y ⊂ R^n (denoted x ⊥ Y) if x · y = 0 for any y ∈ Y.

Definition 3. Nonempty sets X, Y ⊂ R^n are said to be orthogonal (denoted X ⊥ Y) if x · y = 0 for any x ∈ X and y ∈ Y.

Proposition 1. If X, Y ⊂ R^n are orthogonal sets, then either they are disjoint or X ∩ Y = {0}.
Proof: v ∈ X ∩ Y ⇒ v ⊥ v ⇒ v · v = 0 ⇒ v = 0.

Proposition 2. Let V be a subspace of R^n and S be a spanning set for V. Then for any x ∈ R^n,
x ⊥ S ⇒ x ⊥ V.
Proof: Any v ∈ V is represented as v = a1v1 + ··· + akvk, where vi ∈ S and ai ∈ R. If x ⊥ S, then
x · v = a1(x · v1) + ··· + ak(x · vk) = 0 ⇒ x ⊥ v.
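The definitions above translate directly into code. The following is a minimal numerical sketch (assuming NumPy; the vectors x and y are arbitrary illustrative choices, not from the lecture) of the length, scalar product, and angle, with a check of the Cauchy-Schwarz inequality:

```python
# A sketch (not from the lecture) of the Euclidean structure in R^n
# using NumPy: length, scalar product, angle, and Cauchy-Schwarz.
import numpy as np

x = np.array([1.0, 2.0, 2.0])      # illustrative vectors
y = np.array([3.0, 0.0, 4.0])

dot = x @ y                        # scalar product x . y = 3 + 0 + 8 = 11
len_x = np.linalg.norm(x)          # sqrt(1 + 4 + 4) = 3
len_y = np.linalg.norm(y)          # sqrt(9 + 0 + 16) = 5

# Cauchy-Schwarz: |x . y| <= ||x|| ||y||  (here 11 <= 15)
print(abs(dot) <= len_x * len_y)   # True

# Angle between x and y: cos(theta) = (x . y) / (||x|| ||y||)
theta = np.arccos(dot / (len_x * len_y))
print(np.degrees(theta))           # about 42.8 degrees
```

By Cauchy-Schwarz the argument of `arccos` always lies in [−1, 1] for nonzero vectors, so the angle is well defined, matching the uniqueness claim above.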
Example. The vector v = (1, 1, 1) is orthogonal to the plane spanned by the vectors w1 = (2, −3, 1) and w2 = (0, 1, −1), because v · w1 = v · w2 = 0.

Orthogonal complement

Definition. Let S ⊂ R^n. The orthogonal complement of S, denoted S⊥, is the set of all vectors x ∈ R^n that are orthogonal to S. That is, S⊥ is the largest subset of R^n orthogonal to S.

Theorem 1. S⊥ is a subspace of R^n.

Note that S ⊂ (S⊥)⊥, hence Span(S) ⊂ (S⊥)⊥.

Theorem 2. (S⊥)⊥ = Span(S). In particular, for any subspace V we have (V⊥)⊥ = V.

Example. Consider the line L = {(x, 0, 0) | x ∈ R} and the plane Π = {(0, y, z) | y, z ∈ R} in R^3. Then L⊥ = Π and Π⊥ = L.

Fundamental subspaces

Definition. Given an m×n matrix A, let
N(A) = {x ∈ R^n | Ax = 0},
R(A) = {b ∈ R^m | b = Ax for some x ∈ R^n}.

R(A) is the range of the linear mapping L : R^n → R^m, L(x) = Ax, and N(A) is the kernel of L. Also, N(A) is the nullspace of the matrix A, while R(A) is the column space of A. The row space of A is R(A^T).

The subspaces N(A), R(A^T) ⊂ R^n and R(A), N(A^T) ⊂ R^m are the fundamental subspaces associated to the matrix A.

Theorem. N(A) = R(A^T)⊥ and N(A^T) = R(A)⊥. That is, the nullspace of a matrix is the orthogonal complement of its row space.
Proof: The equality Ax = 0 means that the vector x is orthogonal to the rows of the matrix A. Therefore N(A) = S⊥, where S is the set of rows of A. It remains to note that S⊥ = Span(S)⊥ = R(A^T)⊥.

Corollary. Let V be a subspace of R^n. Then dim V + dim V⊥ = n.
Proof: Pick a basis v1, ..., vk for V. Let A be the k×n matrix whose rows are the vectors v1, ..., vk. Then V = R(A^T), hence V⊥ = N(A). Consequently, dim V and dim V⊥ are the rank and nullity of A. Therefore dim V + dim V⊥ equals the number of columns of A, which is n.

Problem. Let V be the plane spanned by the vectors v1 = (1, 1, 0) and v2 = (0, 1, 1). Find V⊥.

The orthogonal complement of V is the same as the orthogonal complement of the set {v1, v2}.
A vector u = (x, y, z) belongs to the latter if and only if
u · v1 = 0 and u · v2 = 0, i.e., x + y = 0 and y + z = 0.

Alternatively, the subspace V is the row space of the matrix
A = [ 1 1 0 ]
    [ 0 1 1 ],
hence V⊥ is the nullspace of A.

The general solution of the system (or, equivalently, the general element of the nullspace of A) is (t, −t, t) = t(1, −1, 1), t ∈ R. Thus V⊥ is the straight line spanned by the vector (1, −1, 1).

Orthogonal projection

Theorem 1. Let V be a subspace of R^n. Then any vector x ∈ R^n is uniquely represented as x = p + o, where p ∈ V and o ∈ V⊥.
Idea of the proof: Let v1, ..., vk be a basis for V and w1, ..., wm be a basis for V⊥. Then v1, ..., vk, w1, ..., wm is a basis for R^n.

In the above expansion, p is called the orthogonal projection of the vector x onto the subspace V.

Theorem 2. ‖x − v‖ > ‖x − p‖ for any v ≠ p in V.
Thus ‖o‖ = ‖x − p‖ = min over v ∈ V of ‖x − v‖ is the distance from the vector x to the subspace V.

Orthogonal projection onto a vector

Let x, y ∈ R^n, with y ≠ 0. Then there exists a unique decomposition x = p + o such that p is parallel to y and o is orthogonal to y; this p is the orthogonal projection of x onto y.

We have p = αy for some α ∈ R. Then
0 = o · y = (x − αy) · y = x · y − α(y · y)
⇒ α = (x · y) / (y · y) ⇒ p = ((x · y) / (y · y)) y.

Problem. Find the distance from the point x = (3, 1) to the line spanned by y = (2, −1).

Consider the decomposition x = p + o, where p is parallel to y while o ⊥ y. The required distance is the length of the orthogonal component o:
p = ((x · y) / (y · y)) y = (5/5)(2, −1) = (2, −1),
o = x − p = (3, 1) − (2, −1) = (1, 2), ‖o‖ = √5.

Problem. Find the point on the line y = −x that is closest to the point (3, 4).

The required point is the …
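The projection formula p = ((x · y)/(y · y)) y is easy to check numerically. A minimal sketch (assuming NumPy; the helper name `project_onto` is mine, not from the lecture) reproducing the distance problem for x = (3, 1) and y = (2, −1):

```python
# A sketch (not from the lecture) of orthogonal projection onto a vector,
# checked against the worked distance problem: x = (3, 1), y = (2, -1).
import numpy as np

def project_onto(x, y):
    """Orthogonal projection of x onto the line spanned by y (y != 0)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return (x @ y) / (y @ y) * y   # p = ((x . y) / (y . y)) y

x = np.array([3.0, 1.0])
y = np.array([2.0, -1.0])

p = project_onto(x, y)             # parallel component
o = x - p                          # orthogonal component
print(p)                           # [ 2. -1.]
print(o)                           # [1. 2.]
print(np.linalg.norm(o))           # sqrt(5): the distance to the line
print(np.isclose(o @ y, 0.0))      # True: o is orthogonal to y
```

The same helper applies to any line through the origin: project the given point onto a direction vector of the line, and the projection is the closest point while ‖o‖ is the distance.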

