TAMU MATH 311 - Lect2-03web

MATH 311
Topics in Applied Mathematics
Lecture 9: Linear independence. Basis of a vector space.

Linear independence

Definition. Let V be a vector space. Vectors v_1, v_2, ..., v_k in V are called linearly dependent if they satisfy a relation

    r_1 v_1 + r_2 v_2 + ... + r_k v_k = 0,

where the coefficients r_1, ..., r_k in R are not all equal to zero. Otherwise the vectors v_1, v_2, ..., v_k are called linearly independent. That is,

    r_1 v_1 + r_2 v_2 + ... + r_k v_k = 0  =⇒  r_1 = ... = r_k = 0.

An infinite set S ⊂ V is linearly dependent if it contains some linearly dependent vectors v_1, ..., v_k in S. Otherwise S is linearly independent.

Examples of linear independence

• Vectors e_1 = (1, 0, 0), e_2 = (0, 1, 0), and e_3 = (0, 0, 1) in R^3:

    x e_1 + y e_2 + z e_3 = 0  =⇒  (x, y, z) = 0  =⇒  x = y = z = 0.

• Matrices E_11 = [1 0; 0 0], E_12 = [0 1; 0 0], E_21 = [0 0; 1 0], and E_22 = [0 0; 0 1] (rows separated by semicolons):

    a E_11 + b E_12 + c E_21 + d E_22 = O  =⇒  [a b; c d] = O  =⇒  a = b = c = d = 0.

• Polynomials 1, x, x^2, ..., x^n:

    a_0 + a_1 x + a_2 x^2 + ... + a_n x^n = 0 identically  =⇒  a_i = 0 for 0 ≤ i ≤ n.

• The infinite set {1, x, x^2, ..., x^n, ...}.

• Polynomials p_1(x) = 1, p_2(x) = x - 1, and p_3(x) = (x - 1)^2:

    a_1 p_1(x) + a_2 p_2(x) + a_3 p_3(x) = a_1 + a_2 (x - 1) + a_3 (x - 1)^2
        = (a_1 - a_2 + a_3) + (a_2 - 2 a_3) x + a_3 x^2.

Hence a_1 p_1(x) + a_2 p_2(x) + a_3 p_3(x) = 0 identically

    =⇒  a_1 - a_2 + a_3 = a_2 - 2 a_3 = a_3 = 0  =⇒  a_1 = a_2 = a_3 = 0.

Problem. Let v_1 = (1, 2, 0), v_2 = (3, 1, 1), and v_3 = (4, -7, 3). Determine whether the vectors v_1, v_2, v_3 are linearly independent.

We have to check whether there exist r_1, r_2, r_3 in R, not all zero, such that r_1 v_1 + r_2 v_2 + r_3 v_3 = 0. This vector equation is equivalent to the system

    r_1 + 3 r_2 + 4 r_3 = 0,
    2 r_1 + r_2 - 7 r_3 = 0,
    0 r_1 + r_2 + 3 r_3 = 0,

with augmented matrix

    [1  3   4 | 0]
    [2  1  -7 | 0]
    [0  1   3 | 0].

The vectors v_1, v_2, v_3 are linearly dependent if and only if the matrix A = (v_1, v_2, v_3) is singular. We obtain det A = 0, so the vectors are linearly dependent.

Theorem. The following conditions are equivalent:
(i) the vectors v_1, ..., v_k are linearly dependent;
(ii) one of the vectors v_1, ..., v_k is a linear combination of the other k - 1 vectors.

Proof: (i) =⇒ (ii). Suppose that

    r_1 v_1 + r_2 v_2 + ... + r_k v_k = 0,

where r_i ≠ 0 for some 1 ≤ i ≤ k. Then

    v_i = -(r_1/r_i) v_1 - ... - (r_{i-1}/r_i) v_{i-1} - (r_{i+1}/r_i) v_{i+1} - ... - (r_k/r_i) v_k.

(ii) =⇒ (i). Suppose that

    v_i = s_1 v_1 + ... + s_{i-1} v_{i-1} + s_{i+1} v_{i+1} + ... + s_k v_k

for some scalars s_j. Then

    s_1 v_1 + ... + s_{i-1} v_{i-1} - v_i + s_{i+1} v_{i+1} + ... + s_k v_k = 0.

Theorem. Vectors v_1, v_2, ..., v_m in R^n are linearly dependent whenever m > n (i.e., whenever the number of coordinates is less than the number of vectors).

Proof: Let v_j = (a_1j, a_2j, ..., a_nj) for j = 1, 2, ..., m. Then the vector equality t_1 v_1 + t_2 v_2 + ... + t_m v_m = 0 is equivalent to the system

    a_11 t_1 + a_12 t_2 + ... + a_1m t_m = 0,
    a_21 t_1 + a_22 t_2 + ... + a_2m t_m = 0,
    ...
    a_n1 t_1 + a_n2 t_2 + ... + a_nm t_m = 0.

Note that the vectors v_1, v_2, ..., v_m are the columns of the matrix (a_ij). The number of leading entries in the row echelon form is at most n. If m > n, then there are free variables, and therefore the zero solution is not unique.

Example. Consider vectors v_1 = (1, -1, 1), v_2 = (1, 0, 0), v_3 = (1, 1, 1), and v_4 = (1, 2, 4) in R^3.

Two vectors are linearly dependent if and only if they are parallel. Hence v_1 and v_2 are linearly independent.

Vectors v_1, v_2, v_3 are linearly independent if and only if the matrix A = (v_1, v_2, v_3) is invertible:

    det A = det [1 1 1; -1 0 1; 1 0 1] = -det [-1 1; 1 1] = 2 ≠ 0.

Therefore v_1, v_2, v_3 are linearly independent.

Four vectors in R^3 are always linearly dependent. Thus v_1, v_2, v_3, v_4 are linearly dependent.

Problem. Show that the functions e^x, e^{2x}, and e^{3x} are linearly independent in C^∞(R).

Suppose that a e^x + b e^{2x} + c e^{3x} = 0 for all x in R, where a, b, c are constants.
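The singularity test used in the problems above is easy to check by machine. A minimal sketch using NumPy (my choice of tool, not part of the lecture): the vectors are placed as columns of a square matrix, and a zero determinant signals linear dependence.

```python
import numpy as np

# Columns are the vectors being tested; they are linearly dependent
# exactly when this square matrix is singular (det A = 0).
A = np.column_stack([(1, 2, 0), (3, 1, 1), (4, -7, 3)])
print(np.linalg.det(A))        # ~0 up to rounding: v1, v2, v3 are dependent

# The vectors from the later example, v1 = (1,-1,1), v2 = (1,0,0), v3 = (1,1,1):
B = np.column_stack([(1, -1, 1), (1, 0, 0), (1, 1, 1)])
print(np.linalg.det(B))        # ~2: these three are independent
```

In floating point the determinant of a singular integer matrix may come out as a tiny nonzero number, so in practice one compares |det A| against a small tolerance (or uses `np.linalg.matrix_rank`).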
We have to show that a = b = c = 0. Differentiating this identity twice gives

    a e^x +   b e^{2x} +   c e^{3x} = 0,
    a e^x + 2 b e^{2x} + 3 c e^{3x} = 0,
    a e^x + 4 b e^{2x} + 9 c e^{3x} = 0.

It follows that A(x) v = 0, where

    A(x) = [e^x   e^{2x}   e^{3x};
            e^x  2e^{2x}  3e^{3x};
            e^x  4e^{2x}  9e^{3x}],    v = [a; b; c].

Factoring e^x out of the first column, then e^{2x} and e^{3x} out of the second and third, and row-reducing:

    det A(x) = e^x det [1 e^{2x} e^{3x}; 1 2e^{2x} 3e^{3x}; 1 4e^{2x} 9e^{3x}]
             = e^x e^{2x} det [1 1 e^{3x}; 1 2 3e^{3x}; 1 4 9e^{3x}]
             = e^x e^{2x} e^{3x} det [1 1 1; 1 2 3; 1 4 9]
             = e^{6x} det [1 1 1; 1 2 3; 1 4 9]
             = e^{6x} det [1 1 1; 0 1 2; 1 4 9]
             = e^{6x} det [1 1 1; 0 1 2; 0 3 8]
             = e^{6x} det [1 2; 3 8] = 2 e^{6x} ≠ 0.

Since the matrix A(x) is invertible,

    A(x) v = 0  =⇒  v = 0  =⇒  a = b = c = 0.

Basis

Definition. Let V be a vector space. A linearly independent spanning set for V is called a basis.

Suppose that a set S ⊂ V is a basis for V. "Spanning set" means that any vector v in V can be represented as a linear combination

    v = r_1 v_1 + r_2 v_2 + ... + r_k v_k,

where v_1, ..., v_k are distinct vectors from S and r_1, ..., r_k in R. "Linearly independent" implies that the above representation is unique:

    v = r_1 v_1 + ... + r_k v_k = r'_1 v_1 + ... + r'_k v_k
      =⇒  (r_1 - r'_1) v_1 + ... + (r_k - r'_k) v_k = 0
      =⇒  r_1 - r'_1 = ... = r_k - r'_k = 0.

Examples.

• Standard basis for R^n: e_1 = (1, 0, 0, ..., 0, 0), e_2 = (0, 1, 0, ..., 0, 0), ..., e_n = (0, 0, 0, ..., 0, 1). Indeed, (x_1, x_2, ..., x_n) = x_1 e_1 + x_2 e_2 + ... + x_n e_n.

• The matrices [1 0; 0 0], [0 1; 0 0], [0 0; 1 0], [0 0; 0 1] form a basis for M_{2,2}(R):

    [a b; c d] = a [1 0; 0 0] + b [0 1; 0 0] + c [0 0; 1 0] + d [0 0; 0 1].

• Polynomials 1, x, x^2, ..., x^{n-1} form a basis for P_n = {a_0 + a_1 x + ... + a_{n-1} x^{n-1} : a_i in R}.

• The infinite set {1, x, x^2, ..., x^n, ...} is a basis for P, the space of all polynomials.

Bases for R^n

Let v_1, v_2, ..., v_k be vectors in R^n.

Theorem 1. If k < n, then the vectors v_1, v_2, ..., v_k do not span R^n.

Theorem 2. If k > n, then the vectors v_1, v_2, ..., v_k are linearly dependent.

Theorem 3. If k = n, then the following conditions are equivalent:
(i) {v_1, v_2, ..., v_n} is a basis for R^n;
(ii) {v_1, v_2, ..., v_n} is a spanning set for R^n;
(iii) {v_1, v_2, ..., v_n} is a linearly independent set.

Example. Consider vectors v_1 = (1, -1, 1), v_2 = (1, 0, 0), v_3 = (1, 1, 1), and v_4 = (1, 2, 4) in R^3.

Vectors v_1 and v_2 are linearly independent (as they are not parallel), but they do not span R^3.

Vectors v_1, v_2, v_3 are linearly independent since

    det [1 1 1; -1 0 1; 1 0 1] = -det [-1 1; 1 1] = -(-2) = 2 ≠ 0.

Therefore {v_1, v_2, v_3} is a basis for R^3.
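The Wronskian-style computation for e^x, e^{2x}, e^{3x} can be verified symbolically. A short sketch using SymPy (an assumption; the lecture does this by hand): build the matrix A(x) of the functions and their first two derivatives, and simplify its determinant.

```python
import sympy as sp

x = sp.symbols('x')
fs = [sp.exp(x), sp.exp(2 * x), sp.exp(3 * x)]

# Rows are the functions and their first two derivatives,
# matching the matrix A(x) in the lecture.
W = sp.Matrix([[sp.diff(f, x, k) for f in fs] for k in range(3)])
det_W = sp.simplify(W.det())
print(det_W)   # 2*exp(6*x), which is never zero for real x
```

Since the determinant 2 e^{6x} never vanishes, A(x) is invertible for every x, confirming that the only solution of A(x) v = 0 is v = 0.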