Math 304–504 Linear Algebra
Lecture 13: Linear independence.

Span: implicit definition

Let S be a subset of a vector space V.

Definition. The span of the set S, denoted Span(S), is the smallest subspace of V that contains S. That is,
• Span(S) is a subspace of V;
• for any subspace $W \subset V$ one has $S \subset W \implies \mathrm{Span}(S) \subset W$.

Remark. The span of any set $S \subset V$ is well defined: it is the intersection of all subspaces of V that contain S.

Span: effective description

Let S be a subset of a vector space V.
• If $S = \{v_1, v_2, \ldots, v_n\}$, then Span(S) is the set of all linear combinations $r_1v_1 + r_2v_2 + \cdots + r_nv_n$, where $r_1, r_2, \ldots, r_n \in \mathbb{R}$.
• If S is an infinite set, then Span(S) is the set of all linear combinations $r_1u_1 + r_2u_2 + \cdots + r_ku_k$, where $u_1, u_2, \ldots, u_k \in S$ and $r_1, r_2, \ldots, r_k \in \mathbb{R}$ ($k \ge 1$).
• If S is the empty set, then $\mathrm{Span}(S) = \{0\}$.

Spanning set

Definition. A subset S of a vector space V is called a spanning set for V if $\mathrm{Span}(S) = V$.

Examples.
• The vectors $e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$, and $e_3 = (0, 0, 1)$ form a spanning set for $\mathbb{R}^3$, since $(x, y, z) = xe_1 + ye_2 + ze_3$.
• The matrices
$\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$
form a spanning set for $M_{2,2}(\mathbb{R})$, since
$\begin{pmatrix} a & b \\ c & d \end{pmatrix} = a\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + b\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + c\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + d\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.$

Linear independence

Definition. Let V be a vector space. Vectors $v_1, v_2, \ldots, v_k \in V$ are called linearly dependent if they satisfy a relation
$r_1v_1 + r_2v_2 + \cdots + r_kv_k = 0,$
where the coefficients $r_1, \ldots, r_k \in \mathbb{R}$ are not all equal to zero. Otherwise the vectors $v_1, v_2, \ldots, v_k$ are called linearly independent; that is,
$r_1v_1 + r_2v_2 + \cdots + r_kv_k = 0 \implies r_1 = \cdots = r_k = 0.$

An infinite set $S \subset V$ is linearly dependent if it contains some linearly dependent vectors $v_1, \ldots, v_k \in S$; otherwise S is linearly independent.

Theorem. The following conditions are equivalent:
(i) the vectors $v_1, \ldots, v_k$ are linearly dependent;
(ii) one of the vectors $v_1, \ldots, v_k$ is a linear combination of the other $k - 1$ vectors.

Proof: (i) $\implies$ (ii) Suppose that
$r_1v_1 + r_2v_2 + \cdots + r_kv_k = 0,$
where $r_i \ne 0$ for some $1 \le i \le k$.
Then
$v_i = -\frac{r_1}{r_i}v_1 - \cdots - \frac{r_{i-1}}{r_i}v_{i-1} - \frac{r_{i+1}}{r_i}v_{i+1} - \cdots - \frac{r_k}{r_i}v_k.$

(ii) $\implies$ (i) Suppose that
$v_i = s_1v_1 + \cdots + s_{i-1}v_{i-1} + s_{i+1}v_{i+1} + \cdots + s_kv_k$
for some scalars $s_j$. Then
$s_1v_1 + \cdots + s_{i-1}v_{i-1} - v_i + s_{i+1}v_{i+1} + \cdots + s_kv_k = 0,$
a relation in which the coefficient of $v_i$ is $-1 \ne 0$.

Examples of linear independence

• The vectors $e_1 = (1, 0, 0)$, $e_2 = (0, 1, 0)$, $e_3 = (0, 0, 1)$ in $\mathbb{R}^3$:
$xe_1 + ye_2 + ze_3 = 0 \implies (x, y, z) = 0 \implies x = y = z = 0.$

• The matrices $E_{11} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $E_{12} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, $E_{21} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}$, and $E_{22} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$:
$aE_{11} + bE_{12} + cE_{21} + dE_{22} = O \implies \begin{pmatrix} a & b \\ c & d \end{pmatrix} = O \implies a = b = c = d = 0.$

• The polynomials $1, x, x^2, \ldots, x^n$:
$a_0 + a_1x + a_2x^2 + \cdots + a_nx^n = 0$ identically $\implies a_i = 0$ for $0 \le i \le n$.

• The infinite set $\{1, x, x^2, \ldots, x^n, \ldots\}$.

• The polynomials $p_1(x) = 1$, $p_2(x) = x - 1$, and $p_3(x) = (x - 1)^2$:
$a_1p_1(x) + a_2p_2(x) + a_3p_3(x) = a_1 + a_2(x - 1) + a_3(x - 1)^2 = (a_1 - a_2 + a_3) + (a_2 - 2a_3)x + a_3x^2.$
Hence $a_1p_1(x) + a_2p_2(x) + a_3p_3(x) = 0$ identically
$\implies a_1 - a_2 + a_3 = a_2 - 2a_3 = a_3 = 0 \implies a_1 = a_2 = a_3 = 0.$

Problem. Let $v_1 = (1, 2, 0)$, $v_2 = (3, 1, 1)$, and $v_3 = (4, -7, 3)$. Determine whether the vectors $v_1, v_2, v_3$ are linearly independent.

We have to check whether there exist $r_1, r_2, r_3 \in \mathbb{R}$, not all zero, such that $r_1v_1 + r_2v_2 + r_3v_3 = 0$. This vector equation is equivalent to the system
$\begin{cases} r_1 + 3r_2 + 4r_3 = 0, \\ 2r_1 + r_2 - 7r_3 = 0, \\ 0r_1 + r_2 + 3r_3 = 0, \end{cases}$
with augmented matrix
$\left(\begin{array}{ccc|c} 1 & 3 & 4 & 0 \\ 2 & 1 & -7 & 0 \\ 0 & 1 & 3 & 0 \end{array}\right).$
The vectors $v_1, v_2, v_3$ are linearly dependent if and only if the matrix $A = (v_1, v_2, v_3)$ is singular. We obtain $\det A = 0$, so the vectors are linearly dependent.

Theorem. Vectors $v_1, v_2, \ldots, v_m \in \mathbb{R}^n$ are linearly dependent whenever $m > n$.

Proof: Let $v_j = (a_{1j}, a_{2j}, \ldots, a_{nj})$ for $j = 1, 2, \ldots, m$. Then the vector identity $t_1v_1 + t_2v_2 + \cdots + t_mv_m = 0$ is equivalent to the system
$\begin{cases} a_{11}t_1 + a_{12}t_2 + \cdots + a_{1m}t_m = 0, \\ a_{21}t_1 + a_{22}t_2 + \cdots + a_{2m}t_m = 0, \\ \cdots \\ a_{n1}t_1 + a_{n2}t_2 + \cdots + a_{nm}t_m = 0. \end{cases}$
The vectors $v_1, v_2, \ldots, v_m$ are the columns of the matrix $(a_{ij})$. If $m > n$, the system has fewer equations than unknowns, so it is under-determined and the zero solution is not the only one.
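The conclusion of the worked problem above can be verified with a short plain-Python script (an added illustration, not part of the lecture; the helper `det3` and the explicit relation $5v_1 - 3v_2 + v_3 = 0$ come from solving the system by hand):

```python
def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

v1, v2, v3 = (1, 2, 0), (3, 1, 1), (4, -7, 3)

# Matrix A = (v1, v2, v3) with the vectors as columns.
A = [[v1[j], v2[j], v3[j]] for j in range(3)]
print(det3(A))    # 0 -> A is singular, so v1, v2, v3 are linearly dependent

# An explicit nontrivial relation, found by back-substitution:
# 5*v1 - 3*v2 + 1*v3 = 0.
combo = tuple(5 * v1[j] - 3 * v2[j] + v3[j] for j in range(3))
print(combo)      # (0, 0, 0)
```

Since the determinant is an integer computation here, the zero value is exact, not a rounding artifact.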
Spanning sets and linear dependence

Let $v_0, v_1, \ldots, v_k$ be vectors from a vector space V.

Proposition. If $v_0$ is a linear combination of the vectors $v_1, \ldots, v_k$, then
$\mathrm{Span}(v_0, v_1, \ldots, v_k) = \mathrm{Span}(v_1, \ldots, v_k).$
Indeed, if $v_0 = r_1v_1 + \cdots + r_kv_k$, then
$t_0v_0 + t_1v_1 + \cdots + t_kv_k = (t_0r_1 + t_1)v_1 + \cdots + (t_0r_k + t_k)v_k.$

Corollary. A spanning set for a vector space is minimal (no proper subset is spanning) if and only if it is linearly independent.

Proposition. The functions $1$, $e^x$, and $e^{-x}$ are linearly independent.

Proof: Suppose that $a + be^x + ce^{-x} = 0$ for some $a, b, c \in \mathbb{R}$. We have to show that $a = b = c = 0$. Evaluating at three points:
$x = 0 \implies a + b + c = 0,$
$x = 1 \implies a + be + ce^{-1} = 0,$
$x = -1 \implies a + be^{-1} + ce = 0.$
The matrix of this system is
$A = \begin{pmatrix} 1 & 1 & 1 \\ 1 & e & e^{-1} \\ 1 & e^{-1} & e \end{pmatrix},$
$\det A = e^2 - e^{-2} - 2e + 2e^{-1} = (e - e^{-1})(e + e^{-1}) - 2(e - e^{-1}) = (e - e^{-1})(e + e^{-1} - 2) = (e - e^{-1})(e^{1/2} - e^{-1/2})^2 \ne 0.$
Hence the system has the unique solution $a = b = c = 0$.

Alternative proof: Suppose that $a + be^x + ce^{-x} = 0$ for some $a, b, c \in \mathbb{R}$. Differentiating this identity once and then twice gives
$be^x - ce^{-x} = 0,$
$be^x + ce^{-x} = 0.$
It follows that $be^x = ce^{-x} = 0 \implies b = c = 0$. Then $a = 0$ as well.

Theorem. Let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be distinct real numbers. Then the functions $e^{\lambda_1x}, e^{\lambda_2x}, \ldots, e^{\lambda_kx}$ are linearly independent. Furthermore, the set of functions $x^me^{\lambda_ix}$, $1 \le i \le k$, $m = 0, 1, 2, \ldots$, is also linearly independent.
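As a numerical sanity check of the determinant computation above (an added illustration, not part of the lecture), the expanded form of $\det A$ and both factored forms can be compared in plain Python:

```python
import math

# Compare det A = e^2 - e^{-2} - 2e + 2e^{-1} with the factorizations
# derived above; all three should agree and be nonzero.
e = math.e
expanded = e**2 - e**-2 - 2*e + 2/e                           # det A, expanded
factored = (e - 1/e) * (e + 1/e - 2)                          # first factorization
squared  = (e - 1/e) * (math.exp(0.5) - math.exp(-0.5))**2    # final form

print(expanded)   # approximately 2.553, in particular nonzero
```

All three expressions agree to floating-point accuracy, and the common value is strictly positive, which is consistent with $\det A \ne 0$.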
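The first claim of the last theorem can be illustrated for one concrete choice of exponents (a check of a single instance, not a proof; the exponents $\lambda = 1, 2, 3$ and sample points $x = 0, 1, 2$ are arbitrary choices made for this sketch). If $a_1e^x + a_2e^{2x} + a_3e^{3x} = 0$ identically, the identity must hold at the three sample points, and a nonsingular coefficient matrix then forces $a_1 = a_2 = a_3 = 0$:

```python
import math

# Evaluate the hypothetical identity a1*e^x + a2*e^(2x) + a3*e^(3x) = 0
# at sample points x = 0, 1, 2; row x of M holds (e^x, e^(2x), e^(3x)).
lambdas = (1.0, 2.0, 3.0)   # arbitrary distinct exponents
samples = (0.0, 1.0, 2.0)   # arbitrary distinct sample points
M = [[math.exp(lam * x) for lam in lambdas] for x in samples]

def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Nonzero determinant -> only the zero solution -> the three exponentials
# are linearly independent.
print(det3(M) != 0)   # True
```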