
IX. The spectrum of a lm

** point spectrum **

An eigenpair for A ∈ L(X) is any (x, z) ∈ (X\0) × IF satisfying Ax = zx. The scalar z in such an eigenpair is called an eigenvalue for A and the vector x is called an eigenvector for A belonging to the eigenvalue z. The collection of all eigenvectors of A belonging to z is ker(A − z)\0. Thus z is an eigenvalue of A iff (A − z) fails to be 1-1. The collection of all eigenvalues of A is called the point spectrum of A and is denoted by σ_P(A).

** Eigenstructure of a matrix **

If X is finite-dimensional, then A is a matrix and one is naturally led to look into the eigenstructure of A when one looks for a basis V, i.e., an invertible lm V : IF^n → X,

               A
         X ────────→ X
       V ↑           ↑ V
        IF^n ─────→ IF^n
               Â

for which the corresponding matrix representation Â = V^{-1} A V for A is particularly simple. Ideally, one wants Â to be a diagonal matrix. If there is such a V, then A is called diagona(liza)ble. Assuming that Â is diagonal, Â = diag(z_1, . . . , z_n) say, then necessarily A v_j = z_j v_j, all j, i.e., the basis V must consist of (nontrivial) eigenvectors of A.

Whether or not Â is diagonal, Â is said to be similar to A.

** who cares about eigenstructure? **

If you look into the question as to why one might want a particularly simple matrix representation for A in the first place, you will find that it is useful for understanding the powers of A, of importance in the analysis of fixed-point iteration for solving linear (and nonlinear) systems, the solution of a system of first-order ODEs, and in the numerical solution of evolution equations.

For example, a square matrix A is power-bounded, i.e., {A^k : k ∈ IN} is a bounded set, iff ∀{z ∈ σ_P(A)} |z| ≤ 1, with equality only if z is not defective, i.e., only if ran(A − z) ∩ ker(A − z) = {0}. Further, A is convergent, i.e., lim_{k→∞} A^k exists, iff ∀{z ∈ σ_P(A)} |z| ≤ 1, with equality only if z is not defective and z = 1. Finally, A is convergent to 0, i.e., lim_{k→∞} A^k = 0, iff ∀{z ∈ σ_P(A)} |z| < 1 (as was mentioned already in Chapter 2 in the discussion of fixed point iteration).

** polynomials in a lm **

More generally, one is interested in understanding the behavior of linear combinations ∑_{j≤k} a(j) A^j of such powers, i.e., of polynomials p(A) in A (with p := ∑_j ()^j a(j)), and, ultimately, of functions f(A) in A, to the extent that f can be approximated arbitrarily closely by polynomials p, hence f(A) can be understood as the limit of p(A) as p → f. E.g., y(t) = exp(tA) y_0 is the unique (vector-valued) solution at t of the first-order ODE Dy = Ay with side condition y(0) = y_0.

Having a complete understanding of the eigenstructure of A vastly simplifies all dealings with p(A). Indeed, if A = V Â V^{-1}, then, for any p ∈ Π,

        p(A) = V p(Â) V^{-1},

while, for a diagonal matrix Â = diag(. . . , z_j , . . .),

        p(Â) = diag(. . . , p(z_j), . . .).

Thus, for a diagonalizable A,

        σ_P(p(A)) = p(σ_P(A)).

This is a particular example of the Spectral Mapping Theorem.

Work with polynomials in the lm A is materially helped by the seemingly trivial fact that any two polynomials in the same linear map commute:

(1)     ∀{p, q ∈ Π; A ∈ L(X)}   p(A) q(A) = q(A) p(A).

H.P.(1) Prove (1).
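Both the commutativity (1) and the identity σ_P(p(A)) = p(σ_P(A)) lend themselves to a quick numerical check. The following is a minimal sketch, not part of the original notes, assuming numpy is available; the particular matrix A, the polynomials p and q, and the helper name poly_in are arbitrary choices made for illustration.

import numpy as np

def poly_in(coeffs, A):
    """Evaluate p(A) = sum_j coeffs[j] A^j for a square matrix A, by Horner's rule."""
    n = A.shape[0]
    result = np.zeros((n, n), dtype=complex)
    for c in reversed(coeffs):
        result = result @ A + c * np.eye(n)
    return result

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a generic 4x4 matrix (almost surely diagonalizable)
p = [1.0, -2.0, 0.5]              # p(t) = 1 - 2t + 0.5 t^2, coefficients a(0), a(1), a(2)
q = [0.0, 3.0, 0.0, 1.0]          # q(t) = 3t + t^3

# (1): any two polynomials in the same linear map commute.
pA, qA = poly_in(p, A), poly_in(q, A)
print(np.allclose(pA @ qA, qA @ pA))          # expected: True (up to roundoff)

# spectral mapping: every p(z), z an eigenvalue of A, is an eigenvalue of p(A).
z = np.linalg.eigvals(A)
pz = np.polyval(p[::-1], z)                   # np.polyval wants the highest coefficient first
eig_pA = np.linalg.eigvals(pA)
print(all(np.min(np.abs(eig_pA - w)) < 1e-8 for w in pz))   # expected: True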
As an illustration, here is a proof of the basic fact that every A ∈ L(X) with 0 < dim X < ∞ and IF = C has eigenvalues. Indeed, there is x ∈ X\0 and, for any such x, [x, Ax, A^2 x, . . . , A^{dim X} x] must fail to be 1-1, hence there is a ≠ 0 so that p(A)x := ∑_j a(j) A^j x = 0, showing that p(A) fails to be 1-1, even though p ≠ 0. Let d := max{j : a(j) ≠ 0}. Then, wlog, a(d) = 1, i.e., p is monic. Further, d > 0 since x ≠ 0. Since IF = C, we can therefore write p as the product ∏_j (· − z_j) of d > 0 linear factors. But, since p(A) = ∏_j (A − z_j) fails to be 1-1, at least one of the factors A − z_j must fail to be 1-1.

** A-invariant direct sum decompositions **

As a start toward a simplest matrix representation, assume that P is a spectral projector for the lm A, i.e., a lprojector that commutes with A,

        PA = AP.

Then the corresponding direct sum decomposition

        X = X_1 ∔ X_2,   X_1 := ran P,   X_2 := ker P

is A-invariant in the sense that its summands are A-invariant,

        A X_i ⊂ X_i, all i.

Conversely, for any such A-invariant direct sum decomposition X = X_1 ∔ X_2, the corresponding lprojector, given by ran P = X_1, ker P = X_2, is spectral for A since ran(AP) = A X_1 ⊆ X_1 = ran P, hence PAP = AP, while also ran(A(1 − P)) = A X_2 ⊆ X_2 = ker P, hence PA(1 − P) = 0, therefore, altogether, PA = PAP + PA(1 − P) = AP.

If now X is finite-dimensional, then so are the X_i, and, with V_i any basis for X_i, V := [V_1, V_2] is a basis for X with the happy property that Â = V^{-1} A V = [V_1, V_2]^{-1} [A V_1, A V_2] is block-diagonal, in particular,

        Â = [ Â_1   0  ]
            [  0   Â_2 ],    Â_i := V_i^{-1} (A|_{X_i}) V_i,

since the columns of A V_i are in X_i, hence their coordinates wrto V have nonzero entries only corresponding to the columns of V_i in V.

If you conclude from this that a search for 'simple' matrix representations for A ∈ L(X) is equivalent to a search for A-invariant direct sum decompositions for A with many summands or, equivalently, a search for a many-termed sequence (P_i) in L(X) with P_i P_j = δ_ij P_i and A P_i = P_i A, all i, j, then you would be quite right.

** primary decomposition **

Here, for the record, is a first step in that direction that goes back to Frobenius. To be sure, this first step does not, in general, do the complete job. For that, just skip to the heading 'A finest A-invariant direct sum decomposition'.

Assuming X to be finite-dimensional, so is L(X), hence [A^r : r = 0, . . . , dim L(X)] cannot be 1-1, therefore there are polynomials p ≠ 0 that annihilate A in the sense that p(A) = 0. Let p be any such monic annihilating polynomial and assume, for simplicity, that IF = C. Then

        p =: ∏_i p_i,

with p_i = (· − z_i)^{m_i}, and z_i ≠ z_j for i ≠ j. It follows that the polynomials

        ℓ_i := p/p_i,   all i,

do not have a common zero, hence

        1 = ∑_i ℓ_i h_i

for certain polynomials h_i. Indeed, let h_i be the unique polynomial of degree < m_i for which ℓ_i h_i agrees m_i-fold with 1 at z_i. Then 1 − ∑_i ℓ_i h_i is a polynomial of degree < ∑_i m_i and vanishes m_i-fold at z_i, all i, hence must be zero.

Set

        P_i := ℓ_i(A) h_i(A),   X_i := ker p_i(A),   all i.

Then

        1 = ∑_i P_i,

and, for j ≠ i, P_j vanishes on X_i (since then ℓ_j has p_i as a factor), hence 1 = P_i on X_i, while ran P_i ⊆ X_i (since p_i ℓ_i h_i has p as a factor).
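To see the construction in action, here is a small numerical sketch (not part of the original notes, and again assuming numpy) of the simplest case, in which the annihilating polynomial p has only simple zeros (every m_i = 1): then each h_i is just the constant 1/ℓ_i(z_i), so P_i = ℓ_i(A)/ℓ_i(z_i). The 3×3 matrix below, with distinct eigenvalues 2, 3, 5, is an arbitrary choice for illustration.

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])      # distinct eigenvalues 2, 3, 5, so p = (.-2)(.-3)(.-5)
z = np.array([2.0, 3.0, 5.0])
I = np.eye(3)

P = []
for i, zi in enumerate(z):
    # l_i(A) = prod_{j != i} (A - z_j), and h_i = 1/l_i(z_i) since every m_i = 1
    l_i_A, l_i_zi = I.copy(), 1.0
    for j, zj in enumerate(z):
        if j != i:
            l_i_A = l_i_A @ (A - zj * I)
            l_i_zi *= (zi - zj)
    P.append(l_i_A / l_i_zi)          # P_i := l_i(A) h_i(A)

print(np.allclose(sum(P), I))                                 # 1 = sum_i P_i
print(all(np.allclose(P[i] @ P[j], P[i] if i == j else 0 * I)
          for i in range(len(P)) for j in range(len(P))))     # P_i P_j = delta_ij P_i
print(all(np.allclose(A @ Pi, Pi @ A) for Pi in P))           # each P_i commutes with A
print(all(np.allclose((A - zi * I) @ Pi, 0 * I)
          for zi, Pi in zip(z, P)))                           # ran P_i lies in ker(A - z_i)

All four checks should print True (up to roundoff), matching the properties of the P_i derived above.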

