Math 54, Spring 2009, Sections 109 and 112
Midterm 2 Review

This sheet mentions a lot of the major ideas from Chapters 4, 5 and 6. It is inevitably not exhaustive, but hopefully it can help you notice some areas where you might need to review some more.

R^n vs. vector spaces

• In R^n, we had defined operations of scalar multiplication and addition. Inspired by this, we defined a vector space to be any set of objects with addition and scalar multiplication operations that behave like those in R^n, with the full list of axioms given on p. 217.

• We can then define the concepts of subspaces, spanning and linear independence the same way we did for R^n. Note: if H is a subspace of V, then H is again a vector space, with the same operations as V.

• Just as in R^n, a basis is a linearly independent spanning set. However, not all vector spaces have finite bases. A vector space with a finite basis is called finite-dimensional. All bases for a given finite-dimensional vector space have the same number of elements.

• Any linearly independent set in a vector space can be expanded to a basis by adding more elements. Any spanning set can be contracted to a basis by removing redundant elements. To do this, order your spanning set, and keep removing vectors that can be written as linear combinations of the ones before.

• Informally speaking, any (finite-dimensional) vector space with dimension n looks and feels like R^n. What is the formal version of this statement? If B is a basis for V, then the coordinate map [·]_B is an invertible linear transformation (isomorphism) between V and R^n. This isomorphism can be used to prove that V shares many of the same properties as R^n (p. 250-251).

• If B and C are different bases for V, we may be interested in the relationship between [~x]_B and [~x]_C. For any pair of bases, there is a unique, invertible matrix P_{C←B} such that P_{C←B}[~x]_B = [~x]_C for all ~x ∈ V (p. 273). (See the change-of-basis sketch below.)

• More generally, if T : V → W is a linear transformation, B is a basis for V, and C is a basis for W, then there is a unique matrix M such that M[~x]_B = [T(~x)]_C (p. 329). If T is the identity map on V = W, then M = P_{C←B}. Since V and W are just R^n and R^m in disguise, you can think of M as doing the same thing as T, just on the undisguised versions of V and W.

• If V = W and B = C in the previous bullet point, then the matrix M is called [T]_B, the B-matrix of T. Note: this is the same notation as coordinates, but it means something different; we are not taking the coordinates of a matrix. However, we do have [T(~x)]_B = [T]_B[~x]_B if T : V → V is a linear transformation. (See the differentiation sketch below for a concrete B-matrix.)

Eigenvectors and eigenvalues

• An eigenvector/eigenvalue pair for a matrix A is a non-zero vector ~x and a scalar λ such that A~x = λ~x. The eigenspace of a matrix A with respect to the eigenvalue λ is the set of all eigenvectors of A with eigenvalue λ, along with the zero vector. Alternatively, it is the subspace Nul(A − λI).

• If a matrix is triangular (or diagonal), the eigenvalues are the entries on the diagonal. If not, you can find the eigenvalues by finding the roots of the characteristic polynomial det(A − λI).

• Similar matrices have the same eigenvalues. A and A^T have the same eigenvalues. (Can you prove these things?)

• If A is n × n, and the dimensions of the eigenspaces of A add up to n, then A is diagonalizable (Theorem 7, p. 324). That is, there is an invertible matrix P and a diagonal matrix D such that A = PDP^{-1}.

• If an n × n matrix has n different eigenvalues, then it is diagonalizable (since every eigenspace has dimension at least 1). However, the converse is not true: the matrix 2I has only one eigenvalue, 2, but it is diagonal (and hence diagonalizable). (See the diagonalization sketch below.)
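The change-of-basis fact can be checked numerically. Here is a minimal NumPy sketch, not part of the original sheet; the two bases of R^2 are made-up examples, and P_{C←B} is computed by solving CX = B, which converts each vector of B into C-coordinates one column at a time.

```python
import numpy as np

# Two bases for R^2, stored as the columns of B and C.
# These particular vectors are made-up examples.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # basis B = {b1, b2}
C = np.array([[2.0, 0.0],
              [1.0, 1.0]])   # basis C = {c1, c2}

# If x has B-coordinates xB, then x = B @ xB.  Solving C @ xC = x gives
# the C-coordinates, so the change-of-basis matrix is P_{C<-B} = C^{-1} B.
P_C_from_B = np.linalg.solve(C, B)

xB = np.array([3.0, -2.0])   # coordinates of some vector relative to B
x = B @ xB                   # the actual vector in R^2
xC = P_C_from_B @ xB         # its coordinates relative to C

# Check: rebuilding x from its C-coordinates recovers the same vector.
assert np.allclose(C @ xC, x)
print("x =", x, "  [x]_C =", xC)
```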
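To make the B-matrix idea concrete, here is a small sketch using differentiation on P_2, the polynomials of degree at most 2. The choice of transformation and the basis {1, t, t^2} are my own illustration, not an example from the sheet.

```python
import numpy as np

# Represent p(t) = a + b t + c t^2 by its coordinate vector [a, b, c]
# relative to the basis B = {1, t, t^2} of P_2 (a choice made for this
# sketch).  Differentiation T(p) = p' is linear, and its B-matrix has
# columns [T(1)]_B, [T(t)]_B, [T(t^2)]_B, i.e. the coordinates of 0, 1, 2t.
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

p = np.array([5.0, 3.0, -4.0])    # p(t) = 5 + 3t - 4t^2
dp = D @ p                        # should encode p'(t) = 3 - 8t
assert np.allclose(dp, [3.0, -8.0, 0.0])
print(dp)
```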
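The diagonalization criterion can also be sanity-checked numerically. A minimal sketch, assuming NumPy; the matrix A is made up, chosen to have two distinct eigenvalues so the n-distinct-eigenvalues criterion applies.

```python
import numpy as np

# A made-up 2x2 matrix with two distinct eigenvalues (5 and 2), so it
# is guaranteed to be diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix P whose columns
# are corresponding eigenvectors.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Each pair satisfies A v = lambda v ...
for lam, v in zip(eigvals, P.T):
    assert np.allclose(A @ v, lam * v)

# ... and together they give the factorization A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print("eigenvalues:", eigvals)
```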
Orthogonality and related ideas

• The existence of an inner product (the dot product) on R^n lets us define the notions of orthogonal vectors (where ~x · ~y = 0) and the norm of a vector, ‖~x‖ = √(~x · ~x). If ~x · ~y = 0, then ‖~x + ~y‖² = ‖~x‖² + ‖~y‖² (the Pythagorean Theorem in n dimensions).

• A set S is called orthogonal if ~x and ~y are orthogonal for every pair of distinct vectors ~x, ~y ∈ S. Every orthogonal set of non-zero vectors is linearly independent.

• We're particularly interested in orthogonal bases and orthonormal bases, where additionally ‖~x‖ = 1 for every basis vector. (Note: you can turn an orthogonal basis into an orthonormal basis by dividing every basis vector by its length.) To turn an ordinary basis into an orthogonal basis, use Gram-Schmidt (p. 402 onward). (See the Gram-Schmidt sketch after this section.)

• If W is a subspace of R^n, then we define the orthogonal complement W⊥ to be everything in R^n that is orthogonal to everything in W. Any vector ~y ∈ R^n can be written uniquely in the form ~y = ŷ + ~z, where proj_W ~y = ŷ ∈ W and ~z ∈ W⊥. This can be calculated via Theorem 8 (p. 395) if you have an orthogonal basis for W. (See the projection sketch after this section.)

• The vector ŷ from the previous bullet is the closest point in W to ~y (Theorem 9, p. 398).

• One use of the previous fact is that it allows us to find ~x that makes ‖A~x − ~b‖ as small as possible for a given matrix A and ~b. If A~x = ~b is consistent, then we just want to solve A~x = ~b. If not, calculate proj_{Col A} ~b, and solve A~x = proj_{Col A} ~b instead (p. 414). Alternatively, one can solve the normal equations A^T A~x = A^T ~b (p. 411). (See the least-squares sketch after this section.)

• Just as we have generalized many notions from R^n to vector spaces in general, we define the notion of an inner product on a vector space to be anything that has some of the same properties as the dot product (p. 428 for the list of axioms).

• This allows us to define length and orthogonality in a vector space. However, these concepts depend on the particular inner product chosen. In general, there are infinitely many different inner products that can be defined on a single vector space, so there is no "correct" notion of length or orthogonality on a given vector space unless there is a "correct" or "standard" inner product for that vector space (like the dot product for R^n, which gives us the expected notions of length and orthogonality based on our intuition regarding the world around us).

• Given an inner product ⟨·, ·⟩ on a vector space V, and ‖~x‖ = √⟨~x, ~x⟩, we have the following two inequalities (p. 432-433): |⟨~x, ~y⟩| ≤ ‖~x‖‖~y‖ (the Cauchy-Schwarz inequality) and ‖~x + ~y‖ ≤ ‖~x‖ + ‖~y‖ (the triangle inequality).
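Here is a short sketch of the Gram-Schmidt process described above; it is a textbook-style implementation rather than code from the sheet, and the starting basis of R^3 is made up.

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a basis (list of 1-D arrays) into an orthogonal basis by
    subtracting from each vector its projections onto the earlier ones."""
    ortho = []
    for v in vectors:
        w = v.astype(float)
        for u in ortho:
            w -= (w @ u) / (u @ u) * u   # remove the component along u
        ortho.append(w)
    return ortho

# A made-up basis of R^3 for illustration.
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
ortho = gram_schmidt(basis)

# Every pair of distinct vectors should now be orthogonal.
for i in range(len(ortho)):
    for j in range(i + 1, len(ortho)):
        assert abs(ortho[i] @ ortho[j]) < 1e-12

# Normalizing each vector gives an orthonormal basis.
orthonormal = [u / np.linalg.norm(u) for u in ortho]
print(np.round(np.array(orthonormal), 4))
```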
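The orthogonal decomposition ~y = ŷ + ~z can be computed directly from an orthogonal basis of W, in the style of Theorem 8. In this sketch the basis vectors u1, u2 and the vector y are made-up examples.

```python
import numpy as np

# An orthogonal basis for a plane W in R^3 (made up for this sketch;
# note u1 . u2 = 0), plus a vector y to decompose.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])
y = np.array([2.0, 5.0, -3.0])

# Theorem 8 style formula: project y onto each orthogonal basis vector
# and add the pieces up.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat                    # the component in W-perp

# y_hat lies in W and z is orthogonal to everything in W.
assert np.allclose(z @ u1, 0.0) and np.allclose(z @ u2, 0.0)
print("projection:", y_hat, " orthogonal part:", z)
```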
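Finally, a sketch of the least-squares recipe via the normal equations; A and ~b are made up, with A~x = ~b inconsistent so there is something to approximate.

```python
import numpy as np

# An inconsistent system: A has more rows than columns, and b is a
# made-up vector that is not in Col A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Solve the normal equations  A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b - A x_hat is orthogonal to Col A, which is exactly
# what the normal equations encode.
assert np.allclose(A.T @ (b - A @ x_hat), 0.0)

# NumPy's built-in least-squares solver agrees.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_hat, x_lstsq)
print("least-squares solution:", x_hat)   # -> [ 5. -3.]
```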
Some things you should know how to do

• Determine if a set of vectors in a vector space V is linearly independent (writing vectors as linear combinations of the ones before, or using coordinates).

• Given two bases for a vector space, find the corresponding change-of-basis matrix (p. 273 onward).

• Find the
