HARVARD MATH 21B - Review Material


Review Material for the Final Examination of Math 21b
January 17, 2008

(1) Coefficient matrix and augmented matrix of a system of linear equations.

(2) Reduced row-echelon form of a matrix, characterized by three conditions:
a. If a row has nonzero entries, then the first nonzero entry is 1, called the leading 1 in this row.
b. If a column contains a leading 1, then all other entries in that column are zero.
c. If a row contains a leading 1, then each row above contains a leading 1 further to the left.

(3) Reduction of a matrix to reduced row-echelon form by using three kinds of row operations (swapping rows, multiplying a row by a nonzero number, and adding a multiple of a row to another).

(4) Use reduction to reduced row-echelon form to determine whether a system of linear equations is inconsistent, uniquely solvable, or solvable with an infinite number of solutions, and to give a general solution (if it exists) by using free variables.

(5) Use reduction to reduced row-echelon form to determine whether a square matrix is invertible and to find its inverse if it is invertible.

(6) Determinant of a 2 × 2 matrix. Formula for the inverse of a 2 × 2 matrix with nonzero determinant.

(7) The span of a set of vectors. Redundant vectors in a sequence of vectors. Linear dependence and independence of a set of vectors. Subspaces. Bases. Dimension.

(8) Determine the rank and the nullity of a matrix. Find a basis for the image and for the kernel of a matrix.
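The computations in items (4), (5), and (8) can be carried out symbolically. A minimal sketch using sympy (sympy is not part of the review sheet, and the matrix A below is an arbitrary made-up example):

```python
# Sketch of items (4), (5), (8): row reduction, rank, nullity,
# and bases for the image and kernel of a matrix (example matrix is made up).
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

rref_A, pivot_cols = A.rref()   # reduced row-echelon form and pivot columns
rank = A.rank()                 # number of leading 1s in rref(A)
nullity = A.cols - rank         # rank + nullity = number of columns

image_basis = A.columnspace()   # the pivot columns of A form a basis of im(A)
kernel_basis = A.nullspace()    # general solution of A x = 0 via free variables

print(rref_A)                   # rref has leading 1s in columns given by pivot_cols
print(rank, nullity)
```

Each vector returned in `kernel_basis` corresponds to setting one free variable to 1 and the others to 0, exactly the free-variable procedure of item (4).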
The Rank-Nullity Theorem.

(9) Equivalent conditions for the invertibility of an n × n matrix A: unique solvability of A~x = ~b, rref(A) = I_n, rank(A) = n, im(A) = R^n, ker(A) = {~0}, the column vectors forming a basis of R^n, the column vectors spanning R^n, the column vectors being linearly independent.

(10) Special linear transformations: rotations, dilations, projections (onto a line or a plane), reflections, and shears (horizontal and vertical).

(11) The column vectors of the matrix of a linear transformation are the images of the standard basis vectors.

(12) Relation between matrix multiplication and the composition of linear transformations.

(13) Coordinates with respect to a basis of a subspace. Matrix of a linear transformation with respect to a basis. Relation of matrices of the same linear transformation with respect to two different bases. Similar matrices. Powers of similar matrices. Similarity as an equivalence relation. The vector ~x and its coordinates ~y with respect to a basis ~v_1, ..., ~v_n are related by

    ~x = y_1 ~v_1 + ··· + y_n ~v_n = [~v_1 ··· ~v_n] (y_1, ..., y_n)^T = S~y,

where S = [~v_1 ··· ~v_n] and ~y = (y_1, ..., y_n)^T. The matrix A of a linear transformation T from R^n to R^n is related to the matrix B = (b_{jk}) representing T with respect to ~v_1, ..., ~v_n by AS = SB, because

    A~v_k = b_{1k} ~v_1 + ··· + b_{nk} ~v_n = [~v_1 ··· ~v_n] (b_{1k}, ..., b_{nk})^T

for each k, and stacking these columns gives A[~v_1 ··· ~v_n] = [~v_1 ··· ~v_n] B.

(14) Concept of a linear space (also known as a vector space). Addition and scalar multiplication in a linear space and the laws (associativity, commutativity, distributivity, etc.) satisfied by them. Examples of linear spaces: solutions of differential equations, spaces of polynomials, spaces of matrices, etc. Dimension of a linear space. Finite and infinite dimension.

(15) Gram-Schmidt process of inductively constructing orthonormal vectors ~u_1, ..., ~u_m from linearly independent vectors ~v_1, ..., ~v_m in R^n.
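The inductive construction in item (15) might be sketched in numpy as follows (numpy is not part of the sheet, and the input vectors are an arbitrary example):

```python
# Sketch of the Gram-Schmidt process of item (15): from linearly independent
# columns v_1, ..., v_m, build orthonormal columns u_1, ..., u_m.
# v_perp is v_j minus its orthogonal projection onto span(u_1, ..., u_{j-1}).
import numpy as np

def gram_schmidt(V):
    """Columns of V must be linearly independent; returns U with orthonormal columns."""
    U = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        v = V[:, j].astype(float)
        # subtract the projection onto the span of the previously built u's
        v_perp = v - U[:, :j] @ (U[:, :j].T @ v)
        U[:, j] = v_perp / np.linalg.norm(v_perp)  # unit vector along v_perp
    return U

V = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
U = gram_schmidt(V)
# U.T @ U is the 2 x 2 identity: the columns of U are orthonormal
```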
Here ~v_j^⊥ is ~v_j minus its orthogonal projection onto the subspace spanned by ~v_1, ..., ~v_{j-1} (which is the same as the subspace spanned by ~u_1, ..., ~u_{j-1}), and ~u_j is the unit vector in the direction of ~v_j^⊥.

(16) QR decomposition of an n × m matrix A in the form A = QR, where Q is an n × m matrix whose column vectors are orthonormal and R is an m × m matrix which is upper triangular with positive diagonal entries:

    [~v_1 ··· ~v_m] = [~u_1 ··· ~u_m] R,

where R is the upper triangular m × m matrix whose (i, j)-th entry is r_{ij} = ~u_i · ~v_j for i ≤ j.

(17) Orthogonal transformations as length-preserving and orthogonality-preserving transformations. Orthonormal sets of vectors and orthonormal bases. Pythagorean theorem. Cauchy-Schwarz inequality. Angle between vectors.

(18) Transpose of a matrix. Transpose of a product of matrices and inverse of the transpose of a matrix. Symmetric and skew-symmetric matrices. Inner product of two vectors as the matrix product of the transpose of one vector and the other vector: ~v · ~w = ~v^T ~w. The kernel of the transpose of a matrix as the orthogonal complement of the image of the matrix: (im(A))^⊥ = ker(A^T).

(19) Formula QQ^T for the orthogonal projection onto a subspace spanned by orthonormal vectors which are the column vectors of Q. Formula A(A^T A)^{-1} A^T for the orthogonal projection onto a subspace spanned by linearly independent vectors which are the column vectors of A.
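The two projection formulas in item (19) can be compared numerically. A sketch (numpy is not part of the sheet; A is a made-up matrix with independent columns, and Q is obtained from A via the QR decomposition of item (16)):

```python
# Sketch of item (19): A (A^T A)^{-1} A^T and Q Q^T give the same
# orthogonal projection onto im(A) when im(Q) = im(A).
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])            # linearly independent columns

P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection onto im(A)

Q, R = np.linalg.qr(A)                # columns of Q orthonormal, im(Q) = im(A)
P_Q = Q @ Q.T                         # the simpler formula for orthonormal columns

# P and P_Q agree, and both are symmetric and idempotent,
# as an orthogonal projection must be.
```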
Note that the formula A(A^T A)^{-1} A^T reduces to QQ^T when A = Q and reduces to the identity matrix when A is invertible.

(20) Formula ~x* = (A^T A)^{-1} A^T ~b for the unique least-squares solution ~x* of a (possibly inconsistent) system of linear equations A~x = ~b when the condition ker(A) = {~0} is satisfied (which is equivalent to the invertibility of A^T A and also equivalent to the linear independence of the column vectors of A). Application of the least-squares solution to finding a polynomial of given degree which best fits a given collection of data, by regarding the coefficients of the polynomial as the components of the least-squares solution to the system of equations defined by the data fitting.

(21) Determinant defined by induction and expansion down the first column. Determinant computed by expansion down any column and across any row. Formula for the determinant of a 2 × 2 matrix. Formula for the inverse of an invertible 2 × 2 matrix. Formula for the determinant of a 3 × 3 matrix. Formula for the determinant of an upper or lower triangular matrix. Formula for the determinant of an upper or lower triangular partitioned matrix with square matrices as diagonal entries.

(22) Effect of Gauss-Jordan row operations on a determinant. Computation of a determinant by Gaussian elimination. Determinant of the transpose of a matrix. Determinant of a product of matrices. Determinant of the inverse of a matrix. Determinants of similar matrices.

(23) Characteristic equation of a square matrix. Trace
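The least-squares recipe of item (20) can be sketched numerically, fitting a degree-1 polynomial c_0 + c_1 t to data. A minimal example (numpy is not part of the sheet, and the data points are made up):

```python
# Sketch of item (20): least-squares solution x* = (A^T A)^{-1} A^T b,
# applied to fitting the line c0 + c1*t to made-up data points (t_i, b_i).
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.8])          # roughly along the line 1 + t

# each data point gives one equation c0 + c1*t_i = b_i;
# the coefficient matrix has columns 1 and t (linearly independent,
# so A^T A is invertible and the least-squares solution is unique)
A = np.column_stack([np.ones_like(t), t])

x_star = np.linalg.inv(A.T @ A) @ A.T @ b   # normal-equation solution

# numpy's built-in least-squares solver gives the same coefficients
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Here `x_star[0]` and `x_star[1]` are the intercept and slope of the best-fitting line; the same pattern with columns 1, t, t^2, ... fits higher-degree polynomials.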

