
18.06 Spring 2009 Exam 2 Practice

General comments

Exam 2 covers the first 18 lectures of 18.06. It does not cover determinants (lectures 19 and 20). There will also be no questions on graphs and networks. The topics covered are (very briefly summarized):

1. All of the topics from exam 1.

2. Linear independence [key point: the columns of a matrix A are independent if N(A) = {0}], bases (an independent set of vectors that spans a space), and dimension of subspaces (the number of vectors in any basis).

3. The four fundamental subspaces (key points: their dimensions for a given rank r and m×n matrix A, their relationship to the solutions [if any] of Ax = b, their orthogonal complements, and how/why we can find bases for them via the elimination process).

4. What happens to the four subspaces as we do matrix operations, especially elimination steps, and more generally how the subspaces of AB compare to those of A and B. The fact (important for projection and least-squares!) that AᵀA has the same rank as A, the same null space as A, and the same column space as Aᵀ, and why (we proved this in class and another way in homework).

5. Orthogonal complements S⊥ for subspaces S, especially (but not only) the four fundamental subspaces.

6. Orthogonal projections: given a matrix A, the projection of b onto C(A) is p = Ax̂, where x̂ solves AᵀAx̂ = Aᵀb [always solvable since C(AᵀA) = C(Aᵀ)]. If A has full column rank, then AᵀA is invertible and we can write the projection matrix P = A(AᵀA)⁻¹Aᵀ (so that Ax̂ = Pb, but it is much quicker to solve AᵀAx̂ = Aᵀb by elimination than to compute P in general). e = b − Ax̂ is in C(A)⊥ = N(Aᵀ), and I − P is the projection matrix onto N(Aᵀ).

7. Least-squares: x̂ minimizes ‖Ax − b‖² over all x, and is the least-squares solution. That is, p = Ax̂ is the closest point to b in C(A). Application to least-squares curve fitting, minimizing the sum of the squares of the errors.

8. Orthonormal bases, forming the columns of a matrix Q with QᵀQ = I. The projection matrix onto C(Q) is just QQᵀ, and x̂ = Qᵀb. Obtaining Q from A (i.e., an orthonormal basis from any basis) by Gram-Schmidt, and the correspondence of this process to the A = QR factorization, where R = QᵀA is invertible and upper-triangular. Using A = QR to solve equations (either Ax = b or AᵀAx̂ = Aᵀb). Q is an orthogonal matrix only if it is square, in which case Qᵀ = Q⁻¹.

9. Dot products of functions, and hence Gram-Schmidt, orthonormal bases (e.g. Fourier series or orthogonal polynomials), orthogonal projection, and least-squares for functions.

As usual, the exam questions may turn these concepts around a bit, e.g. giving the answer and asking you to work backwards towards the question, or asking about the same concept in a slightly changed context. We want to know that you have really internalized these concepts: not just memorized an algorithm, but know why the method works and where it came from.

Some practice problems

The 18.06 web site has exams from previous terms that you can download, with solutions. I've listed a few practice exam problems that I like below, but there are plenty more to choose from. (Note: exam 2 in several previous terms asked about determinants; we won't have any determinant questions until exam 3.) The exam will consist of 3 or 4 questions (perhaps with several parts each), and you will have one hour. You can find the solutions to these problems on the 18.06 web site (in the section for old exams/psets). On the last page I give practice problems for orthogonal functions and orthogonal projections of functions.

1. (Fall 2002 exam 2.) (a) Choose c and the last column of Q so that you have an orthogonal matrix:

   Q = c [  1  -1  -1  ? ]
         [ -1   1  -1  ? ]
         [ -1  -1  -1  ? ]
         [ -1  -1   1  ? ]

(b) Project b = (1, 1, 1, 1)ᵀ onto the first column of Q. Then project b onto the plane spanned by the first two columns. (c) Suppose the last column of this matrix (where the ?'s are) were changed to (1, 1, 1, 1)ᵀ. Call this new matrix A.
If Gram-Schmidt is applied to the 4 columns of A, what would be the 4 outputs q1, q2, q3, q4? (Don't do a lot of calculations... please!)

2. (Fall 2008 exam 2.) [The parts of this question are independent and can be done in any order.] (a) P is the projection matrix onto C(A), where A has independent columns. Q is a square orthogonal matrix with the same number of rows as A. In its simplest form, in terms of P and Q, what is the projection matrix onto the column space of QA? (b) The vectors a, b, and c are independent. The matrix P is the projection matrix onto the span of a and b. Suppose we apply Gram-Schmidt to the vectors a, b, and c to produce orthonormal vectors q1, q2, and q3. Write the unit vector q3 in simplest form in terms of P and c only. (c) The vectors a, b, and c are independent, and the matrix A has these three vectors as its columns. You are given the QR decomposition of A, where Q is orthogonal and R is 3×3 upper-triangular as usual. Write ‖c‖ in terms of only the elements of R, in simplest form.

3. (Fall 2008 exam 2.) Suppose we have obtained from measurements n data points (ti, bi), and you are asked to find a best least-squares fit function of the form y = C + Dt + E(1 − t). Are C, D, and E uniquely determined? Write down a solvable system of equations that gives a solution to the least-squares problem.

4. (Fall 2008 exam 2.) (a) If A is invertible, must the column space of A⁻¹ be the same as the column space of A? (b) If A is square, must the column space of A² be the same as the column space of A?

5. (Fall 2005 exam 1.) Suppose A is m×n with linearly dependent columns. Complete with as much true information as possible: (a) The rank of A is .....? (b) The nullspace of A contains .....? (c) The equation Aᵀy = b has no solution for some right-hand sides b because ......? (more words needed)

6. (Fall 2005 exam 1.) Suppose A is the 3×4 matrix

   A = [ 1  2  3  4 ]
       [ 2  3  4  5 ]
       [ 3  4  5  6 ]

(a) A basis for C(A) is .....? (b) For which vectors b = (b1, b2, b3)ᵀ does Ax = b have a solution?
(Give specific conditions on b1, b2, b3.) (c) Explain why there is no 4×3 matrix B for which AB = I (3×3). Give a good reason (the mere fact that A is rectangular is not sufficient).

7. (Spring 2005 exam 1.) Suppose the columns of a 7×4 matrix A are linearly independent. (a) After row operations reduce A to U or R, how many rows will be all zero (or is it impossible to tell)? (b) Assume that no row swaps were required for elimination. What is the row space of A? Explain why this equation will surely
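The least-squares setup in problem 3 can be checked numerically. The sketch below (a minimal illustration using NumPy and made-up data points, not part of the exam) builds the matrix A whose columns are 1, t, and 1 − t. Since the first column equals the sum of the other two, A has dependent columns, so AᵀA is singular and C, D, E are not uniquely determined; but the projection p = Ax̂ still is, and the error e = b − Ax̂ is orthogonal to C(A).

```python
import numpy as np

# Made-up measurement data (ti, bi) for illustration only.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 0.0, 5.0])

# Columns: 1, t, 1 - t.  Note col0 = col1 + col2, so the columns are dependent.
A = np.column_stack([np.ones_like(t), t, 1 - t])
rank = np.linalg.matrix_rank(A)          # 2, not 3: A^T A is singular

# lstsq returns one least-squares solution even when A^T A is singular.
xhat, *_ = np.linalg.lstsq(A, b, rcond=None)
p = A @ xhat                             # the projection of b onto C(A)
e = b - p                                # the error, in C(A)-perp = N(A^T)

print(rank)                              # 2
print(np.allclose(A.T @ e, 0))           # True: A^T (b - A xhat) = 0
```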
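The Gram-Schmidt/QR correspondence from topic 8 can also be sketched in code. The following is a hedged illustration (NumPy, with a small made-up matrix; classical Gram-Schmidt is shown for clarity, not numerical robustness): it orthonormalizes the columns of A, forms R = QᵀA (upper-triangular), and checks that solving Rx̂ = Qᵀb gives the same projection of b as QQᵀ.

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A (assumed independent)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract projection onto q_i
        Q[:, j] = v / np.linalg.norm(v)          # normalize
    return Q

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])        # independent columns, made up for illustration
Q = gram_schmidt(A)
R = Q.T @ A                        # upper-triangular: later q's are orthogonal
                                   # to earlier columns
b = np.array([1.0, 2.0, 3.0])
xhat = np.linalg.solve(R, Q.T @ b) # least-squares xhat via R xhat = Q^T b

print(np.allclose(Q.T @ Q, np.eye(2)))      # True: orthonormal columns
print(np.allclose(Q @ Q.T @ b, A @ xhat))   # True: both equal the projection of b
```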

