
# UW-Madison CS 513 - Lecture 13: Least squares via QR-factorization


**Amos Ron, University of Wisconsin–Madison — March 1, 2021** (CS 513, remote learning, S21)

## Outline

1. More on least squares
   - QR-factoring a rectangular matrix
   - Orthogonal transformation to least squares
2. Application: least squares approximation
   - The approximation problem
   - An example

## Back to QR factorization

Reviewing the factorization step. In the $j$-th step of the Householder algorithm for QR factorization:

- We create a Householder reflector $H_j$ based on $x = A_{j-1}e_j$.
- We define $A_j = H_j A_{j-1}$.
- The first $j-1$ columns of $A_{j-1}$ are preserved in $A_j$, and $A_je_j$ becomes a "good" column.
- When computing $A_j$ we need to update all the columns $k = j+1, \dots, m$:
  $$A_je_k = H_j(A_{j-1}e_k), \qquad k \ge j+1.$$

How do we adapt this when $A_{m\times n}$ has $m > n$?
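The Householder step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's own code; the function name `householder_qr` and the sign convention are my choices.

```python
import numpy as np

def householder_qr(A):
    """QR-factor an m x n matrix (m >= n) with Householder reflectors.

    Runs n steps; step j builds H_j from column j of the current matrix,
    zeroes that column below the diagonal, and updates columns j+1, ..., n,
    leaving the first j-1 (already-processed) columns untouched.
    """
    R = A.astype(float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for j in range(n):                         # only n steps, not m - 1
        x = R[j:, j]                           # part of column j to be zeroed
        v = x.copy()
        # add sign(x[0]) * ||x|| to the first entry to avoid cancellation
        v[0] += (1.0 if x[0] >= 0 else -1.0) * np.linalg.norm(x)
        nv = np.linalg.norm(v)
        if nv == 0.0:                          # column already zero: nothing to do
            continue
        v /= nv
        R[j:, j:] -= 2.0 * np.outer(v, v @ R[j:, j:])   # R <- H_j R
        Q[:, j:] -= 2.0 * np.outer(Q[:, j:] @ v, v)     # Q <- Q H_j
    return Q, R
```

Afterwards `A` is recovered as `Q @ R` with `Q` an orthogonal $m \times m$ matrix and `R` upper triangular $m \times n$.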
## Adapting QR factorization to rectangular $A$

When $A_{m\times n}$ has $m > n$, there are only $n$ steps, not $m-1$. The $j$-th step is as before:

- We create a Householder reflector $H_j$ based on $x = A_{j-1}e_j$.
- We define $A_j = H_jA_{j-1}$.
- The first $j-1$ columns of $A_{j-1}$ are preserved in $A_j$, and $A_je_j$ becomes a "good" column.
- When computing $A_j$ we need to update all the columns $k = j+1, \dots, n$:
  $$A_je_k = H_j(A_{j-1}e_k), \qquad k \ge j+1.$$

(Short demo.)

## Orthogonal transformation to least squares

We are given a least squares problem $Ax \approx b$, and we multiply both sides by an orthogonal $Q_{m\times m}$: $QAx \approx Qb$. Since multiplication by an orthogonal matrix preserves the 2-norm,

$$\|Ax - b\|_2 = \|Q(Ax - b)\|_2 = \|QAx - Qb\|_2,$$

so minimizing $\|QAx - Qb\|_2$ is the same as minimizing $\|Ax - b\|_2$: if $x$ is a least squares solution of the new system, then $x$ is also the least squares solution of the original problem.

### Solving least squares via QR factorization

- Step I: factor $A = QR$, with $Q_{m\times m}$ orthogonal and $R_{m\times n}$ upper triangular.
- Step II: solve the least squares problem $Rx \approx Q'b$.

We still need to know how to solve a least squares problem with an upper triangular matrix.
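The two-step recipe can be tried directly with NumPy's built-in QR routine (a sketch; the variable names and the random test problem are mine):

```python
import numpy as np

# Step I: factor A = Q1 R1 (reduced form: Q1 is m x n, R1 is n x n upper triangular).
# Step II: solve the square triangular system R1 x = Q1' b by back-substitution.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))     # m = 8 > n = 3
b = rng.standard_normal(8)

Q1, R1 = np.linalg.qr(A, mode='reduced')
x = np.linalg.solve(R1, Q1.T @ b)

# The reference least-squares solver finds the same minimizer.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
```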
So we need to know how to solve $Rx \approx b$ with $R_{m\times n}$, $m > n$, upper triangular (i.e., $R(i,j) = 0$ for $i > j$). How? Discard all the equations $i = n+1, \dots, m$ (those rows of $R$ are entirely zero), and solve the resulting square system.

### Algorithm: solving least squares via QR factorization

1. QR-factor $A$.
2. Remove from $Q$ all columns $j = n+1, \dots, m$: set $Q_1 = Q(:, 1{:}n)$.
3. Solve the square $n \times n$ upper triangular system $Q_1'Ax = Q_1'b$.

## Theoretical explanation of what we did

Set $W = \operatorname{range}(A)$, and assume that the columns $w_1, \dots, w_n$ of $A$ are a basis for $W$. We need $Ax - b \perp W$, i.e.,

$$w_i'(Ax - b) = 0, \qquad i = 1, \dots, n,$$

which is equivalent to the condition $A'(Ax - b) = 0$. The only thing the QR factorization does is compute a new (orthonormal) basis $q_1, \dots, q_n$ for $W$, with $q_i = Q(:, i)$. So we need $q_i'(Ax - b) = 0$, $i = 1, \dots, n$, which is equivalent to the condition

$$Q_1'(Ax - b) = 0.$$
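The orthogonality characterization is easy to verify numerically: at the least squares solution, the residual $Ax - b$ is orthogonal both to the columns of $A$ and to the orthonormal basis $Q_1$. A sketch on a made-up random problem:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))
b = rng.standard_normal(10)

# Solve the least squares problem via QR.
Q1, R1 = np.linalg.qr(A, mode='reduced')
x = np.linalg.solve(R1, Q1.T @ b)
r = A @ x - b                        # the residual

# Both conditions A'(Ax - b) = 0 and Q1'(Ax - b) = 0 hold to rounding error.
print(np.linalg.norm(A.T @ r))
print(np.linalg.norm(Q1.T @ r))
```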
The punch line is the condition number of the new equation:

$$\operatorname{cond}_2(A'A) = \operatorname{cond}_2(A)^2, \qquad \operatorname{cond}_2(Q_1'A) = \operatorname{cond}_2(A).$$
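The punch line above can be checked numerically: forming the normal equations squares the condition number, while the QR route leaves it unchanged (a sketch on a random matrix; note that $Q_1'A = R_1$):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
Q1, _ = np.linalg.qr(A, mode='reduced')

c = np.linalg.cond(A)
cond_normal = np.linalg.cond(A.T @ A)    # normal equations: about c**2
cond_qr = np.linalg.cond(Q1.T @ A)       # QR route: about c
```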
