UCSD CSE 252C - Sparsity in Linear Least Squares

Contents
• Linear Least Squares Problem
• Is Sparsity Useful?
• Characterization of LS solutions
• Time Complexity of Direct Methods
• What kind of sparsity is useful?
• Sparse Data Structures
• Gaussian Elimination: Fill-in
• Processing Order and Fill-in
• Steps in a Sparse LS Problem
• Graph for Sparse Symmetric Matrix
• Predicting Structure of A⊤A
• Predicting Structure of Cholesky Factor R
• Filled Graph
• Structure of Cholesky Factor
• Efficient Computation of Filled Graph
• Fill Minimizing Column Orderings
• Minimum Degree Ordering
• Without Ordering
• With Minimum Degree Ordering
• Does not always work!
• Nested Dissection Ordering
• Nested Dissection: Block Structure and Elimination Tree
• Numerical Factorization
• Numerical Cholesky Factorization
• Cholesky vs QR
• Sparse QR algorithm
• QR-Decomposition
• Row Sequential QR Algorithm
• Summary

Sparsity in Linear Least Squares
Graph Theoretic Approaches to Sparse Factorization
Manmohan Krishna Chandraker
CSE 252C, Fall 2004, UCSD

Linear Least Squares Problem
• Find x ∈ R^n that minimizes ‖Ax − b‖, where A ∈ R^{m×n}, b ∈ R^m, m ≥ n.
• Residual vector: r = b − Ax.
• Sparse LLS: A is sparse.

Is Sparsity Useful?
• Electric grids
• Geodetic measurements
• Bundle adjustment

Characterization of LS solutions
• Normal equations: x is a solution to the least squares problem if and only if A⊤Ax = A⊤b.
  Solution method: Cholesky decomposition.
• QR decomposition: min_x ‖Ax − b‖ = min_x ‖Q⊤(Ax − b)‖ for Q ∈ SO(m).

Time Complexity of Direct Methods
The structure of A influences the choice of algorithm for Ax = b.
• Dense (Gaussian elimination): (2/3)n³ + O(n²) flops.
• Symmetric positive definite (Cholesky decomposition): (1/3)n³ + O(n²) flops.
• Triangular (simple substitution): n² + O(n) flops.

What kind of sparsity is useful?
• When there are O(n) non-zero entries:
  • sparse data structures carry extra storage overhead, and
  • arithmetic operations on them are slower (due to indirect addressing).
• When the sparsity has a pattern.

Sparse Data Structures
Static, compressed row storage:

$$A = \begin{bmatrix}
a_{11} & 0 & a_{13} & 0 & 0 \\
a_{21} & a_{22} & 0 & 0 & 0 \\
0 & 0 & a_{33} & 0 & 0 \\
0 & a_{42} & 0 & a_{44} & 0 \\
0 & 0 & 0 & a_{54} & a_{55} \\
0 & 0 & 0 & 0 & a_{65}
\end{bmatrix}$$

AC = (a_{11}, a_{13} | a_{22}, a_{21} | a_{33} | a_{42}, a_{44} | a_{54}, a_{55} | a_{65})
IA = (1, 3, 5, 6, 8, 10, 11)
JA = (1, 3, 2, 1, 3, 2, 4, 4, 5, 5)

Here AC holds the non-zeros row by row, IA points to where each row starts in AC, and JA gives the column index of each stored entry.
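As a concrete counterpart (not part of the original slides), the following Python sketch builds this example with placeholder numeric values and lets SciPy produce the compressed-row arrays; its 0-based indptr and indices correspond to IA and JA above.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Dense version of the 6x5 example matrix from the slide,
# with placeholder numeric values standing in for the a_ij.
A = np.array([
    [1.1, 0.0, 1.3, 0.0, 0.0],
    [2.1, 2.2, 0.0, 0.0, 0.0],
    [0.0, 0.0, 3.3, 0.0, 0.0],
    [0.0, 4.2, 0.0, 4.4, 0.0],
    [0.0, 0.0, 0.0, 5.4, 5.5],
    [0.0, 0.0, 0.0, 0.0, 6.5],
])

S = csr_matrix(A)

print(S.data)     # non-zeros stored row by row, like AC
print(S.indptr)   # row pointers, like IA (but 0-based)
print(S.indices)  # column index of each stored entry, like JA (0-based)
```

SciPy keeps the entries of each row sorted by column, so its data array lists a_{21} before a_{22}, whereas the slide's AC stores a_{22} first; otherwise the arrays match IA and JA shifted to 0-based indexing.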
Gaussian Elimination: Fill-in
Fill-in: non-zero elements created by Gaussian elimination.

$$A = A^{(1)} = \begin{bmatrix} a & r^\top \\ c & \bar{A} \end{bmatrix},
\qquad a \in \mathbb{R}^{1\times 1},\ \bar{A} \in \mathbb{R}^{(n-1)\times(n-1)}$$

$$A^{(1)} = \begin{bmatrix} 1 & 0 \\ \frac{c}{a} & I \end{bmatrix}
\begin{bmatrix} a & r^\top \\ 0 & A^{(2)} \end{bmatrix}
\;\Rightarrow\; A^{(2)} = \bar{A} - \frac{c\,r^\top}{a}$$

Repeat the same for A^{(2)}, say A^{(2)} = L_2 U_2.

Processing Order and Fill-in
• Column order greatly affects fill-in.
  [Figure: sparsity patterns of the same matrix under two different column orderings.]
• Find the column order that minimizes fill-in: NP-complete!!

Steps in a Sparse LS Problem
Symbolic factorization:
• Find the structure of A⊤A.
• Determine a column order that reduces fill-in.
• Predict the sparsity structure of the decomposition and allocate storage.
Numerical solution:
• Read numerical values into the data structure.
• Do a numerical factorization and solve.

Graph for Sparse Symmetric Matrix
[Figure: 7×7 symmetric sparsity pattern and its adjacency graph.]
• A node v_i for each column i.
• (v_i, v_j) ∈ E ⇔ a_{ij} ≠ 0.

Predicting Structure of A⊤A
• A⊤A = Σ_{i=1}^{m} a_i a_i⊤, where a_i⊤ is the i-th row of A.
• G(A⊤A) = direct sum of G(a_i a_i⊤), i = 1, …, m.
• The non-zeros of a_i⊤ form a clique subgraph.

Predicting Structure of Cholesky Factor R
• Elimination graphs: represent fill-in during factorization.

Filled Graph
• G_F(A): the direct sum of the elimination graphs.

Structure of Cholesky Factor
• The filled graph bounds the structure of the Cholesky factor: G(R + R⊤) ⊂ G_F(A).
• Equality when no-cancellation holds.
  [Figure: sparsity patterns of A and of its Cholesky factor R.]

Efficient Computation of Filled Graph
• Theorem: Let G(A) = (V, E) be the ordered graph of A. Then (v_i, v_j) is an edge of the filled graph G_F(A) if and only if (v_i, v_j) ∈ E or there is a path in G(A) from v_i to v_j passing only through vertices with numbering less than min{i, j}.
• Allows construction of the filled graph in O(n|E|) time.

Fill Minimizing Column Orderings
• Reordering the rows of A does not affect A⊤A.
• Reordering the columns of A keeps the number of non-zeros in A⊤A the same,
  but greatly affects the number of non-zeros in R.
• Heuristics to reduce fill-in:
  • Minimum degree ordering
  • Nested dissection orderings

Minimum Degree Ordering
Let G^(0) = G(A).
for i = 1, …, n − 1:
    Select a node v in G^(i−1) of minimal degree.
    Choose v as the next pivot.
    Update the elimination graph to get G^(i).
end

Without Ordering
• Order 1, 2, 3, 4, 5, 6, 7 → fill-in = 10.

With Minimum Degree Ordering
• Order 4, 5, 6, 7, 1, 2, 3 → fill-in = 0 !!

Does not always work!
• Order 1, 2, …, 9 → fill-in = 0.
• Minimum degree node: 5 → fills in (4, 6).

Nested Dissection Ordering
• Reorder to obtain a block angular sparse matrix.
  [Figure: graph dissected into regions 1–4.]

Nested Dissection: Block Structure and Elimination Tree

$$A = \begin{bmatrix} A_1 & & B_1 \\ & A_2 & B_2 \end{bmatrix}
\qquad
A = \begin{bmatrix}
A_1 & & & & B_1 & & D_1 \\
 & A_2 & & & B_2 & & D_2 \\
 & & A_3 & & & C_3 & D_3 \\
 & & & A_4 & & C_4 & D_4
\end{bmatrix}$$

[Figure: elimination tree over blocks 1–4.]

Numerical Factorization
• Mathematically, the Cholesky factor of A⊤A is the same as R in the QR-decomposition of A.
• Numerical issues govern the choice of algorithm.
• The symbolic factorization is the same for both.
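To illustrate this last point, here is a short NumPy sketch (not from the slides; the test matrix is an arbitrary random example) checking that the Cholesky factor of A⊤A and the R from a QR decomposition of A agree once the sign ambiguity in the rows of R is fixed.

```python
import numpy as np

# Hypothetical small full-rank system, only to illustrate the claim above:
# the Cholesky factor of A^T A equals R from A = QR, up to row signs.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))

# Upper-triangular Cholesky factor of the normal-equations matrix.
R_chol = np.linalg.cholesky(A.T @ A).T

# R from a (reduced) QR decomposition of A.
Q, R_qr = np.linalg.qr(A)

# Fix the sign ambiguity: force a positive diagonal on R.
R_qr_pos = np.sign(np.diag(R_qr))[:, None] * R_qr

print(np.allclose(R_chol, R_qr_pos))  # True (up to rounding)
```

In exact arithmetic the two coincide; in floating point, forming A⊤A roughly squares the condition number, which is the kind of numerical issue that makes QR preferable for ill-conditioned problems.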


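As a companion to the minimum degree pseudocode earlier in the deck, the following pure-Python sketch runs the same loop on an adjacency-set representation of G(A⊤A); the function name and the small example graph are illustrative only.

```python
def minimum_degree_order(adj):
    """Return a fill-reducing elimination order for an undirected graph.

    adj: dict mapping each vertex to the set of its neighbours
         (the graph G(A^T A) from the slides). The input is not modified.
    """
    # Work on a copy: the elimination graph is updated in place.
    g = {v: set(nbrs) for v, nbrs in adj.items()}
    order = []
    while g:
        # Select a node of minimal degree in the current elimination graph.
        v = min(g, key=lambda u: len(g[u]))
        order.append(v)
        nbrs = g.pop(v)
        # Eliminating v makes its neighbours pairwise adjacent (a clique);
        # every edge added here corresponds to fill-in in the Cholesky factor.
        for u in nbrs:
            g[u].discard(v)
            g[u] |= nbrs - {u}
    return order


# Tiny hypothetical example: a star graph. Eliminating the low-degree
# leaves before the centre creates no fill-in.
star = {1: {2, 3, 4}, 2: {1}, 3: {1}, 4: {1}}
print(minimum_degree_order(star))  # -> [2, 3, 4, 1] (ties broken by dict order)
```

Production orderings use refinements such as approximate minimum degree (AMD) with quotient-graph updates rather than this simple quadratic-time loop.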