UTK CS 594 - Iterative Methods in Linear Algebra

Iterative Methods in Linear Algebra (part 2)
Stan Tomov
Innovative Computing Laboratory, Computer Science Department, The University of Tennessee
CS 594-0004, Wednesday April 15, 2009

Outline
Part I: Krylov iterative solvers
Part II: Convergence and preconditioning
Part III: Iterative eigen-solvers

Part I: Krylov iterative solvers

Krylov iterative solvers
Building blocks for Krylov iterative solvers covered so far:
- Projection/minimization in a subspace
  - Petrov-Galerkin conditions
  - Least squares minimization, etc.
- Orthogonalization
  - CGS and MGS
  - Cholesky- or Householder-based QR

We also covered abstract formulations for iterative solvers and eigen-solvers.

What are the goals of this lecture?
- Give specific examples of Krylov solvers.
- Show how the examples relate to the abstract formulation.
- Show how the examples relate to the building blocks covered so far, specifically to projection and orthogonalization.
But we are not going into the details!

How are these techniques related to Krylov iterative solvers? Remember the projection slides 26 & 27 of Lecture 7:
- Projection in a subspace is the basis for an iterative method; here the projection is in V.
- In Krylov methods V is the Krylov subspace
    K_m(A, r_0) = span{r_0, A r_0, A^2 r_0, ..., A^{m-1} r_0},
  where r_0 ≡ b − A x_0 and x_0 is an initial guess.
- Often V or W are orthonormalized. The projection is 'easier' to find when we work with an orthonormal basis (e.g. problem 4 from homework 5: projection in a general vs. an orthonormal basis).
- The orthonormalization can be CGS, MGS, Cholesky- or Householder-based, etc.

Krylov iterative methods
To summarize, Krylov iterative methods in general
- expand the Krylov subspace by a matrix-vector product, and
- do a projection in it.
The various methods result from specific choices of the expansion and the projection.

A specific example: the Conjugate Gradient Method (CG)

Conjugate Gradient Method
The method is for SPD matrices. Both V and W are the Krylov subspaces, i.e. at iteration i
    V ≡ W ≡ K_i(A, r_0) ≡ span{r_0, A r_0, ..., A^{i-1} r_0}.
The projection x_i ∈ K_i(A, r_0) satisfies the Petrov-Galerkin conditions
    (A x_i, φ) = (b, φ)  for all φ ∈ K_i(A, r_0).

Conjugate Gradient Method (continued)
At every iteration there is a way (to be shown later) to construct a new search direction p_i such that
    span{p_0, p_1, ..., p_i} ≡ K_{i+1}(A, r_0)  and  (A p_i, p_j) = 0 for i ≠ j.
Note: A is SPD ⇒ (A p_i, p_j) ≡ (p_i, p_j)_A can be used as an inner product, i.e. p_0, ..., p_i is a (·,·)_A-orthogonal basis for K_{i+1}(A, r_0).
⇒ We can easily find x_{i+1} ≈ x as
    x_{i+1} = x_0 + α_0 p_0 + ··· + α_i p_i   such that   (A x_{i+1}, p_j) = (b, p_j) for j = 0, ..., i.
Namely, because of the (·,·)_A orthogonality of p_0, ..., p_i, at iteration i+1 we have to find only α_i:
    (A x_{i+1}, p_i) = (A(x_i + α_i p_i), p_i) = (b, p_i)   ⇒   α_i = (r_i, p_i) / (A p_i, p_i).
Note: x_i above can actually be replaced by any x_0 + v, v ∈ K_i(A, r_0). (Why?)

Conjugate Gradient Method (continued)
Conjugate Gradient Method
1: Compute r_0 = b − A x_0 for some initial guess x_0
2: for i = 0, 1, ... do
3:   ρ_i = r_i^T r_i
4:   if i = 0 then
5:     p_0 = r_0
6:   else
7:     p_i = r_i + (ρ_i / ρ_{i-1}) p_{i-1}
8:   end if
9:   q_i = A p_i
10:  α_i = ρ_i / (p_i^T q_i)
11:  x_{i+1} = x_i + α_i p_i
12:  r_{i+1} = r_i − α_i q_i
13:  check convergence; continue if necessary
14: end for

Note:
- One matrix-vector product per iteration (at line 9).
- Two inner products per iteration (lines 3 and 10).
- In exact arithmetic r_{i+1} = b − A x_{i+1} (apply A to both sides of line 11 and subtract from b to get line 12).
- The update for x_{i+1} is as pointed out before, i.e. with
    α_i = (r_i, r_i) / (A p_i, p_i) = (r_i, p_i) / (A p_i, p_i),
  since (r_i, p_{i-1}) = 0 (exercise).
- Other relations to be proved (exercise): the p_i's span the Krylov space, the p_i's are (·,·)_A-orthogonal, etc.

Conjugate Gradient Method (continued)
To sum it up:
- In exact arithmetic we get the exact solution in at most n steps, i.e.
    x = x_0 + α_0 p_0 + ··· + α_i p_i + α_{i+1} p_{i+1} + ··· + α_{n-1} p_{n-1}.
- At every iteration one more term α_j p_j is added to the current approximation:
    x_i = x_0 + α_0 p_0 + ··· + α_{i-1} p_{i-1}
    x_{i+1} = x_0 + α_0 p_0 + ··· + α_{i-1} p_{i-1} + α_i p_i ≡ x_i + α_i p_i.
- Note: we do not have to solve a linear system at every iteration, because of the A-orthogonal basis that we manage to maintain and expand at every iteration.
- It can be proved that the error e_i = x − x_i satisfies
    ||e_i||_A ≤ 2 ( (√κ(A) − 1) / (√κ(A) + 1) )^i ||e_0||_A.
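To make the CG listing above concrete, the following is a minimal NumPy sketch of the same iteration; it is not part of the original lecture. The stopping criterion, the tolerance `tol`, and the commented test problem are assumptions added for illustration, and the `i = 0` branch of the listing is handled by initializing p = r_0 before the loop.

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-10, maxiter=1000):
    """Conjugate Gradient for an SPD matrix A, following the 14-line listing above."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                       # line 1: r_0 = b - A x_0
    p = r.copy()                        # lines 4-5: p_0 = r_0 (the i = 0 branch)
    rho = r @ r                         # line 3: rho_i = r_i^T r_i
    for i in range(maxiter):
        q = A @ p                       # line 9: q_i = A p_i (one mat-vec per iteration)
        alpha = rho / (p @ q)           # line 10: alpha_i = rho_i / (p_i^T q_i)
        x = x + alpha * p               # line 11
        r = r - alpha * q               # line 12
        rho_new = r @ r
        if np.sqrt(rho_new) <= tol * np.linalg.norm(b):   # line 13 (assumed criterion)
            break
        p = r + (rho_new / rho) * p     # line 7: p_i = r_i + (rho_i / rho_{i-1}) p_{i-1}
        rho = rho_new
    return x

# Illustrative use on a small SPD system:
# A = np.array([[4.0, 1.0], [1.0, 3.0]]); b = np.array([1.0, 2.0])
# x = cg(A, b)        # should agree with np.linalg.solve(A, b)
```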
Building an orthogonal basis for a Krylov subspace
We have seen the importance of
- defining projections (not just for linear solvers),
- abstract linear-solver and eigen-solver formulations, and
- a specific example in CG, where the basis for the Krylov subspaces is A-orthogonal (A is SPD).
We have also seen how to build such a basis: CGS, MGS, Cholesky- or Householder-based QR, etc. These techniques can be used in a method specifically designed for Krylov subspaces (general non-Hermitian matrix), namely in Arnoldi's Method.

Arnoldi's Method
Arnoldi's method builds an orthogonal basis for K_m(A, r_0); A can be general, non-Hermitian.
1: v_1 = r_0 / ||r_0||_2
2: for j = 1 to m do
3:   h_ij = (A v_j, v_i) for i = 1, ..., j
4:   w_j = A v_j − h_1j v_1 − ... − h_jj v_j
5:   h_{j+1,j} = ||w_j||_2
6:   if h_{j+1,j} = 0, stop
7:   v_{j+1} = w_j / h_{j+1,j}
8: end for

Note:
- This orthogonalization is based on CGS (line 4):
    w_j = A v_j − (A v_j, v_1) v_1 − ... − (A v_j, v_j) v_j
  ⇒ up to iteration j the vectors v_1, ..., v_j are orthogonal.
- The space spanned by this orthogonal basis grows by taking the next vector to be A v_j.
- If we do not exit at step 6 we will have K_m(A, r_0) = span{v_1, v_2, ..., v_m} (exercise).

Arnoldi's Method (continued)
Arnoldi's method in matrix notation. Denote
    V_m ≡ [v_1, ..., v_m],   H_{m+1} = {h_ij}, an (m+1) × m matrix,
and by H_m the matrix H_{m+1} without its last row. Note that H_m is upper Hessenberg (zero entries below the first subdiagonal) and
    A V_m = V_m H_m + w_m e_m^T,
    V_m^T A V_m = H_m   (exercise).

Arnoldi's Method (continued)
Variations:
- explained here using CGS;
- can be implemented with MGS, Householder, etc.
How to use it in linear solvers? An example with the Full Orthogonalization Method (FOM).

FOM
1: β = ||r_0||_2
2: Compute v_1, ..., v_m with Arnoldi
3: y_m = β H_m^{-1} e_1
4: x_m = x_0 + V_m y_m

We look for a solution of the form
    x_m = x_0 + y_m(1) v_1 + ··· + y_m(m) v_m ≡ x_0 + V_m y_m.
The Petrov-Galerkin conditions will be
    V_m^T A x_m = V_m^T b
    ⇒ V_m^T A (x_0 + V_m y_m) = V_m^T b
    ⇒ V_m^T A V_m y_m = V_m^T r_0
    ⇒ H_m y_m = V_m^T r_0 = β e_1,
which is given by steps 3 and 4 of the algorithm.
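As a companion to the Arnoldi and FOM listings above, here is a minimal NumPy sketch, not from the lecture. The function names `arnoldi` and `fom`, the breakdown tolerance, and the early-termination handling are assumptions; the orthogonalization follows the CGS form of lines 3-4 of the Arnoldi listing, and the FOM steps follow the four-step listing above.

```python
import numpy as np

def arnoldi(A, r0, m):
    """CGS-based Arnoldi: orthonormal basis V for K_m(A, r0) and the square Hessenberg H_m."""
    n = r0.shape[0]
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = r0 / np.linalg.norm(r0)            # v_1 = r_0 / ||r_0||_2
    for j in range(m):
        w = A @ V[:, j]                          # expand the subspace with A v_j
        H[:j + 1, j] = V[:, :j + 1].T @ w        # h_ij = (A v_j, v_i), i = 1..j (CGS, line 3)
        w = w - V[:, :j + 1] @ H[:j + 1, j]      # w_j = A v_j - sum_i h_ij v_i (line 4)
        H[j + 1, j] = np.linalg.norm(w)          # h_{j+1,j} = ||w_j||_2 (line 5)
        if H[j + 1, j] < 1e-14:                  # breakdown test (tolerance is an assumption)
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]            # v_{j+1} = w_j / h_{j+1,j} (line 7)
    return V[:, :m], H[:m, :m]                   # V_m and H_m (last row of H_{m+1} dropped)

def fom(A, b, x0, m):
    """Full Orthogonalization Method: solve H_m y_m = beta e_1, then x_m = x_0 + V_m y_m."""
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)                    # step 1
    Vm, Hm = arnoldi(A, r0, m)                   # step 2
    rhs = np.zeros(Hm.shape[0]); rhs[0] = beta   # beta e_1
    ym = np.linalg.solve(Hm, rhs)                # step 3: y_m = beta H_m^{-1} e_1
    return x0 + Vm @ ym                          # step 4
```

A small dense solve of H_m y_m = β e_1 replaces the full n×n system, which is the point of projecting onto the Krylov subspace; on an early (happy) breakdown the sketch returns the basis built so far, in which case the projected solution is exact.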

