Lecture 33 (MIT 18.06): Linear Transformations

Linearity: T(c1 v1 + c2 v2) = c1 T(v1) + c2 T(v2).

If we know T(v1), ..., T(vn), and v1, ..., vn is a basis for the input space, then every input vector is one combination of the v's. So knowing T(vi) for the basis vectors v1, ..., vn tells us T(v) for all v.

Next: when is T invertible? Say T(vj) = wj for j = 1, ..., n. Then w1, ..., wn must be a basis for the output space. In matrix form, A [v1 ... vn] = [w1 ... wn], i.e. AV = W. When is A invertible? We know V is invertible (its columns are a basis), so A = W V^(-1); therefore A is invertible exactly when W is invertible. (A NumPy sketch of this step follows these notes.)

For an m x n matrix A of rank r and T(x) = Ax: the outputs land in R^m, but they fill only the column space, whose dimension is r (not necessarily m). dim(nullspace) = n - r, and dim(column space) + dim(nullspace) = r + (n - r) = n = dim(input space).

Every linear transformation T can be described as a matrix.

Example: T(v(x)) = x dv/dx, which is linear because it is linear in v. Input space: polynomials v(x) = a1 + a2 x + a3 x^2, with basis 1, x, x^2 for both input and output, so dim(input space) = 3. Apply T to the basis: T(1) = 0, T(x) = x, T(x^2) = 2x^2. Notice that this is like eigenvectors with eigenvalues 0, 1, 2, so the matrix (using the same basis for inputs and outputs) is diagonal with those entries. dim(column space) = dim(range) = 2, dim(nullspace) = dim(kernel) = 1, and 2 + 1 = 3 = dim(input space). Thus T(v(x)) = x dv/dx = 0 + a2 x + 2 a3 x^2. (See the second sketch below.)

Change of Basis

To find the matrix we need to know T and to choose an input basis and an output basis.

Example: T(v) = v, the identity transformation, with input basis v1, v2 and the same output basis v1, v2 — what is the matrix for T? Rule: apply T to each basis vector v1, v2; we know T(v1), T(v2), and writing them in the output basis gives the identity matrix. Now take the same T (still the identity) with input basis v1, v2 but a different output basis w1, w2. Column 1 comes from T(v1) = v1 = a11 w1 + a21 w2, so column 1 is (a11, a21). (See the third sketch below.)
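A minimal NumPy sketch of the AV = W step above, showing A = W V^(-1) and the invertibility check. The specific vectors v1, v2 and their images w1, w2 are made-up numbers, not from the lecture.

```python
import numpy as np

# Input basis vectors v1, v2 as the columns of V (made-up example values).
V = np.array([[1.0, 1.0],
              [0.0, 2.0]])
# Their images w1 = T(v1), w2 = T(v2) as the columns of W (also made up).
W = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# The matrix of T satisfies A V = W; since V is invertible, A = W V^(-1).
A = W @ np.linalg.inv(V)

print(np.allclose(A @ V, W))                    # True: A sends each v_j to w_j
print(np.linalg.matrix_rank(W) == W.shape[0])   # True here: the w's are a basis, so T is invertible
```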
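A short sketch of the polynomial example, working in coordinates with respect to the basis 1, x, x^2. The coefficient values a1, a2, a3 used below are hypothetical.

```python
import numpy as np

# Coordinates are taken with respect to the basis 1, x, x^2 (input and output).
# T(1) = 0, T(x) = x, T(x^2) = 2x^2: each basis vector maps to a multiple of itself,
# so the matrix of T(v) = x dv/dx is diagonal with entries 0, 1, 2.
T = np.diag([0.0, 1.0, 2.0])

# v(x) = a1 + a2 x + a3 x^2  <->  coordinate vector (a1, a2, a3); hypothetical values.
a = np.array([5.0, 7.0, 4.0])
print(T @ a)                     # [0. 7. 8.] = coordinates of 0 + a2 x + 2 a3 x^2

# Dimension count from the lecture: rank 2, nullspace of dimension 1, and 2 + 1 = 3.
r = np.linalg.matrix_rank(T)
print(r, T.shape[1] - r)         # 2 1
```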
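Finally, a sketch of the change-of-basis example: the matrix of the identity transformation with input basis v1, v2 and output basis w1, w2, where column j holds the coefficients of v_j in the w basis. The particular basis vectors are again made-up numbers.

```python
import numpy as np

# Input basis v1, v2 and a different output basis w1, w2, as columns (made-up values).
V = np.array([[1.0, 1.0],
              [0.0, 2.0]])
W = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# For the identity transformation T(v) = v, column j of its matrix M holds the
# coefficients in  v_j = a_1j * w1 + a_2j * w2,  so W M = V and M = W^(-1) V.
M = np.linalg.solve(W, V)

# Check column 1: v1 should equal a11*w1 + a21*w2.
a11, a21 = M[:, 0]
print(np.allclose(a11 * W[:, 0] + a21 * W[:, 1], V[:, 0]))  # True
```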

