MIT 18 303 - Notes on function spaces, Hermitian operators, and Fourier series

Notes on function spaces, Hermitian operators, and Fourier series

S. G. Johnson, MIT Applied Mathematics

November 21, 2007

1 Introduction

In 18.06, we mainly worry about matrices and column vectors: finite-dimensional linear algebra. But into the syllabus pops an odd topic: Fourier series. What do these have to do with linear algebra? Where do their interesting properties, like orthogonality, come from? In these notes, written to accompany 18.06 lectures in Fall 2007, we discuss these mysteries: Fourier series come from taking concepts like eigenvalues, eigenvectors, and Hermitian matrices and applying them to functions instead of finite column vectors. In this way, we see that important properties like the orthogonality of the Fourier series arise not by accident, but as a special case of a much more general fact, analogous to the fact that Hermitian matrices have orthogonal eigenvectors.

This material is important in at least two other ways. First, it shows you that the things you learn in 18.06 are not limited to matrices; they are tremendously more general than that. Second, in practice most large linear-algebra problems in science and engineering come from differential operators on functions, and in many cases the best way to analyze these problems is to apply the same linear-algebra concepts to the underlying function spaces.

2 Review: Finite-dimensional linear algebra

Most of 18.06 deals with finite-dimensional linear algebra. In particular, let's focus on the portion of the course having to do with square matrices and eigenproblems. There, we have:

• Vectors x: column vectors in R^n (real) or C^n (complex).

• Dot products x · y = x^H y. These have the key properties: x · x = ‖x‖² > 0 for x ≠ 0; x · y is the complex conjugate of y · x; and x · (αy + βz) = α(x · y) + β(x · z).

• n × n matrices A. The key fact is that we can multiply A by a vector to get a new vector, and matrix-vector multiplication is linear: A(αx + βy) = αAx + βAy.

• Transposes A^T and adjoints A^H (the complex conjugate of A^T).
The key property here is that x · (Ay) = (A^H x) · y: the whole reason that adjoints show up is to move matrices from one side to the other in dot products.

• Hermitian matrices A = A^H, for which x · (Ay) = (Ax) · y. Hermitian matrices have three key consequences for their eigenvalues/eigenvectors: the eigenvalues λ are real; the eigenvectors are orthogonal;¹ and the matrix is diagonalizable (in fact, the eigenvectors can be chosen in the form of an orthonormal basis).

Now, we wish to carry over these concepts to functions instead of column vectors, and we will see that we arrive at Fourier series and many more remarkable things.

3 A vector space of functions

First, let us define a new vector space: the space of functions f(x) defined on x ∈ [0, 1], with the boundary conditions f(0) = f(1) = 0. For simplicity, we'll restrict ourselves to real f(x). We've seen similar vector spaces a few times, in class and on problem sets. This is clearly a vector space: if we add two such functions, or multiply one by a constant, we get another such function (with the same boundary conditions).

Of course, this is not the only vector space of functions one might be interested in. One could look at functions on the whole real line, or two-dimensional functions f(x, y), or even vector fields and crazier things. But this simple set of functions on [0, 1] will be plenty for now!

4 A dot product of functions

To do really interesting stuff with this vector space, we will need to define the dot product (inner product) f · g of two functions f(x) and g(x).² The dot product of two vectors is formed by multiplying their components one by one and adding them up (possibly with complex conjugation if they are complex). The corresponding thing for functions is to multiply f̄(x) g(x) for each x and "add them up", that is, integrate:

    f · g = ∫₀¹ f̄(x) g(x) dx.    (1)

For real functions, we can drop the complex conjugation of f(x).
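As a quick sanity check on definition (1), here is a small numerical sketch (mine, not part of the original notes) that approximates the function dot product with a midpoint Riemann sum. The test functions sin(mπx), which satisfy the boundary conditions f(0) = f(1) = 0, come out orthogonal under this dot product:

```python
import numpy as np

def fdot(f, g, n=200_000):
    """Midpoint-rule approximation of the dot product (1),
    f . g = integral of f(x) g(x) dx over [0, 1], for real f, g."""
    x = (np.arange(n) + 0.5) / n      # midpoints of n equal subintervals
    return np.sum(f(x) * g(x)) / n

f = lambda x: np.sin(np.pi * x)       # satisfies f(0) = f(1) = 0
g = lambda x: np.sin(2 * np.pi * x)   # likewise

print(fdot(f, g))   # ~ 0: these two sine functions are orthogonal
print(fdot(f, f))   # ~ 0.5: the integral of sin^2(pi x) over [0, 1] is 1/2
```

The function name `fdot` and the choice of quadrature rule are mine; any reasonable numerical integration would do.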
Equation (1) is easily seen to satisfy the key properties of dot products: f · g = g · f (for our real functions); f · (αg + βh) = α(f · g) + β(f · h); and f · f = ‖f‖² > 0 for f ≠ 0.

Actually, the last property is the trickiest: the quantity

    f · f = ‖f‖² = ∫₀¹ |f(x)|² dx    (2)

certainly seems like it must be positive for f(x) ≠ 0. However, you can come up with annoying functions for which this is not true. For example, consider the function f(x) that is 0 everywhere except at x = 0.5, where f(0.5) = 1. This f(x) is ≠ 0, but its ‖f‖² = 0, because the single point where it is nonzero has zero area. We can eliminate most of these annoyances by restricting ourselves to continuous functions, for example, although allowing a finite number of jump discontinuities is also okay. In general, though, this raises an important point: whenever you are dealing with functions instead of column vectors, it is easy to come up with crazy functions that behave badly (their integral doesn't converge or doesn't even exist, their ‖f‖² is zero, etcetera). Functional analysts love to construct such pathological functions, and defining the precise minimal criteria to exclude them is quite tricky in general, so we won't try here. Let us colloquially follow the Google motto instead: don't be evil; in physical problems, pathological functions are rarely of interest. We will certainly exclude any functions where ‖f‖² is not finite, or is zero for nonzero f(x).

¹ At least, the eigenvectors are orthogonal for distinct eigenvalues. In the case where one has multiple independent eigenvectors of the same eigenvalue λ, i.e. the null space of A − λI is two-or-more-dimensional, we can always orthogonalize them via Gram-Schmidt, as we saw in class. So it is fairer to say that the eigenvectors can always be chosen orthogonal.

² The combination of a vector space and an inner product is called a Hilbert space. (Plus a technical condition called completeness, but that's only to deal with perverse functional analysts.)
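To make the zero-area point concrete, here is a small numerical sketch (mine, not from the notes): in any Riemann-sum approximation of (2), the lone nonzero sample of the spike function is weighted by the subinterval width h = 1/n, so its contribution to ‖f‖² vanishes as the grid is refined.

```python
import numpy as np

def norm_sq(fvals, h):
    # Riemann-sum approximation of (2): ||f||^2 = integral of |f(x)|^2 dx
    return np.sum(np.abs(fvals) ** 2) * h

for n in [10, 100, 1000, 10_000]:
    x = np.linspace(0.0, 1.0, n + 1)
    fvals = np.zeros(n + 1)
    fvals[n // 2] = 1.0                # the "evil" spike: f = 0 except f(0.5) = 1
    print(n, norm_sq(fvals, 1.0 / n))  # shrinks like 1/n toward 0
```

The continuous-function restriction rules this out: a continuous f that is nonzero at some point is nonzero on a whole neighborhood of that point, which contributes positive area to the integral.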
Equation (1) is not the only possible way to define a function dot product, of course. For example,

    f · g = ∫₀¹ f(x) g(x) x dx

is also a perfectly good dot product that follows the same rules (and is important for problems in cylindrical coordinates). However, we will stick with the simple definition (1) here, merely keeping in mind that it was a choice, and that the best choice may be problem-dependent.

5 Linear operators

A square matrix A corresponds to a linear

