Lecture 6: Spectral Theorem for Compact Operators
More on Compact Operators
In the last lecture we defined compact operators and stated that they are exactly the norm limits of finite rank operators. We begin by proving one direction of this theorem (the other direction is left to the homework).
Proof. Assume $T \in B(H)$ is compact, let $B$ be the closed unit ball of $H$, and let $K = \overline{T(B)}$, which is compact since $T$ is a compact operator. Let $n$ be a positive integer. The set of balls $\{B(y, 1/n) : y \in K\}$ is an open cover of $K$, so it must have a finite subcover; call this $B(y_1, 1/n), \ldots, B(y_m, 1/n)$. Let $P_n$ be the orthogonal projection onto $\mathrm{span}\{y_1, \ldots, y_m\}$. Note that $P_n T$ is a finite rank operator since $\{y_1, \ldots, y_m\}$ is finite.

Consider a point $x \in B$. Choose $y_i$ such that $\|Tx - y_i\| < 1/n$. Since $P_n Tx$ is the closest point to $Tx$ in $\mathrm{span}\{y_1, \ldots, y_m\}$, we must have $\|Tx - P_n Tx\| \le \|Tx - y_i\| < 1/n$, whence $\|T - P_n T\| \le 1/n$, so that $P_n T \to T$ in norm as $n \to \infty$.
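The following is a minimal numerical sketch (not part of the lecture) of the mechanism in this proof: for a matrix $T$ with rapidly decaying singular values, standing in for a compact operator, projecting onto the span of finitely many image points $y_i = Tx_i$ already approximates $T$ well in operator norm. The matrix, its dimensions, and the sampling scheme are illustrative choices.

```python
# Illustrative sketch: projections onto spans of sampled image points y_i = T x_i
# approximate a "nearly compact" matrix T in operator norm, as in the proof above.
import numpy as np

rng = np.random.default_rng(0)
d = 200
# Build T with singular values 1/k^2, so its image is "nearly finite dimensional".
U, _ = np.linalg.qr(rng.standard_normal((d, d)))
V, _ = np.linalg.qr(rng.standard_normal((d, d)))
s = 1.0 / np.arange(1, d + 1) ** 2
T = U @ np.diag(s) @ V.T

for m in [5, 10, 20, 40]:
    X = rng.standard_normal((d, m))      # m arbitrary directions; only span(T X) matters
    Y = T @ X                            # their images y_i = T x_i
    Q, _ = np.linalg.qr(Y)               # orthonormal basis of span{y_1, ..., y_m}
    P = Q @ Q.T                          # orthogonal projection onto that span
    err = np.linalg.norm(T - P @ T, 2)   # operator (spectral) norm of T - PT
    print(f"m = {m:3d}   ||T - PT|| = {err:.2e}")
```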
Two interesting families of compact operators are the following.
Integral Kernel Operators. Suppose $k \in L^2([0,1]^2)$. A Cauchy-Schwarz exercise shows that $\|K\| \le \|k\|_{L^2([0,1]^2)}$, where $K$ is the integral operator $(Kf)(x) = \int_0^1 k(x,y) f(y)\,dy$ on $L^2[0,1]$. Let $\{e_n\}_{n \ge 1}$ be an ONB of $L^2[0,1]$. It is easy to check that the family of bivariate functions $e_m(x)\overline{e_n(y)}$ is an ONB of $L^2([0,1]^2)$, so we may expand $k(x,y) = \sum_{m,n} c_{mn}\, e_m(x)\overline{e_n(y)}$. Let $K_N$ be the (finite rank) integral operator with kernel $k_N(x,y) = \sum_{m,n \le N} c_{mn}\, e_m(x)\overline{e_n(y)}$. As the difference $K - K_N$ is an integral kernel operator of the same type, we have $\|K - K_N\| \le \|k - k_N\|_{L^2} \to 0$ as $N \to \infty$, showing that $K$ is compact by the previous theorem.
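Here is a small numerical sketch of this truncation argument (illustrative only; the kernel $k(x,y) = \min(x,y)$, the grid, and the discrete cosine basis are choices made for the example, not taken from the notes): we discretize the integral operator, expand the kernel in a product ONB, and watch the finite rank truncations converge in operator norm.

```python
# Illustrative sketch: finite rank truncations of a discretized kernel operator
# converge to it in operator norm, mirroring ||K - K_N|| <= ||k - k_N||_{L^2}.
import numpy as np

n = 400                                   # grid size; k(x,y) = min(x,y) is an arbitrary choice
x = (np.arange(n) + 0.5) / n
K = np.minimum.outer(x, x) / n            # midpoint-rule discretization of the integral operator

# Orthonormal discrete cosine basis of R^n (stand-in for an ONB of L^2[0,1]).
j = np.arange(n)
E = np.sqrt(2.0 / n) * np.cos(np.pi * np.outer(j + 0.5, np.arange(n)) / n)
E[:, 0] = np.sqrt(1.0 / n)

C = E.T @ K @ E                           # coefficients c_{mn} of the kernel in the product basis
for N in [2, 5, 10, 20]:
    C_N = np.zeros_like(C)
    C_N[:N, :N] = C[:N, :N]               # keep only the leading N x N block of coefficients
    K_N = E @ C_N @ E.T                   # finite rank (rank <= N) approximation
    print(f"N = {N:3d}   ||K - K_N|| = {np.linalg.norm(K - K_N, 2):.2e}")
```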
Diagonal Operators. If $(a_n)_{n \ge 1}$ is any sequence with $a_n \to 0$, then the diagonal multiplication operator $T$ on $\ell^2(\mathbb{N})$ defined by $Te_n = a_n e_n$ is compact, since it is approximated in norm by the finite rank truncations $T_N$, where $T_N e_n = a_n e_n$ for $n \le N$ and $T_N e_n = 0$ for $n > N$.
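To spell out the norm computation behind this approximation (a routine verification, not written out in the notes): for any $x = \sum_n x_n e_n \in \ell^2$,
$$\|(T - T_N)x\|^2 = \sum_{n > N} |a_n|^2 |x_n|^2 \le \Big(\sup_{n > N} |a_n|\Big)^2 \|x\|^2,$$
while $\|(T - T_N)e_m\| = |a_m|$ for each $m > N$, so $\|T - T_N\| = \sup_{n > N} |a_n| \to 0$ as $N \to \infty$.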
Note that multiplication operators $M_f : g \mapsto fg$ on $L^2(\mathbb{R})$, for $f \neq 0$, are not compact, as seen by considering $f$ to be the indicator of any set of positive Lebesgue measure, since then the image of the unit ball contains infinitely many vectors pairwise separated by a constant distance.
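One way to make the last claim concrete (a standard construction, spelled out here for completeness): if $f = \mathbf{1}_E$ with $|E| > 0$, split $E$ into countably many disjoint subsets $E_1, E_2, \ldots$ of positive measure and set $g_n = \mathbf{1}_{E_n} / \|\mathbf{1}_{E_n}\|$. The $g_n$ are orthonormal unit vectors with $M_f g_n = g_n$, so for $n \neq m$
$$\|M_f g_n - M_f g_m\| = \|g_n - g_m\| = \sqrt{2},$$
and no subsequence of $(M_f g_n)$ can converge.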
Finally, we mention that the set of compact operators on $H$ is a 2-sided ideal in $B(H)$, which means that if $K$ is compact then $AK$ and $KA$ are compact for any $A \in B(H)$. These properties are easy to verify from the characterization of compact operators as norm limits of finite rank operators.
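To record the one-line verification (not written out in the notes): if $F_n$ are finite rank with $\|K - F_n\| \to 0$, then $AF_n$ and $F_n A$ are finite rank and
$$\|AK - AF_n\| \le \|A\|\,\|K - F_n\| \to 0, \qquad \|KA - F_n A\| \le \|K - F_n\|\,\|A\| \to 0,$$
so $AK$ and $KA$ are again norm limits of finite rank operators, hence compact.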
The Spectral Theorem for Compact Operators
We will now show that the diagonal operators above are in a sense the only examples of compact self-adjoint operators, up to isomorphism.
Definition. If $Tv = \lambda v$ for some nonzero $v \in H$, then $v$ is called an eigenvector of $T$ and $\lambda$ is called an eigenvalue.
Theorem. Suppose $T \in B(H)$ is compact and self-adjoint. Then:
(1) The eigenvalues of $T$ are real and may be ordered $|\lambda_1| \ge |\lambda_2| \ge \cdots$, with $\lambda_n \to 0$ if there are infinitely many of them.
(2) If $\lambda \neq 0$ is an eigenvalue of $T$, the eigenspace $\ker(T - \lambda I)$ is finite dimensional.
(3) There is an orthonormal basis of $H$ consisting of eigenvectors of $T$.
The last property implies that we have the expansion $T = \sum_n \lambda_n v_n v_n^*$, where $v_n^*$ denotes the linear functional dual to $v_n$. Alternately, this can be expressed as $T = UDU^*$, where $U$ is unitary and $D$ is a diagonal multiplication operator.
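As a minimal finite-dimensional sketch of this statement (illustrative only; the random symmetric matrix and NumPy routine are choices made here, not part of the lecture): for a real symmetric matrix, `numpy.linalg.eigh` produces exactly the decomposition $T = UDU^*$, and $T$ equals the sum of the rank-one terms $\lambda_n v_n v_n^*$.

```python
# Illustrative sketch: spectral decomposition of a symmetric matrix, the
# finite-dimensional analogue of the expansion T = sum_n lambda_n v_n v_n^*.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
T = (A + A.T) / 2                       # symmetric, hence self-adjoint (and trivially compact)

lam, U = np.linalg.eigh(T)              # columns of U are orthonormal eigenvectors
D = np.diag(lam)

print(np.allclose(T, U @ D @ U.T))      # T = U D U^*
print(np.allclose(T, sum(lam_n * np.outer(v, v) for lam_n, v in zip(lam, U.T))))  # rank-one sum
```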
Proof of Theorem. We begin by observing that for any eigenvectors $Tu = \lambda u$ and $Tv = \mu v$, self-adjointness gives
$$\lambda \langle u, v \rangle = \langle Tu, v \rangle = \langle u, Tv \rangle = \overline{\mu}\, \langle u, v \rangle.$$
Plugging in $u = v$ shows that $\lambda = \overline{\lambda}$, so $\lambda$ must be real. For $\lambda \neq \mu$ we then must have $\langle u, v \rangle = 0$, so eigenvectors from distinct eigenvalues must be orthogonal.
Lemma 1. For every $\epsilon > 0$, the subspace $\bigoplus_{|\lambda| \ge \epsilon} \ker(T - \lambda I)$ is finite dimensional.

Proof of Lemma. We first show that for every nonzero eigenvalue $\lambda$, $\ker(T - \lambda I)$ is finite dimensional. Assume not, i.e., there is an infinite sequence of orthonormal vectors $v_1, v_2, \ldots$ such that $Tv_i = \lambda v_i$. Then $v_i \in B$ and we have $\|Tv_i - Tv_j\| = |\lambda|\,\|v_i - v_j\| = |\lambda|\sqrt{2}$ whenever $i \neq j$. But compactness implies that every sequence in $T(B)$ must have a convergent subsequence, so this is impossible.

The orthogonality of distinct eigenspaces (which are closed since they are kernels) implies that $\bigoplus_{|\lambda| \ge \epsilon} \ker(T - \lambda I)$ is an orthogonal direct sum. Assume for contradiction that there are infinitely many direct summands, and choose one unit eigenvector $v_i$, with eigenvalue $\lambda_i$, from each eigenspace. By the same argument as above, we have $\|Tv_i - Tv_j\| = \|\lambda_i v_i - \lambda_j v_j\| \ge \epsilon$ whenever $i \neq j$, since the hypotenuse of a right triangle is longer than its shorter side. This is impossible by compactness.
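In symbols, the hypotenuse step is the following one-line computation (included for completeness): since $v_i \perp v_j$ and $|\lambda_i|, |\lambda_j| \ge \epsilon$,
$$\|Tv_i - Tv_j\|^2 = \|\lambda_i v_i - \lambda_j v_j\|^2 = |\lambda_i|^2 + |\lambda_j|^2 \ge 2\epsilon^2,$$
so the images $Tv_i$ are pairwise at distance at least $\epsilon\sqrt{2}$ and can have no convergent subsequence.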
Lemma 1 implies properties (1) and (2), and in particular that there are at most countably many eigenvalues. We now show that there are enough eigenvectors to form an orthonormal basis; the key is to show that we can always find one eigenvector, and the rest will follow by induction.
Lemma 2. Either $\|T\|$ or $-\|T\|$ is an eigenvalue of $T$.
Proof of Lemma. Let $M = \|T\|$. If $M = 0$ then we are done, since $T = 0$ and any unit vector in the kernel will do. Otherwise assume $M > 0$ and let $x_n$ be a sequence of unit vectors such that $\|Tx_n\| \to M$. By compactness of $T$ we may pass to a subsequence such that $Tx_n$ converges to some vector $y$.
Let $A = M^2 - T^2$, and observe that $\langle Ax, x \rangle \ge 0$ for every $x$, since $\|Tx\| \le M\|x\|$ for every $x$. Observe that
$$\langle Ax_n, x_n \rangle = M^2 - \|Tx_n\|^2 \to 0.$$
Since $A$ is positive it has a square root $A^{1/2}$, so we have
$$\|A^{1/2}x_n\|^2 = \langle Ax_n, x_n \rangle \to 0.$$
Since $A^{1/2}$ is a bounded operator and $Ax_n = A^{1/2}(A^{1/2}x_n)$, this implies $\|Ax_n\| \to 0$, i.e., $M^2 x_n - T^2 x_n \to 0$, which by continuity of $T$ (recall $Tx_n \to y$, so $T^2 x_n \to Ty$) implies $x_n \to x := Ty/M^2$, and hence $Tx = y$ with $\|x\| = 1$. Thus we have
$$(T^2 - M^2)x = \lim_n (T^2 - M^2)x_n = 0, \qquad \text{i.e.,} \qquad (T - M)(T + M)x = 0.$$
If $(T + M)x = 0$ then $-M$ is an eigenvalue; otherwise $M$ is an eigenvalue with eigenvector $(T + M)x$.
To finish the proof of the theorem, let $n_\lambda$ denote the dimension of $\ker(T - \lambda I)$ and let $v_1, v_2, \ldots$ denote a sequence of orthonormal eigenvectors for all of the (countably many, by Lemma 1) nonzero eigenvalues, with each eigenvalue $\lambda$ appearing $n_\lambda$ times. Set $V = \mathrm{span}\{v_1, v_2, \ldots\}$ (meaning the set of finite linear combinations). Observe that $TV \subseteq V$ by construction, and since $T$ is continuous this implies $T\overline{V} \subseteq \overline{V}$. On the other hand, if $w \in V^\perp$ and $v \in V$ we have $\langle Tw, v \rangle = \langle w, Tv \rangle = 0$ since $Tv \in V$. Thus, both $\overline{V}$ and $V^\perp$ are closed invariant subspaces of $T$.
Observe that if $W := V^\perp$, the restricted operator $T|_W$ is also compact and self-adjoint. Since it has no eigenvectors with nonzero eigenvalue (by construction, all such eigenvectors of $T$ lie in $V$), Lemma 2 implies that it must satisfy $\|T|_W\| = 0$, which means $Tw = 0$ whenever $w \in W$. Thus, every vector in $W$ is an eigenvector of $T$ with eigenvalue $0$.
To complete the proof, take $\{w_\alpha\}$ to be any orthonormal basis of $W$. Since $H = \overline{V} \oplus V^\perp$, the union $\{v_i\} \cup \{w_\alpha\}$ is an ONB of $H$ consisting of eigenvectors of $T$, as desired.
Remark. As pointed out by Tarun in class, it is possible to prove Lemma 2 by considering $\sup_{\|x\|=1} \langle Tx, x \rangle$ instead of $\sup_{\|x\|=1} \|Tx\|$ if one assumes $\sup_{\|x\|=1} \langle Tx, x \rangle = \|T\|$, which can be achieved by possibly replacing $T$ with $-T$.
Remark. Yeshwanth pointed out that one can also prove Lemma 2 by showing directly that $\|(M^2 - T^2)x_n\| \to 0$, which can be seen by expanding the left hand side in inner products. This proof has the advantage of not using a square root.
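For completeness, here is one way that expansion can go (a reconstruction, using $T = T^*$ and $\|x_n\| = 1$):
$$\|(M^2 - T^2)x_n\|^2 = M^4 - 2M^2 \langle T^2 x_n, x_n \rangle + \|T^2 x_n\|^2 \le M^4 - 2M^2\|Tx_n\|^2 + M^2\|Tx_n\|^2 = M^2\big(M^2 - \|Tx_n\|^2\big) \to 0,$$
where we used $\langle T^2 x_n, x_n \rangle = \|Tx_n\|^2$ and $\|T^2 x_n\| \le M\|Tx_n\|$.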