Understanding Eigenvalues, Eigenvectors, and the Spectral Theorem
The Fascinating World of Linear Algebra
Linear algebra is a fundamental area of mathematics with vast applications in fields like physics, engineering, computer science, and beyond. One of the core concepts in linear algebra is the idea of eigenvalues and eigenvectors, which comes into play when dealing with linear transformations and matrices. These concepts are deeply connected to the Spectral Theorem, which provides insight into the structure of matrices and the transformations they represent.
What are Eigenvalues and Eigenvectors?
To dive into this topic, let's consider the eigenvalues of an $n \times n$ matrix $A$ with real entries. These eigenvalues are the roots of the characteristic polynomial $p(\lambda) = \det(\lambda I - A)$, which can be complex. Equivalently, they are the values $\lambda$ for which there exists a nonzero vector $x$ such that $Ax = \lambda x$. Such a pair $(x, \lambda)$ is known as an eigenvector, eigenvalue pair.
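As a quick sanity check of this definition, here is a minimal NumPy sketch (the matrix entries are made up purely for illustration) that computes eigenvalue/eigenvector pairs and verifies $Ax = \lambda x$ for each:

```python
import numpy as np

# A small real matrix with made-up entries, for illustration only.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check A x = lambda x for every eigenvector/eigenvalue pair.
for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    assert np.allclose(A @ x, lam * x)
```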
The Spectral Theorem
The Spectral Theorem is a fascinating result that tells us that every real symmetric matrix can be diagonalized by an orthogonal matrix. That is, for a symmetric matrix $A$ we can find an orthogonal matrix $U$ such that $U^T A U$ is a diagonal matrix. This theorem is particularly useful because it allows us to understand the matrix in terms of its eigenvalues and eigenvectors.
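As a hedged illustration (the symmetric matrix below is invented), NumPy's `np.linalg.eigh`, which is designed for symmetric matrices, returns exactly such an orthogonal $U$, and we can check that $U^T A U$ comes out diagonal:

```python
import numpy as np

# A symmetric matrix with illustrative entries.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 2.0],
              [0.0, 2.0, 5.0]])

# eigh is specialized for symmetric (Hermitian) matrices; the
# returned eigenvector matrix U is orthogonal.
eigenvalues, U = np.linalg.eigh(A)

# U^T A U should be diagonal, with the eigenvalues on the diagonal.
assert np.allclose(U.T @ A @ U, np.diag(eigenvalues))
```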
The Problem at Hand
Consider the following problem, which provides a fantastic application of the concepts we've just discussed:
Suppose that the matrix $A \in \mathbb{R}^{n \times n}$ is diagonalizable, that is, $A = T \Lambda T^{-1}$ for an invertible matrix $T \in \mathbb{R}^{n \times n}$, where $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ is diagonal. Using the notation $t^{(i)}$ for the columns of $T$, so that $T = [t^{(1)} \cdots t^{(n)}]$, show that $A t^{(i)} = \lambda_i t^{(i)}$, so that the eigenvalues/eigenvector pairs of $A$ are $(t^{(i)}, \lambda_i)$.
The Proof
The problem asks us to prove that each column $t^{(i)}$ of $T$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda_i$ on the diagonal of $\Lambda$. Here’s how we can prove this assertion:
- We start with the equation $A = T \Lambda T^{-1}$.
- Multiplying both sides on the right by $T$, we get $AT = T\Lambda$.
- Reading this equation column by column: the $i$-th column of $AT$ is $At^{(i)}$, while the $i$-th column of $T\Lambda$ is $\lambda_i t^{(i)}$, since multiplying $T$ on the right by the diagonal matrix $\Lambda$ scales each column of $T$ by the corresponding $\lambda_i$.
- It follows that $At^{(i)} = \lambda_i t^{(i)}$ for each column $t^{(i)}$, proving that $t^{(i)}$ is an eigenvector of $A$ with eigenvalue $\lambda_i$, as the numerical sketch after this list also illustrates.
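The following NumPy sketch (with made-up matrix entries) checks both the matrix identity $AT = T\Lambda$ and its column-wise reading $At^{(i)} = \lambda_i t^{(i)}$:

```python
import numpy as np

# A diagonalizable matrix with illustrative entries (its two
# eigenvalues, 3 and 2, are distinct, so it is diagonalizable).
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Columns of T are the eigenvectors t^{(i)}.
eigenvalues, T = np.linalg.eig(A)
Lam = np.diag(eigenvalues)

# The matrix identity A T = T Lambda ...
assert np.allclose(A @ T, T @ Lam)

# ... read column by column: A t^{(i)} = lambda_i t^{(i)}.
for i, lam in enumerate(eigenvalues):
    assert np.allclose(A @ T[:, i], lam * T[:, i])
```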
Spectral Theorem and Symmetric Matrices
A matrix $U \in \mathbb{R}^{n \times n}$ is orthogonal if $U^T U = I$. The spectral theorem, perhaps one of the most important theorems in linear algebra, states that if $A \in \mathbb{R}^{n \times n}$ is symmetric, that is, $A = A^T$, then $A$ is diagonalizable by a real orthogonal matrix. That is, there are a diagonal matrix $\Lambda \in \mathbb{R}^{n \times n}$ and an orthogonal matrix $U \in \mathbb{R}^{n \times n}$ such that $A = U \Lambda U^T$, or, equivalently, $U^T A U = \Lambda$.
Let $\lambda_i(A)$ denote the $i$-th eigenvalue of $A$.
Problem Statement
(b) Let $A \in \mathbb{R}^{n \times n}$ be symmetric. Show that if $U = [u^{(1)} \cdots u^{(n)}]$ is orthogonal, where $u^{(i)} \in \mathbb{R}^n$ and $A = U \Lambda U^T$, then $u^{(i)}$ is an eigenvector of $A$ and $A u^{(i)} = \lambda_i u^{(i)}$, where $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.
Answer
If $U$ is orthogonal, then $U^T U = I$, so multiplying $A = U \Lambda U^T$ on the right by $U$ gives

$$AU = U \Lambda U^T U = U \Lambda.$$

Reading this column by column, the $i$-th column of $AU$ is $Au^{(i)}$, and the $i$-th column of $U\Lambda$ is $\lambda_i u^{(i)}$ (because $\Lambda$ is a diagonal matrix). Therefore $Au^{(i)} = \lambda_i u^{(i)}$, so each $u^{(i)}$ is an eigenvector of $A$ with eigenvalue $\lambda_i$.
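A short NumPy check of this result (matrix entries invented for illustration): `np.linalg.eigh` returns an orthogonal $U$, and each of its columns satisfies $Au^{(i)} = \lambda_i u^{(i)}$.

```python
import numpy as np

# A symmetric matrix with illustrative entries.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])

eigenvalues, U = np.linalg.eigh(A)

# U is orthogonal: U^T U = I.
assert np.allclose(U.T @ U, np.eye(2))

# Each column u^{(i)} is an eigenvector with eigenvalue lambda_i.
for i, lam in enumerate(eigenvalues):
    assert np.allclose(A @ U[:, i], lam * U[:, i])
```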
Positive Semi-Definite Matrices and Eigenvalues
Understanding the nature of positive semi-definite (PSD) matrices is crucial in various domains of mathematics and applied sciences. A symmetric matrix $A$ is PSD if its quadratic form is never negative, that is, $x^T A x \geq 0$ for every vector $x$.
Problem Statement
(c) Show that if $A$ is PSD, then $\lambda_i(A) \geq 0$ for each $i$.
Solution
Given that $A$ is a PSD matrix, we have $x^T A x \geq 0$ for any vector $x \in \mathbb{R}^n$.
Considering the spectral decomposition $A = U \Lambda U^T$, where $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ and $U$ is an orthogonal matrix, we can express the quadratic form using this decomposition:

$$x^T A x = x^T U \Lambda U^T x = y^T \Lambda y = \sum_{i=1}^n \lambda_i y_i^2.$$

Here, we let $y = U^T x$. Since $U$ is orthogonal, $y$ is simply the vector $x$ represented in the basis of eigenvectors of $A$. The expression $\sum_{i=1}^n \lambda_i y_i^2$ sums the products of each eigenvalue $\lambda_i$ with the square of the $i$-th component of $y$.
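To make the change of variables concrete, here is a small numerical check (with invented values) that $x^T A x = \sum_i \lambda_i y_i^2$ where $y = U^T x$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A symmetric matrix and a random test vector (illustrative values).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
x = rng.standard_normal(2)

eigenvalues, U = np.linalg.eigh(A)
y = U.T @ x  # x expressed in the eigenvector basis

# The quadratic form equals the eigenvalue-weighted sum of squares.
assert np.isclose(x @ A @ x, np.sum(eigenvalues * y**2))
```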
For $A$ to be PSD, it must hold that $x^T A x \geq 0$ for every $x$, and hence that $\sum_{i=1}^n \lambda_i y_i^2 \geq 0$ for every $y$ (as $x$ ranges over $\mathbb{R}^n$, so does $y = U^T x$, since $U$ is invertible). In particular, choosing $y = e_i$ (equivalently, $x = u^{(i)}$) collapses the sum to the single term $\lambda_i$, so each eigenvalue must be non-negative.

Therefore, it is proven that $\lambda_i(A) \geq 0$ for each $i$, confirming that all eigenvalues of a PSD matrix are non-negative.
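Finally, a numerical sanity check of part (c), using one standard way to construct a PSD matrix with random illustrative data: any matrix of the form $B^T B$ is PSD, since $x^T B^T B x = \lVert Bx \rVert^2 \geq 0$, and its eigenvalues indeed come out non-negative.

```python
import numpy as np

rng = np.random.default_rng(1)

# B^T B is always PSD: x^T (B^T B) x = ||B x||^2 >= 0.
B = rng.standard_normal((4, 3))
A = B.T @ B

# eigvalsh returns the (real) eigenvalues of a symmetric matrix.
eigenvalues = np.linalg.eigvalsh(A)

# All eigenvalues are non-negative, up to floating-point error.
assert np.all(eigenvalues >= -1e-10)
```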