Linear Algebra Foundation Homework Problems - Eigenvectors, eigenvalues, and the spectral theorem

Understanding Eigenvalues, Eigenvectors, and the Spectral Theorem

The Fascinating World of Linear Algebra

Linear algebra is a fundamental area of mathematics with vast applications in fields like physics, engineering, computer science, and beyond. One of the core concepts in linear algebra is the idea of eigenvalues and eigenvectors, which comes into play when dealing with linear transformations and matrices. These concepts are deeply connected to the Spectral Theorem, which provides insight into the structure of matrices and the transformations they represent.

What are Eigenvalues and Eigenvectors?

To dive into this topic, let's consider the eigenvalues of an $n \times n$ matrix $A$ with real entries. These eigenvalues are the roots of the characteristic polynomial $p_A(\lambda) = \det(\lambda I - A)$, and they may be complex even though $A$ is real. Equivalently, they are the values $\lambda \in \mathbb{C}$ for which there exists a nonzero vector $x \in \mathbb{C}^n$ such that $Ax = \lambda x$. Such a pair $(x, \lambda)$ is known as an eigenvector, eigenvalue pair.
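The two characterizations above can be checked numerically. Below is a small illustrative sketch (the example matrix is my own choice, not from the problem set): the eigenvalues obtained as roots of the characteristic polynomial agree with those from a direct eigendecomposition.

```python
import numpy as np

# Example matrix (arbitrary, chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Roots of the characteristic polynomial p_A(lambda) = det(lambda*I - A).
# np.poly returns that polynomial's coefficients for a square matrix.
char_coeffs = np.poly(A)              # lambda^2 - 4*lambda + 3
roots = np.sort(np.roots(char_coeffs))

# Direct eigendecomposition for comparison.
eigvals = np.sort(np.linalg.eigvals(A).real)

print(roots)    # [1. 3.]
print(eigvals)  # [1. 3.]
```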

The Spectral Theorem

The Spectral Theorem is a fascinating result that tells us that every real symmetric matrix can be diagonalized by an orthogonal matrix. That is, we can find an invertible matrix $T$ (in fact, an orthogonal one) such that $T^{-1}AT$ is a diagonal matrix. This theorem is particularly useful because it allows us to understand the matrix $A$ in terms of its eigenvalues and eigenvectors.

The Problem at Hand

Consider the following problem, which provides a fantastic application of the concepts we've just discussed:

Suppose that the matrix $A \in \mathbb{R}^{n \times n}$ is diagonalizable, that is, $A = T\,\mathrm{diag}(\lambda_1, \ldots, \lambda_n)\,T^{-1}$ for an invertible matrix $T$, where $\mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ is diagonal. Using the notation $t^{(i)}$ for the columns of $T$, show that $At^{(i)} = \lambda_i t^{(i)}$, so that the eigenvalue/eigenvector pairs of $A$ are $(t^{(i)}, \lambda_i)$.

The Proof

The problem asks us to prove that each column of $T$ is an eigenvector of $A$ corresponding to the eigenvalue in the same position on the diagonal of $\mathrm{diag}(\lambda_1, \ldots, \lambda_n)$. Here’s how we can prove this assertion:

  1. We start with the equation $A = T\,\mathrm{diag}(\lambda_1, \ldots, \lambda_n)\,T^{-1}$.
  2. Multiplying both sides on the right by $T$, we get $AT = T\,\mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.
  3. The $i$-th column of the left-hand side is $At^{(i)}$, while the $i$-th column of the right-hand side is $T$ applied to the $i$-th column of the diagonal matrix, namely $\lambda_i t^{(i)}$.
  4. Equating columns, $At^{(i)} = \lambda_i t^{(i)}$ for each $i$, proving that $t^{(i)}$ is an eigenvector of $A$ with eigenvalue $\lambda_i$.
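The steps above can be sketched numerically. This is an illustrative check with a made-up invertible $T$ and eigenvalues, not part of the original problem: we build $A = T\,\mathrm{diag}(\lambda)\,T^{-1}$ and confirm that each column of $T$ is an eigenvector of $A$.

```python
import numpy as np

# Made-up invertible T and eigenvalues, for illustration only.
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # det = 1, so T is invertible
lam = np.array([2.0, 5.0])

# Build A = T diag(lam) T^{-1}.
A = T @ np.diag(lam) @ np.linalg.inv(T)

# Column-wise check: A t_i = lam_i t_i for each column t_i of T.
for i in range(2):
    t_i = T[:, i]
    assert np.allclose(A @ t_i, lam[i] * t_i)
print("each column of T is an eigenvector of A")
```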

Spectral Theorem and Symmetric Matrices

A matrix $U \in \mathbb{R}^{n \times n}$ is orthogonal if $U^T U = I$. The spectral theorem, perhaps one of the most important theorems in linear algebra, states that if $A \in \mathbb{R}^{n \times n}$ is symmetric, that is, $A = A^T$, then $A$ is diagonalizable by a real orthogonal matrix. That is, there are a diagonal matrix $\Lambda \in \mathbb{R}^{n \times n}$ and an orthogonal matrix $U \in \mathbb{R}^{n \times n}$ such that $U^T A U = \Lambda$, or, equivalently, $A = U\Lambda U^T$.

Let $\lambda_i = \lambda_i(A)$ denote the $i$-th eigenvalue of $A$.
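The spectral theorem can be realized numerically with `np.linalg.eigh`, which is specialized for symmetric matrices. This is a sketch with an arbitrary symmetric example matrix: it verifies that the returned $U$ is orthogonal and that $U^T A U$ is diagonal.

```python
import numpy as np

# Arbitrary symmetric example matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)        # A is symmetric

# eigh returns real eigenvalues (ascending) and orthonormal eigenvectors.
lam, U = np.linalg.eigh(A)

# U is orthogonal (U^T U = I) and U^T A U = Lambda = diag(lam).
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(U.T @ A @ U, np.diag(lam))
print("A = U Lambda U^T verified")
```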

Problem Statement

(b) Let $A$ be symmetric. Show that if $U = [u^{(1)} \cdots u^{(n)}]$ is orthogonal, where $u^{(i)} \in \mathbb{R}^n$, and $A = U\Lambda U^T$ with $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$, then $u^{(i)}$ is an eigenvector of $A$ and $Au^{(i)} = \lambda_i u^{(i)}$.

Answer

If $U$ is orthogonal, then $U^{-1} = U^T$, so

$$\begin{align*} AU &= U\Lambda U^T U \\ &= U\Lambda U^{-1} U \\ &= U\Lambda. \end{align*}$$

The $i$-th column of $AU$ is $Au^{(i)}$, and because $\Lambda$ is a diagonal matrix, the $i$-th column of $U\Lambda$ is $\lambda_i u^{(i)}$. Equating columns gives $Au^{(i)} = \lambda_i u^{(i)}$, so each $u^{(i)}$ is an eigenvector of $A$ with eigenvalue $\lambda_i$.
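The column-wise reading of $AU = U\Lambda$ can be checked directly. A minimal sketch, using an arbitrary symmetric example matrix:

```python
import numpy as np

# Arbitrary symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, U = np.linalg.eigh(A)        # orthogonal U, real eigenvalues lam

# AU = U Lambda as matrices, hence A u_i = lam_i u_i column by column.
assert np.allclose(A @ U, U @ np.diag(lam))
for i in range(2):
    assert np.allclose(A @ U[:, i], lam[i] * U[:, i])
print("A u_i = lam_i u_i for each column")
```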

Positive Semi-Definite Matrices and Eigenvalues

Understanding the nature of positive semi-definite (PSD) matrices is crucial in various domains of mathematics and applied sciences. A symmetric matrix $A$ is PSD if its quadratic form is never negative, that is, $x^T A x \geq 0$ for every vector $x$.

Problem Statement

(c) Show that if $A$ is PSD, then $\lambda_i(A) \geq 0$ for each $i$.

Solution

Given that $A$ is a PSD matrix, we have $x^T A x \geq 0$ for any vector $x$.

Considering the spectral decomposition of $A$, where $A = U\Lambda U^T$ and $U$ is an orthogonal matrix, we can express the quadratic form $x^T A x$ using this decomposition:

$$\begin{align*} x^T A x &= x^T U \Lambda U^T x \\ &= (U^T x)^T \Lambda (U^T x) \\ &= y^T \Lambda y \\ &= \sum_{i=1}^{n} \lambda_i y_i^2 \end{align*}$$

Here, we let $y = U^T x$. Since $U$ is orthogonal, $y$ is simply the vector $x$ represented in the basis of eigenvectors of $A$. The expression $y^T \Lambda y$ sums the products of each eigenvalue $\lambda_i$ with the square of the $i$-th component of $y$.

For $A$ to be PSD, it must hold that $x^T A x \geq 0$ for every $x$, and hence $\sum_{i=1}^{n} \lambda_i y_i^2 \geq 0$ for every $y$ (since $x = Uy$ ranges over all of $\mathbb{R}^n$ as $y$ does). In particular, choosing $x = u^{(i)}$ gives $y = e_i$, the $i$-th standard basis vector, and the sum reduces to $\lambda_i$.

Therefore $\lambda_i \geq 0$ for each $i$, confirming that all eigenvalues of a PSD matrix are non-negative.
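Part (c) can be illustrated numerically. A standard way to construct a PSD matrix (my choice here, not from the problem) is $A = B^T B$ for any real $B$; the sketch below confirms that its eigenvalues and quadratic form are non-negative.

```python
import numpy as np

# B^T B is always symmetric PSD; the specific B is arbitrary.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 3))
A = B.T @ B

# All eigenvalues are non-negative (up to floating-point error).
lam = np.linalg.eigvalsh(A)
assert np.all(lam >= -1e-10)

# The quadratic form x^T A x is non-negative for any x.
for _ in range(100):
    x = rng.standard_normal(3)
    assert x @ A @ x >= -1e-10
print("PSD: eigenvalues and quadratic form are non-negative")
```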