Eigenvalues and Eigenvectors

Last updated: December 2024

Disclaimer: These are my personal notes compiled for my own reference and learning. They may contain errors, incomplete information, or personal interpretations. While I strive for accuracy, these notes are not peer-reviewed and should not be considered authoritative sources. Please consult official textbooks, research papers, or other reliable sources for academic or professional purposes.

1. Definition

For a square matrix $A$, a non-zero vector $v$ is an eigenvector if:

$$Av = \lambda v$$

where $\lambda$ is the corresponding eigenvalue.
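
As a quick sanity check of the definition, a minimal NumPy snippet (the matrix and vector are chosen here purely for illustration):

# v = (1, 1) is an eigenvector of A with eigenvalue 2: Av = 2v.
import numpy as np

A = np.array([[3.0, -1.0], [-1.0, 3.0]])
v = np.array([1.0, 1.0])
print(A @ v)   # [2. 2.] = 2 * v, so λ = 2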

2. Characteristic Equation

The eigenvalues are solutions to the characteristic equation:

$$\det(A - \lambda I) = 0$$

This is a polynomial equation in $\lambda$ of degree $n$ (where $A$ is $n \times n$).

3. Finding Eigenvalues

3.1 For $2 \times 2$ Matrix

For $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$:

$$\det(A - \lambda I) = \begin{vmatrix} a-\lambda & b \\ c & d-\lambda \end{vmatrix} = (a-\lambda)(d-\lambda) - bc = 0$$

This gives the quadratic equation:

$$\lambda^2 - (a+d)\lambda + (ad-bc) = 0$$
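
For instance, with $a = d = 4$ and $b = c = -2$ (the matrix used in the code example of Section 12), this becomes

$$\lambda^2 - 8\lambda + 12 = 0 \quad\Rightarrow\quad \lambda = 2 \ \text{or}\ \lambda = 6.$$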

3.2 For Larger Matrices

Expand $\det(A - \lambda I)$ (for example by cofactor expansion) to obtain the characteristic polynomial, then find its roots. For anything beyond small hand examples, use software: numerical libraries do not actually form the characteristic polynomial, since polynomial root-finding is ill-conditioned, and instead compute eigenvalues directly with iterative methods such as the QR algorithm.
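
A minimal sketch of both routes on the small symmetric matrix from Section 12; here `np.poly` returns the coefficients of the characteristic polynomial $\det(\lambda I - A)$:

import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])

# Route 1 (fine for tiny matrices): characteristic polynomial, then roots.
coeffs = np.poly(A)           # [1., -8., 12.], i.e. λ² - 8λ + 12
print(np.roots(coeffs))       # eigenvalues 6 and 2

# Route 2 (what libraries actually do): direct eigenvalue solver.
print(np.linalg.eigvals(A))   # eigenvalues 6 and 2, in solver order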

4. Finding Eigenvectors

For each eigenvalue $\lambda$, solve the system:

$$(A - \lambda I)v = 0$$

The solution space is the eigenspace corresponding to $\lambda$.
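
Numerically, an eigenspace can be computed as the null space of $A - \lambda I$; a sketch using SciPy's `null_space` (matrix and eigenvalue taken from the examples above):

import numpy as np
from scipy.linalg import null_space

A = np.array([[4.0, -2.0], [-2.0, 4.0]])

# Orthonormal basis for the eigenspace of λ = 2, i.e. ker(A - 2I).
E = null_space(A - 2.0 * np.eye(2))
print(E)   # one column, proportional to (1, 1)/√2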

5. Properties

For an $n \times n$ matrix $A$ with eigenvalues $\lambda_1, \ldots, \lambda_n$ (counted with multiplicity):

- $\text{tr}(A) = \sum_i \lambda_i$ and $\det(A) = \prod_i \lambda_i$.
- The eigenvalues of a triangular matrix are its diagonal entries.
- $A$ and $A^T$ have the same eigenvalues.
- If $Av = \lambda v$, then $A^k v = \lambda^k v$; if $A$ is invertible, $A^{-1} v = \lambda^{-1} v$.
- Eigenvectors corresponding to distinct eigenvalues are linearly independent.

6. Diagonalization

A matrix $A$ is diagonalizable if there exists an invertible matrix $P$ such that:

$$P^{-1}AP = D = \text{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$$

where the columns of $P$ are eigenvectors of $A$.
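
One practical payoff of diagonalization: matrix powers reduce to scalar powers, since $A^k = PD^kP^{-1}$. A sketch (same matrix as in Section 12):

import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])
eigenvals, P = np.linalg.eig(A)

# A^k = P D^k P^(-1): only the scalar eigenvalues are raised to the power k.
k = 5
Ak = P @ np.diag(eigenvals**k) @ np.linalg.inv(P)
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))   # True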

7. Conditions for Diagonalization

An $n \times n$ matrix $A$ is diagonalizable if and only if it has $n$ linearly independent eigenvectors, i.e., the geometric multiplicity of every eigenvalue equals its algebraic multiplicity. Having $n$ distinct eigenvalues is sufficient (but not necessary). Every real symmetric matrix is diagonalizable (see Section 8).

8. Spectral Decomposition

For a symmetric matrix $A$:

$$A = Q\Lambda Q^T$$

where $Q$ is orthogonal (its columns are orthonormal eigenvectors of $A$) and $\Lambda$ is diagonal (with the eigenvalues on the diagonal). The eigenvalues of a real symmetric matrix are always real.
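
For symmetric (or Hermitian) matrices, `np.linalg.eigh` exploits the symmetry and returns orthonormal eigenvectors directly; a minimal sketch:

import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])
w, Q = np.linalg.eigh(A)   # eigenvalues in ascending order, Q orthogonal

print(np.allclose(Q @ np.diag(w) @ Q.T, A))   # True: A = QΛQ^T
print(np.allclose(Q.T @ Q, np.eye(2)))        # True: Q^T Q = I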

9. Jordan Canonical Form

Over the complex numbers, every square matrix, diagonalizable or not, can be written as:

$$A = PJP^{-1}$$

where $J$ is a block diagonal matrix made of Jordan blocks; $J$ is diagonal exactly when $A$ is diagonalizable.
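
The Jordan form is numerically unstable to compute, but SymPy can find it exactly for small matrices; a sketch with a defective matrix (chosen here for illustration):

import sympy as sp

# Characteristic polynomial (λ - 2)², but only one independent
# eigenvector, so A is not diagonalizable.
A = sp.Matrix([[3, 1], [-1, 1]])
P, J = A.jordan_form()
print(J)                       # Matrix([[2, 1], [0, 2]]): a single Jordan block
print(A == P * J * P.inv())    # True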

10. Applications

10.1 Principal Component Analysis (PCA)

Eigenvalues of the covariance matrix give the variance explained by each principal component.
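
A sketch of the idea on synthetic data (the shapes and random generator are my own choices for illustration):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))      # 200 samples, 3 features
X = X - X.mean(axis=0)             # center the data

C = np.cov(X, rowvar=False)        # 3×3 covariance matrix
w, V = np.linalg.eigh(C)           # eigenvalues ascending

print(w[::-1] / w.sum())           # fraction of variance per component, largest first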

10.2 Dynamical Systems

For the system $x_{n+1} = Ax_n$, the long-term behavior depends on the eigenvalues of $A$: if every eigenvalue satisfies $|\lambda| < 1$, then $x_n \to 0$; if some $|\lambda| > 1$, generic trajectories grow without bound along the corresponding eigenvector direction.
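
A sketch with a contractive matrix (entries chosen so the spectral radius is 0.6):

import numpy as np

A = np.array([[0.5, 0.2], [0.1, 0.4]])
print(max(abs(np.linalg.eigvals(A))))   # spectral radius 0.6 < 1

x = np.array([1.0, 1.0])
for _ in range(50):
    x = A @ x
print(x)                                # ≈ [0, 0]: the system decays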

10.3 Quantum Mechanics

Observables are represented by Hermitian matrices, and eigenvalues correspond to possible measurement outcomes.
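
A small illustration: the Pauli-$X$ matrix is Hermitian, and its eigenvalues $\pm 1$ are the two possible outcomes of measuring the corresponding spin observable.

import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X, Hermitian
w, V = np.linalg.eigh(X)
print(w)                                        # [-1. 1.]: the measurement outcomes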

11. Power Method

To approximate the dominant eigenvalue (the one largest in magnitude, assuming it is unique) and its eigenvector, iterate:

$$v_{k+1} = \frac{Av_k}{\|Av_k\|}$$

The eigenvalue is approximated by:

$$\lambda \approx \frac{v_k^T A v_k}{v_k^T v_k}$$
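
A minimal sketch of the iteration above (the function name, iteration count, and starting vector are my own choices):

import numpy as np

def power_method(A, num_iters=100, seed=0):
    # Repeatedly apply A and renormalize; converges to the dominant
    # eigenvector when one eigenvalue strictly dominates in magnitude.
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])
    for _ in range(num_iters):
        w = A @ v
        v = w / np.linalg.norm(w)
    lam = (v @ A @ v) / (v @ v)   # Rayleigh quotient
    return lam, v

A = np.array([[4.0, -2.0], [-2.0, 4.0]])
print(power_method(A)[0])         # ≈ 6, the dominant eigenvalue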

12. Code Example

# Python code for eigenvalues and eigenvectors
import numpy as np

# Create a symmetric test matrix (eigenvalues 2 and 6)
A = np.array([[4, -2], [-2, 4]])

# Find eigenvalues and eigenvectors
eigenvals, eigenvecs = np.linalg.eig(A)

print("Eigenvalues:", eigenvals)
print("Eigenvectors:")
print(eigenvecs)

# Verify: Av = λv
for i in range(len(eigenvals)):
    lambda_i = eigenvals[i]
    v_i = eigenvecs[:, i]
    Av = A @ v_i
    lambda_v = lambda_i * v_i
    print(f"Av = {Av}")
    print(f"λv = {lambda_v}")
    print(f"Difference: {np.linalg.norm(Av - lambda_v)}")

# Diagonalization
D = np.diag(eigenvals)
P = eigenvecs
P_inv = np.linalg.inv(P)

# Verify: A = PDP^(-1)
reconstructed = P @ D @ P_inv
print(f"Original A:\n{A}")
print(f"Reconstructed A:\n{reconstructed}")
