Eigenvalues and Eigenvectors
Disclaimer: These are my personal notes compiled for my own reference and learning. They may contain errors, incomplete information, or personal interpretations. While I strive for accuracy, these notes are not peer-reviewed and should not be considered authoritative sources. Please consult official textbooks, research papers, or other reliable sources for academic or professional purposes.
1. Definition
For a square matrix $A$, a non-zero vector $v$ is an eigenvector if:
$$Av = \lambda v$$
where the scalar $\lambda$ is the corresponding eigenvalue.
2. Characteristic Equation
The eigenvalues are the solutions of the characteristic equation:
$$\det(A - \lambda I) = 0$$
This is a polynomial equation in $\lambda$ of degree $n$ (where $A$ is $n \times n$).
3. Finding Eigenvalues
3.1 For $2 \times 2$ Matrix
For $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$:
$$\det(A - \lambda I) = (a - \lambda)(d - \lambda) - bc = 0$$
This gives the quadratic equation:
$$\lambda^2 - (a + d)\lambda + (ad - bc) = 0$$
i.e. $\lambda^2 - \text{tr}(A)\,\lambda + \det(A) = 0$.
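As a sanity check, the quadratic $\lambda^2 - \text{tr}(A)\lambda + \det(A) = 0$ can be solved directly and compared against NumPy. This is a minimal sketch; `eig2x2` is an illustrative helper name, not a library function.

```python
import numpy as np

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the quadratic formula.

    Roots of lambda^2 - (a + d)*lambda + (a*d - b*c) = 0.
    """
    tr = a + d            # trace
    det = a * d - b * c   # determinant
    disc = np.sqrt(complex(tr * tr - 4 * det))  # may be complex
    return (tr + disc) / 2, (tr - disc) / 2

# Compare with np.linalg.eigvals on a sample matrix
lam1, lam2 = eig2x2(4, -2, -2, 4)
print(sorted([lam1.real, lam2.real]))  # [2.0, 6.0]
print(sorted(np.linalg.eigvals(np.array([[4, -2], [-2, 4]])).real))
```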
3.2 For Larger Matrices
Expand $\det(A - \lambda I)$ (e.g., by cofactor expansion) to obtain the characteristic polynomial, then find its roots. For degree $\geq 5$ there is no general closed-form solution, so in practice software computes eigenvalues with iterative numerical methods such as the QR algorithm rather than by solving the polynomial directly.
4. Finding Eigenvectors
For each eigenvalue $\lambda$, solve the homogeneous system:
$$(A - \lambda I)v = 0$$
The solution space (the null space of $A - \lambda I$) is the eigenspace corresponding to $\lambda$.
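One numerically robust way to compute an eigenspace is to take the null space of $A - \lambda I$ from an SVD: right singular vectors with (near-)zero singular values span the null space. A sketch with NumPy, for a known eigenvalue:

```python
import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])
lam = 2.0  # a known eigenvalue of A

# Null space of (A - lam*I) via SVD: right singular vectors whose
# singular values are (near) zero span the null space.
M = A - lam * np.eye(2)
_, s, Vt = np.linalg.svd(M)
eigenspace = Vt[s < 1e-10].T  # columns form a basis of the eigenspace

# Each basis vector satisfies Av = lam * v
for v in eigenspace.T:
    print(np.allclose(A @ v, lam * v))  # True
```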
5. Properties
- Trace: $\text{tr}(A) = \sum_{i=1}^n \lambda_i$
- Determinant: $\det(A) = \prod_{i=1}^n \lambda_i$
- Power: If $Av = \lambda v$, then $A^k v = \lambda^k v$
- Inverse: If $A$ is invertible, then every eigenvalue is non-zero and $A^{-1}v = \frac{1}{\lambda}v$
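All four properties are easy to verify numerically on a small example:

```python
import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])
eigenvals, eigenvecs = np.linalg.eig(A)

# Trace = sum of eigenvalues; determinant = product of eigenvalues
print(np.isclose(np.trace(A), eigenvals.sum()))        # True
print(np.isclose(np.linalg.det(A), eigenvals.prod()))  # True

# Power: A^3 v = lambda^3 v for an eigenpair (lambda, v)
lam, v = eigenvals[0], eigenvecs[:, 0]
print(np.allclose(np.linalg.matrix_power(A, 3) @ v, lam**3 * v))  # True

# Inverse: A^{-1} v = (1/lambda) v
print(np.allclose(np.linalg.inv(A) @ v, v / lam))  # True
```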
6. Diagonalization
A matrix $A$ is diagonalizable if there exists an invertible matrix $P$ and a diagonal matrix $D$ such that:
$$A = PDP^{-1}$$
where the columns of $P$ are eigenvectors of $A$ and the diagonal entries of $D$ are the corresponding eigenvalues.
7. Conditions for Diagonalization
- Necessary and sufficient: $A$ has $n$ linearly independent eigenvectors
- Sufficient: $A$ has $n$ distinct eigenvalues
- For real matrices: diagonalizability over the reals additionally requires all eigenvalues to be real (e.g., a rotation matrix fails this)
- Special case: real symmetric matrices are always diagonalizable (by an orthogonal matrix)
8. Spectral Decomposition
For a symmetric matrix $A$:
$$A = Q \Lambda Q^T$$
where $Q$ is orthogonal (its columns are orthonormal eigenvectors) and $\Lambda$ is diagonal (eigenvalues).
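NumPy provides `np.linalg.eigh` specifically for symmetric/Hermitian matrices; it returns real eigenvalues in ascending order and an orthogonal $Q$:

```python
import numpy as np

A = np.array([[4.0, -2.0], [-2.0, 4.0]])  # symmetric

# eigh: eigen-decomposition specialized for symmetric/Hermitian input
w, Q = np.linalg.eigh(A)
Lam = np.diag(w)

print(np.allclose(Q @ Q.T, np.eye(2)))  # Q is orthogonal: True
print(np.allclose(Q @ Lam @ Q.T, A))    # A = Q Lambda Q^T: True
```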
9. Jordan Canonical Form
When a matrix is not diagonalizable, it can still be written as:
$$A = PJP^{-1}$$
where $J$ is a block-diagonal matrix of Jordan blocks, each block having one eigenvalue on its diagonal and ones on the superdiagonal.
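A quick numerical check that a matrix can fail to be diagonalizable: $\begin{bmatrix} 3 & 1 \\ -1 & 1 \end{bmatrix}$ has characteristic polynomial $(\lambda - 2)^2$, so $\lambda = 2$ has algebraic multiplicity 2, but its eigenspace is only one-dimensional (the Jordan form is the single block $\begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix}$):

```python
import numpy as np

# Characteristic polynomial (lambda - 2)^2: eigenvalue 2, algebraic mult. 2
A = np.array([[3.0, 1.0], [-1.0, 1.0]])

# Geometric multiplicity = dim null(A - 2I) = n - rank(A - 2I).
# rank(A - 2I) = 1, so only a 1-dimensional eigenspace: A is defective.
geometric_mult = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
print(geometric_mult)  # 1
```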
10. Applications
10.1 Principal Component Analysis (PCA)
Eigenvalues of the covariance matrix give the variance explained by each principal component.
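A minimal PCA sketch on synthetic correlated data (the data-generating recipe here is made up for illustration): the eigenvalues of the covariance matrix, normalized by their sum, give the fraction of variance explained by each component.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: second coordinate mostly follows the first
x = rng.normal(size=500)
data = np.column_stack([x, 2 * x + 0.1 * rng.normal(size=500)])

# Eigen-decomposition of the covariance matrix
cov = np.cov(data, rowvar=False)
w, V = np.linalg.eigh(cov)          # eigenvalues in ascending order

explained = w[::-1] / w.sum()       # variance ratio per principal component
print(explained)  # first component dominates
```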
10.2 Dynamical Systems
For the system $x_{n+1} = Ax_n$, the long-term behavior depends on the eigenvalues of $A$.
10.3 Quantum Mechanics
Observables are represented by Hermitian matrices, and eigenvalues correspond to possible measurement outcomes.
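The key fact used here is that Hermitian matrices ($H = H^\dagger$) have real eigenvalues, which is why they can represent physical measurement outcomes. A quick check:

```python
import numpy as np

# A Hermitian matrix: equal to its own conjugate transpose
H = np.array([[2.0, 1j], [-1j, 2.0]])
print(np.allclose(H, H.conj().T))  # True

# Its eigenvalues (the possible measurement outcomes) are real
w = np.linalg.eigvalsh(H)
print(w)  # [1. 3.]
```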
11. Power Method
To find the dominant eigenvalue and eigenvector, start from a vector $v_0$ with a non-zero component along the dominant eigenvector and iterate:
$$v_{k+1} = \frac{A v_k}{\|A v_k\|}$$
The eigenvalue is approximated by the Rayleigh quotient:
$$\lambda \approx \frac{v_k^T A v_k}{v_k^T v_k}$$
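The iteration above can be sketched directly; `power_method` is an illustrative helper name, and the iteration count is an arbitrary choice:

```python
import numpy as np

def power_method(A, num_iters=100, seed=0):
    """Approximate the dominant eigenpair of A by repeated multiplication."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])  # random start vector
    for _ in range(num_iters):
        v = A @ v
        v = v / np.linalg.norm(v)    # normalize to avoid overflow
    lam = v @ A @ v                  # Rayleigh quotient (v has unit norm)
    return lam, v

A = np.array([[4.0, -2.0], [-2.0, 4.0]])  # eigenvalues 2 and 6
lam, v = power_method(A)
print(lam)  # close to the dominant eigenvalue 6
```

Convergence is geometric with rate $|\lambda_2 / \lambda_1|$, here $(2/6)^k$, so 100 iterations are far more than enough.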
12. Code Example
```python
# Python code for eigenvalues and eigenvectors
import numpy as np

# Create a symmetric test matrix
A = np.array([[4, -2], [-2, 4]])

# Find eigenvalues and eigenvectors
eigenvals, eigenvecs = np.linalg.eig(A)
print("Eigenvalues:", eigenvals)
print("Eigenvectors:")
print(eigenvecs)

# Verify: Av = λv for each eigenpair
for i in range(len(eigenvals)):
    lambda_i = eigenvals[i]
    v_i = eigenvecs[:, i]  # eigenvectors are the columns
    Av = A @ v_i
    lambda_v = lambda_i * v_i
    print(f"Av = {Av}")
    print(f"λv = {lambda_v}")
    print(f"Difference: {np.linalg.norm(Av - lambda_v)}")

# Diagonalization: A = P D P^(-1)
D = np.diag(eigenvals)
P = eigenvecs
P_inv = np.linalg.inv(P)
reconstructed = P @ D @ P_inv
print(f"Original A:\n{A}")
print(f"Reconstructed A:\n{reconstructed}")
```