Help
What is SVD?
Singular Value Decomposition factors any matrix A into: A = U Σ V^T
- U: Orthogonal matrix with left singular vectors
- Σ: Diagonal matrix with singular values (σ₁ ≥ σ₂ ≥ ... ≥ 0)
- V^T: Transpose of an orthogonal matrix V whose columns are the right singular vectors
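As a minimal sketch, here is how the decomposition looks with NumPy (the example matrix is just an illustration):

```python
import numpy as np

# Hypothetical example matrix, for illustration only.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# full_matrices=False gives the "economy" SVD: U is 3x2, Vt is 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# s holds the singular values in descending order; reconstruct A to verify.
A_reconstructed = U @ np.diag(s) @ Vt
print(np.allclose(A, A_reconstructed))  # True
```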
Geometric Interpretation
SVD shows that any linear transformation can be written as a sequence of three operations:
- Rotation or reflection (by V^T)
- Scaling along the principal axes (by Σ)
- Another rotation or reflection (by U)
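A small sketch, assuming NumPy and a hypothetical 2×2 matrix, showing that applying V^T, then Σ, then U to a vector reproduces A @ x:

```python
import numpy as np

# Hypothetical 2x2 example: apply A to a vector as rotate -> scale -> rotate.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

x = np.array([1.0, 1.0])
step1 = Vt @ x               # rotate/reflect into the V basis
step2 = np.diag(s) @ step1   # scale along the principal axes
step3 = U @ step2            # rotate/reflect into the output basis

print(np.allclose(step3, A @ x))  # True
```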
Low-Rank Approximation
By keeping only the k largest singular values, you get the best rank-k approximation:
A_k = σ₁u₁v₁^T + σ₂u₂v₂^T + ... + σₖuₖvₖ^T
This is optimal in the sense that, among all rank-k matrices, A_k minimizes ||A - A_k|| in both the spectral and Frobenius norms (Eckart–Young theorem).
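A sketch of the truncation in NumPy, using random data purely for illustration:

```python
import numpy as np

# Rank-k truncation: keep only the k largest singular values.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 40))        # hypothetical data matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By Eckart-Young, the spectral-norm error equals the (k+1)-th singular value.
print(np.linalg.norm(A - A_k, 2), s[k])  # the two values agree
```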
Applications
- PCA: Principal Component Analysis uses SVD
- Image Compression: Keep top singular values
- Recommender Systems: Matrix factorization
- Noise Reduction: Remove small singular values
- Linear Regression: Solve least squares problems
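As an illustration of the last item, a sketch of solving a least squares problem with the SVD-based pseudoinverse (random data, assumed full column rank):

```python
import numpy as np

# Hypothetical least squares problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt.T @ ((U.T @ b) / s)   # x = A^+ b, assuming all singular values are nonzero

# Matches NumPy's least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))  # True
```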
Singular Values vs Eigenvalues
Singular values are always non-negative real numbers and exist for any matrix (even non-square), while eigenvalues are defined only for square matrices and may be complex.
- For a symmetric matrix A: singular values = |eigenvalues of A|
- For any matrix A: singular values = √(eigenvalues of A^T A)
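A quick numerical check of both relationships, using a hypothetical symmetric matrix with a negative eigenvalue:

```python
import numpy as np

# Hypothetical symmetric matrix; its eigenvalues are 3 and -1.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

s = np.linalg.svd(A, compute_uv=False)   # singular values only
eigs = np.linalg.eigvalsh(A)

# Symmetric case: singular values equal |eigenvalues|.
print(np.allclose(np.sort(s), np.sort(np.abs(eigs))))  # True

# General relation: singular values are the square roots of eigenvalues of A^T A.
print(np.allclose(np.sort(s), np.sort(np.sqrt(np.linalg.eigvalsh(A.T @ A)))))  # True
```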