
SVD and Advanced Decompositions

🚀 SVD & Advanced Decompositions: The Power of Factorization

Matrix factorization allows us to decompose complex datasets into simpler, interpretable components. This is the “Gold Standard” for Dimensionality Reduction and uncovering latent features in massive datasets.


🟢 Level 1: Singular Value Decomposition (SVD)

1. The Universal Decomposition

Unlike eigendecomposition, which requires a square matrix, SVD works for any matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$:

$$\mathbf{A} = \mathbf{U} \mathbf{\Sigma} \mathbf{V}^T$$

Where:

  • $\mathbf{U}$ ($m \times m$): Left singular vectors (an orthogonal matrix).
  • $\mathbf{\Sigma}$ ($m \times n$): Diagonal matrix of singular values, which are non-negative and sorted in decreasing order.
  • $\mathbf{V}^T$ ($n \times n$): Right singular vectors (an orthogonal matrix).

2. Truncated SVD for Compression

By keeping only the top $k$ singular values, we obtain the best rank-$k$ approximation of $\mathbf{A}$ in the Frobenius norm (the Eckart–Young theorem). This is the foundation for:

  • Data Compression: Representing a large dataset with fewer numbers.
  • Latent Semantic Analysis (LSA): Finding hidden topics in a set of documents.
  • Image Compression: Keeping only the most important features of an image.
```python
import numpy as np

# A random 5x3 matrix
A = np.random.randn(5, 3)

# Perform SVD (full_matrices=True by default, so U is 5x5 and Vt is 3x3;
# s holds the 3 singular values in decreasing order)
U, s, Vt = np.linalg.svd(A)

print(f"U shape: {U.shape}, s shape: {s.shape}, Vt shape: {Vt.shape}")
```
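To see truncation in action, here is a minimal sketch of a rank-$k$ approximation built from the top $k$ singular triplets (the matrix sizes and $k=2$ are illustrative):

```python
import numpy as np

# A random 8x6 matrix and its (reduced) SVD
A = np.random.randn(8, 6)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-k approximation: keep only the top k singular triplets
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, the Frobenius error of the rank-k
# truncation equals sqrt(sum of the discarded squared singular values)
err = np.linalg.norm(A - A_k, "fro")
print(err, np.sqrt(np.sum(s[k:] ** 2)))  # the two numbers agree
```

Storing `U[:, :k]`, `s[:k]`, and `Vt[:k, :]` needs only $k(m + n + 1)$ numbers instead of $mn$, which is the compression payoff.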

🟡 Level 2: Latent Factor Models

3. Collaborative Filtering ($\mathbf{R} \approx \mathbf{P}\mathbf{Q}^T$)

In recommendation systems (like Netflix), we decompose a user-item rating matrix $\mathbf{R}$ into two lower-rank matrices:

  • $\mathbf{P}$: User features (preferences).
  • $\mathbf{Q}$: Item features (characteristics).

The predicted rating is the dot product of the latent vectors: $\hat{r}_{ui} = \mathbf{p}_u \cdot \mathbf{q}_i$.
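One common way to learn the factors is regularized gradient descent on the observed ratings only. Here is a minimal sketch; the toy matrix, the number of factors `k`, and the hyperparameters `lr` and `reg` are all illustrative choices, not from any particular system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item rating matrix; 0 marks "unobserved"
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0

n_users, n_items = R.shape
k = 2                                         # latent factors (illustrative)
P = 0.1 * rng.standard_normal((n_users, k))   # user features
Q = 0.1 * rng.standard_normal((n_items, k))   # item features

lr, reg = 0.01, 0.02
for _ in range(5000):
    E = mask * (R - P @ Q.T)       # error on observed entries only
    P += lr * (E @ Q - reg * P)    # gradient step on user factors
    Q += lr * (E.T @ P - reg * Q)  # gradient step on item factors

print(np.round(P @ Q.T, 1))        # predicted ratings, including the zeros
```

The predictions at the unobserved (zero) entries are the recommendations: ratings the model infers from the learned latent features.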

4. Non-Negative Matrix Factorization (NMF)

A variation where all elements in the decomposed matrices must be non-negative. This is particularly useful for topic modeling and image processing, where negative values are physically meaningless.
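A minimal sketch of NMF using the classic Lee–Seung multiplicative updates (the matrix sizes, rank, and iteration count here are illustrative, and production code would typically use a library such as scikit-learn instead):

```python
import numpy as np

rng = np.random.default_rng(1)

# Non-negative data matrix (e.g. word counts or pixel intensities)
V = rng.random((6, 5))

k = 2
W = rng.random((6, k))   # basis vectors (e.g. "topics")
H = rng.random((k, 5))   # non-negative coefficients

# Multiplicative updates keep every entry non-negative, because each
# step multiplies the current value by a ratio of non-negative terms
eps = 1e-9  # avoids division by zero
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print(np.linalg.norm(V - W @ H, "fro"))  # reconstruction error
```

Unlike SVD, the factors here contain no cancellation between positive and negative parts, which is why NMF components tend to be directly interpretable (a topic is a non-negative mix of words, an image part a non-negative mix of pixels).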


🔴 Level 3: Tensor Calculus and Spectral Methods

5. Tensor Decompositions (CP and Tucker)

Tensors are multi-dimensional arrays (3D, 4D, etc.). Tensor decompositions generalize SVD to higher dimensions:

  • CP Decomposition: Decomposes a tensor into a sum of rank-1 tensors.
  • Tucker Decomposition: Factors a tensor into a small core tensor and a factor matrix for each mode.
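To make the CP form concrete, the sketch below constructs a rank-2 tensor directly from its factor matrices and verifies that it equals the explicit sum of rank-1 outer products. (This only builds a CP-form tensor; actually fitting CP factors to data typically uses alternating least squares, e.g. via a library like TensorLy.)

```python
import numpy as np

rng = np.random.default_rng(2)

# CP form: a rank-R tensor is a sum of R outer products a_r (x) b_r (x) c_r
R = 2
A = rng.standard_normal((4, R))   # factor matrix for mode 1
B = rng.standard_normal((5, R))   # factor matrix for mode 2
C = rng.standard_normal((3, R))   # factor matrix for mode 3

# Assemble the 4x5x3 tensor from its CP factors in one einsum
T = np.einsum("ir,jr,kr->ijk", A, B, C)

# Equivalent explicit sum of R rank-1 tensors
T_check = sum(np.multiply.outer(np.outer(A[:, r], B[:, r]), C[:, r])
              for r in range(R))
print(np.allclose(T, T_check))   # → True
```

The storage drop mirrors truncated SVD: the factors hold $R(4 + 5 + 3)$ numbers versus $4 \cdot 5 \cdot 3$ for the dense tensor, and the gap widens rapidly with dimension.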

6. Spectral Methods and Graph Laplacians

These methods use the eigenvalues and eigenvectors of the graph Laplacian $\mathbf{L} = \mathbf{D} - \mathbf{A}$, where $\mathbf{D}$ is the degree matrix and $\mathbf{A}$ is the adjacency matrix, to find clusters in a network. This is the core of Spectral Clustering, which can identify complex, non-linear patterns in data.
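A minimal sketch on a toy graph: two triangles joined by a single bridge edge. The eigenvector of the second-smallest Laplacian eigenvalue (the Fiedler vector) splits the graph into its two natural clusters by sign:

```python
import numpy as np

# Adjacency matrix of a 6-node graph: nodes 0-2 form one triangle,
# nodes 3-5 form another, with one weak bridge edge between them
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # unnormalized graph Laplacian

# eigh returns eigenvalues in ascending order; the eigenvector for the
# second-smallest one (the Fiedler vector) encodes the graph cut
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]
labels = (fiedler > 0).astype(int)
print(labels)                # one label for {0,1,2}, the other for {3,4,5}
```

In practice, Spectral Clustering keeps several of the smallest eigenvectors as a low-dimensional embedding and runs k-means on them, which is what lets it separate cluster shapes that k-means alone cannot.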