Linear Algebra
Transformations, eigenvectors, and the geometry of matrices
1. Building Intuition
Forget rows and columns of numbers for a moment. A matrix is a transformation — a machine that takes every point in space and moves it somewhere else. Multiplying a vector by a matrix transforms it: rotating, stretching, shearing, or reflecting.
The key insight from 3Blue1Brown: to understand a matrix, watch what it does to the basis vectors î and ĵ. The columns of the matrix are simply where î and ĵ land after the transformation. Everything else follows from linearity.
Picture a grid under the transformation: it deforms, but gridlines remain parallel and evenly spaced — that's what makes it linear. Track the basis vectors î and ĵ; their landing positions become the columns of the matrix.
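This "columns are where the basis vectors land" idea is easy to check numerically. A minimal sketch in NumPy — the shear matrix here is an illustrative choice, not one from the text:

```python
import numpy as np

# A shear matrix: its columns say where î and ĵ land.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # î -> (1, 0), ĵ -> (1, 1)

i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])

# Multiplying by A sends each basis vector to the corresponding column.
assert np.allclose(A @ i_hat, A[:, 0])
assert np.allclose(A @ j_hat, A[:, 1])

# Any vector v = x*î + y*ĵ lands at x*(first column) + y*(second column).
v = np.array([3.0, 2.0])
assert np.allclose(A @ v, 3 * A[:, 0] + 2 * A[:, 1])
```

"Everything else follows from linearity" is exactly the last assertion: knowing the two columns determines the image of every vector.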
2. The Mathematics
A linear transformation T maps vectors to vectors while preserving addition and scalar multiplication: T(av + bw) = aT(v) + bT(w). Every such transformation can be represented as matrix multiplication: T(v) = Av, where the columns of A are T(î) and T(ĵ).
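A quick sanity check that matrix multiplication really does preserve addition and scalar multiplication — a sketch with an arbitrary random matrix and vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))                  # an arbitrary 2x2 matrix
v, w = rng.standard_normal(2), rng.standard_normal(2)
a, b = 2.0, -3.0

# Linearity: T(a*v + b*w) == a*T(v) + b*T(w)
lhs = A @ (a * v + b * w)
rhs = a * (A @ v) + b * (A @ w)
assert np.allclose(lhs, rhs)
```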
The determinant measures how the transformation scales areas (2D) or volumes (3D). A determinant of 0 means the transformation collapses a dimension — the matrix is singular.
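The area-scaling claim can be verified directly: transform a shape, compute its area, and compare against the determinant. A sketch using the matrix [[2,1],[1,2]] from Section 3 and a unit triangle (my choice of test shape):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
d = np.linalg.det(A)                 # 3.0: every area is scaled by 3

# Check on the unit triangle (0,0), (1,0), (0,1), which has area 1/2.
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
img = tri @ A.T                      # transform each vertex

def area(p):
    # Shoelace formula for a triangle.
    (x0, y0), (x1, y1), (x2, y2) = p
    return abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)) / 2

assert np.isclose(area(img), abs(d) * area(tri))

# A determinant of 0 means a dimension collapses (singular matrix):
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])           # columns are parallel
assert np.isclose(np.linalg.det(S), 0.0)
```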
Eigenvectors are the special directions that stay on their own line during the transformation — they only get scaled (a negative eigenvalue flips them). The scaling factor is the eigenvalue: Av = λv.
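NumPy computes eigenpairs with np.linalg.eig; note that the eigenvectors come back as the *columns* of the returned matrix. A sketch using the matrix [[2,1],[1,2]] that appears later in the text:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)   # eigenvectors are the columns

# Each eigenpair satisfies A v = lambda v:
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigvals))   # the two eigenvalues, 1 and 3
```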
The Singular Value Decomposition (SVD) reveals the fundamental geometry of any matrix: it's always a rotation, then a stretch along coordinate axes, then another rotation.
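The rotate–stretch–rotate structure can be seen directly from np.linalg.svd: both factors U and Vᵀ are orthogonal (pure rotations/reflections), and Σ is diagonal. A sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
U, s, Vt = np.linalg.svd(A)

# U and Vt are orthogonal: Q^T Q = I, so they preserve lengths and angles.
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vt @ Vt.T, np.eye(2))

# Rebuild A as: rotate (Vt), stretch along the axes (diag(s)), rotate (U).
assert np.allclose(U @ np.diag(s) @ Vt, A)

print(s)   # singular values, in descending order
```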
3. Applications
Under the matrix [[2,1],[1,2]], a typical vector changes direction. But the eigenvector [1,1] is scaled by λ=3, and the eigenvector [1,-1] stays unchanged (λ=1). Eigenvectors reveal the "natural axes" of a transformation.
Google PageRank
Eigenvector of the web link matrix — the dominant eigenvector ranks pages by importance.
Principal Component Analysis
Eigenvectors of the covariance matrix identify the directions of maximum variance in data.
Computer Graphics
Every rotation, scaling, and projection in 3D graphics is a matrix multiplication.
Quantum Mechanics
Observables are matrices (operators), measurements are eigenvalues, states are eigenvectors.
4. Worked Examples
Example 1: 2D Rotation Matrix
Rotating every point by angle θ counterclockwise. Where does î = [1,0] land? At [cos θ, sin θ]. Where does ĵ = [0,1] land? At [-sin θ, cos θ]. These become the columns: R(θ) = [[cos θ, -sin θ], [sin θ, cos θ]].
det(R) = cos²θ + sin²θ = 1 — rotation preserves areas. R⁻¹ = Rᵀ = R(-θ) — it's an orthogonal matrix.
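These three properties of the rotation matrix are easy to verify numerically — a sketch, with θ = 0.7 as an arbitrary test angle:

```python
import numpy as np

def R(theta):
    """Counterclockwise rotation by theta radians; columns are where î and ĵ land."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

theta = 0.7
Rt = R(theta)

assert np.isclose(np.linalg.det(Rt), 1.0)          # areas preserved
assert np.allclose(Rt.T @ Rt, np.eye(2))           # orthogonal: R^T R = I
assert np.allclose(np.linalg.inv(Rt), R(-theta))   # R^-1 = R(-theta)
```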
Example 2: Finding Eigenvalues of A = [[2, 1], [1, 2]]
Solve the characteristic equation det(A - λI) = 0: (2 - λ)² - 1 = 0, which factors as (λ - 3)(λ - 1) = 0, so λ₁ = 3 and λ₂ = 1. For λ₁ = 3: (A - 3I)v = 0 gives v₁ = [1, 1]ᵀ. For λ₂ = 1: v₂ = [1, -1]ᵀ.
Geometric meaning: The matrix stretches space by factor 3 along the diagonal [1,1] direction and leaves the anti-diagonal [1,-1] direction unchanged.
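The hand-derived eigenpairs can be checked in one line each — Av should equal λv:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])

assert np.allclose(A @ v1, 3 * v1)   # stretched by 3 along [1, 1]
assert np.allclose(A @ v2, 1 * v2)   # unchanged along [1, -1]
```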
Example 3: SVD and Image Compression
Any matrix A (even non-square) can be decomposed as A = UΣVᵀ:
- Vᵀ — rotate input space (align with "natural axes")
- Σ — stretch along each axis (singular values)
- U — rotate output space
Image compression: An m×n image A can be approximated by keeping only the top k singular values: A ≈ UₖΣₖVₖᵀ. This reduces storage from m×n to k(m+n+1) — for a 1000×1000 image, keeping k=50 cuts storage from 1,000,000 numbers to 100,050, about 90% savings, usually with minimal visual loss.
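The truncation itself is a few lines of NumPy. A sketch using a synthetic low-rank "image" (a smooth pattern plus noise — a stand-in for real image data, which tends to have rapidly decaying singular values):

```python
import numpy as np

# Illustrative stand-in for an image: a low-rank pattern plus noise.
rng = np.random.default_rng(0)
m = n = 200
x = np.linspace(0, 1, m)
A = np.outer(np.sin(4 * x), np.cos(3 * x)) + 0.01 * rng.standard_normal((m, n))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5  # keep only the top-k singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

storage_full = m * n                 # numbers stored for the full matrix
storage_k = k * (m + n + 1)          # numbers stored for U_k, s_k, V_k
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"storage: {storage_k}/{storage_full}, relative error: {rel_err:.3f}")
```

Here k=5 stores 2,005 numbers instead of 40,000 (a 95% reduction) while the rank-5 approximation stays within a few percent of the original in Frobenius norm.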