Daniel Kosbab

Eigenvalues are a coordinate system that fits the object

Linear algebra is a tool for doing operations on things. Eigenvalues are the tool for describing the operation in the coordinate system that fits it best.

That's the whole idea. The rest is mechanism.

The geometric picture

A linear transformation, represented as a matrix, sends each vector to another vector. For most directions, a vector gets both rotated and stretched. For some special directions, the vector only gets stretched. Those special directions are the eigenvectors.

If Ax = λx, then:

  • x is an eigenvector. The direction is preserved.
  • λ is the eigenvalue. The factor by which x is stretched.

If λ = 2, the vector doubles. If λ = -1, it flips. If λ = 0, it collapses to zero. If λ is complex, it comes paired with its conjugate, and the transformation rotates (and scales) the plane spanned by the real and imaginary parts of the eigenvector.
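A quick numerical check of Ax = λx, assuming NumPy (the matrix here is just an illustrative example):

```python
import numpy as np

# A symmetric 2x2 matrix, so its eigenvalues are real.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is an eigenvector x satisfying A @ x == lam * x.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)  # the eigenvalues of [[2,1],[1,2]] are 3 and 1
```

Applying A to an eigenvector does nothing but scale it, which is the whole geometric claim above.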

Why this is a coordinate system

For most "nice" matrices, notably symmetric ones, the eigenvectors form a full basis. Any vector can be written as a sum of eigenvectors.

In that basis, the action of A is trivially simple. It scales each eigenvector component by its eigenvalue. No rotation, no mixing, just scaling.

This is what diagonalization means. You rewrite the matrix in a coordinate system where it is diagonal: a matrix with values only on the diagonal, zeroes everywhere else. A diagonal matrix is as simple as a matrix gets.

The payoff: a hard matrix operation in the original coordinates becomes multiple independent scalar operations in the eigenbasis.
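A sketch of that payoff, assuming NumPy: computing a matrix power by scaling each eigencomponent instead of multiplying matrices repeatedly.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# A is symmetric, so eigh returns an orthonormal eigenbasis V
# and real eigenvalues w, with A == V @ diag(w) @ V.T.
w, V = np.linalg.eigh(A)

# In the eigenbasis, A is diagonal, so A**10 is just w**10 on the diagonal:
# rotate in, scale ten times, rotate back.
A_pow10 = V @ np.diag(w**10) @ V.T

assert np.allclose(A_pow10, np.linalg.matrix_power(A, 10))
```

Ten matrix multiplications collapse into ten scalar exponentiations, one per eigenvalue.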

Where this gets used

  • PCA. The directions of maximum variance in your data are the eigenvectors of the covariance matrix. Their eigenvalues are the variances along those directions. "The first two principal components" means "the two eigenvectors with the largest eigenvalues."
  • Google PageRank. The ranking is the dominant eigenvector of the link matrix. The eigenvalue is 1, because the matrix is stochastic by construction.
  • Quantum mechanics. Observables correspond to Hermitian operators, and the possible measurement outcomes are their eigenvalues. The eigenbasis of the operator is the basis in which that quantity has definite values.
  • Markov chains. The long-run distribution is an eigenvector of the transition matrix with eigenvalue 1.
  • Differential equations. Linear systems dx/dt = Ax have solutions that decompose into the eigenspaces of A. Each eigenvalue tells you the growth or decay rate along that direction.
  • Spectral clustering. Cluster structure in a graph shows up as gaps in the eigenvalue spectrum of the graph Laplacian.

Different problems. Same trick. Rotate to the basis where the operation is just scaling, solve in that basis, rotate back.
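To make one of these concrete, here is a minimal PCA sketch, assuming NumPy (the synthetic data and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# 200 points stretched along one axis, then rotated, so the
# interesting direction is hidden from the original coordinates.
data = rng.normal(size=(200, 2)) * np.array([3.0, 0.5])
theta = 0.6
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
data = data @ rotation.T

# Eigenvectors of the covariance matrix are the principal directions;
# eigenvalues are the variances along them. eigh sorts ascending.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
variances, directions = np.linalg.eigh(cov)

first_pc = directions[:, -1]  # direction of maximum variance
print(variances)
```

The covariance matrix is symmetric, so the eigenbasis is orthonormal: PCA is exactly the rotate-to-the-right-coordinates trick applied to data.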

Why the naming is good

"Eigen" is German for "own" or "characteristic." Eigenvectors are the matrix's own vectors, in the sense that they are the ones the matrix respects. Eigenvalues are the matrix's own scaling factors.

The name is telling you: these are the intrinsic descriptors of the operation, not the vectors you happened to start with.

The shift that makes this click

The thing that changes when you understand eigenvalues is not the math. The math is straightforward once you've seen it.

What changes is the instinct to rewrite problems in the coordinates that fit them, instead of grinding through them in whatever coordinates you started with. That instinct is what linear algebra is actually teaching.

Most matrices are lying to you about their own behavior. They come in the wrong coordinates. Eigenvalues are how you get the truth.

© 2026 Daniel Kosbab
