7.2 Householder Reflections

When Gram–Schmidt begins to wobble under floating-point arithmetic, and even Modified Gram–Schmidt starts showing cracks for ill-conditioned matrices, most numerical linear algebra libraries turn to a remarkably elegant tool: Householder reflections.

Householder reflections are not only more stable than Gram–Schmidt — they are predictably stable. This reliability makes them the foundation of modern QR decomposition implementations in LAPACK, NumPy, MATLAB, and almost every production-grade numerical library.

To understand why, we need to look past the algebra and see the geometry. A Householder reflection does one very powerful thing:

It takes a vector and reflects it across a plane so that it becomes aligned with a basis vector.

This simple geometric act — reflecting across a hyperplane — turns out to be one of the most numerically robust operations available for building orthogonal matrices.


What is a Householder reflection?

Given a vector x, a Householder transformation constructs a matrix H such that:

H x = α e₁

where e₁ is the first standard basis vector and |α| = ‖x‖, so α is ±‖x‖ (the sign is chosen for numerical stability). In other words:

  • Start with any vector.
  • Apply one reflection.
  • It becomes a vector pointing straight along an axis.

That means if we apply this operation to columns of a matrix, we can “zero out” the entries below a pivot without performing fragile subtraction.

The reflection matrix has the form:

H = I − 2 (v vᵀ) / (vᵀ v)

where v is a specially chosen vector constructed from x.

Even though H is an n × n matrix, we never explicitly construct it in that form — doing so would be extremely wasteful. Instead, we store just v and compute H x in a few efficient operations.
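As a minimal sketch of this idea, the helper below (`apply_householder` is a name chosen for illustration) builds v from x and applies H implicitly, using only dot products instead of forming the n × n matrix:

```python
import numpy as np

def apply_householder(x):
    """Reflect x onto a multiple of e1 without forming H explicitly.

    Returns (v, Hx), where H = I - 2 v v^T / (v^T v).
    """
    # Choose the sign of alpha opposite to x[0], so that forming
    # v = x - alpha * e1 adds magnitudes instead of cancelling them.
    alpha = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else -np.linalg.norm(x)
    v = x.copy()
    v[0] -= alpha
    # H x = x - 2 v (v^T x) / (v^T v): two dot products and one update,
    # O(n) work instead of the O(n^2) cost of materializing H.
    Hx = x - 2.0 * v * (v @ x) / (v @ v)
    return v, Hx

x = np.array([3.0, 4.0])
v, Hx = apply_householder(x)
# Here alpha = -5, so Hx = [-5, 0]: aligned with e1, same norm as x.
```

The sign choice for α is the standard trick that keeps v well-conditioned even when x is already nearly parallel to e₁.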


Why reflections are more stable than Gram–Schmidt

Let’s compare the strategies:

  • Gram–Schmidt subtracts projections.
    This leads to catastrophic cancellation.
  • Modified GS reduces this but still depends on repeated subtractions.
  • Householder uses a single reflection operation.
    There is no repeated subtraction of nearly parallel vectors.

The key property:

Householder transformations are orthogonal to machine precision.

An orthogonal matrix preserves norms and angles — and because reflections construct orthogonal matrices directly, their numerical behavior is more predictable than Gram–Schmidt’s brittle manipulations.

The error analysis of Householder QR shows that its orthogonality loss grows slowly and is well-bounded, whereas Gram–Schmidt’s error can explode for near-dependent columns.
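This difference is easy to observe numerically. The sketch below compares the orthogonality loss ‖QᵀQ − I‖ of a naive classical Gram–Schmidt implementation against NumPy's `np.linalg.qr` (Householder-based, via LAPACK) on a synthetic matrix with condition number around 10¹⁰ — the test matrix and the `cgs` helper are assumptions made for this demo:

```python
import numpy as np

def cgs(A):
    """Classical Gram-Schmidt QR (for comparison only; not for production)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        q = A[:, j].copy()
        for i in range(j):
            q -= (Q[:, i] @ A[:, j]) * Q[:, i]  # subtract projections
        Q[:, j] = q / np.linalg.norm(q)
    return Q

rng = np.random.default_rng(0)
# Build a 50x50 matrix with singular values from 1 down to 1e-10.
U, _ = np.linalg.qr(rng.standard_normal((50, 50)))
V, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = U @ np.diag(np.logspace(0, -10, 50)) @ V

Q_cgs = cgs(A)
Q_hh, _ = np.linalg.qr(A)  # Householder QR (LAPACK) under the hood

loss = lambda Q: np.linalg.norm(Q.T @ Q - np.eye(Q.shape[1]))
print(f"CGS orthogonality loss:         {loss(Q_cgs):.2e}")
print(f"Householder orthogonality loss: {loss(Q_hh):.2e}")
```

On a matrix this ill-conditioned, the Gram–Schmidt Q visibly loses orthogonality, while the Householder Q stays orthogonal at roughly machine-precision level.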

This is why:

  • LAPACK uses Householder QR.
  • NumPy’s np.linalg.qr uses Householder QR.
  • MATLAB uses Householder QR.
  • Most high-reliability solvers in scientific computing use Householder QR.

In short: if you care about accuracy, you use Householder reflections.


How a Householder QR works (high level)

Consider a matrix A. To compute QR, we apply a sequence of reflections:

  1. Use a reflection to zero out entries under the first pivot.
  2. Apply a new reflection to the submatrix that starts at row 2, column 2.
  3. Repeat until every entry below the diagonal is zero, leaving an upper-triangular matrix.

Each reflection eliminates an entire column of “below-diagonal clutter” in a single, stable sweep.

The reflections compound into an orthogonal matrix:

Q = H₁ H₂ … Hₖ

Each Hᵢ is both symmetric and orthogonal (it is its own inverse), so Qᵀ = Hₖ … H₁, and the final upper-triangular matrix is:

R = Hₖ … H₁ A = Qᵀ A

This decomposition is then used everywhere: solving least-squares problems, computing eigensystems, fitting regression models, and stabilizing machine-learning algorithms.
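The sweep described above can be sketched in a few lines. This is an illustrative implementation (the function name `householder_qr` is an assumption; production code should use `np.linalg.qr`), applying each reflection implicitly to the trailing submatrix:

```python
import numpy as np

def householder_qr(A):
    """Compute A = Q R via a sequence of Householder reflections."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = R[k:, k]
        # Reflection mapping x to alpha * e1; sign chosen for stability.
        alpha = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else -np.linalg.norm(x)
        v = x.copy()
        v[0] -= alpha
        vnorm2 = v @ v
        if vnorm2 == 0:
            continue  # column is already zero below the pivot
        # Apply H_k to the trailing submatrix of R (zeroing below the pivot),
        # and accumulate Q = H1 H2 ... Hk by right-multiplication.
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:]) / vnorm2
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v) / vnorm2
    return Q, R

A = np.array([[2.0, 1.0], [1.0, 3.0], [0.0, 1.0]])
Q, R = householder_qr(A)
# Q is orthogonal, R is upper triangular, and Q @ R reproduces A.
```

Note that each reflection only touches rows k and below, which is why the below-diagonal zeros created in earlier steps are never disturbed.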


A geometric intuition that sticks

Imagine throwing a ball at a mirror. The reflected direction is predictable and symmetric. Now imagine applying that reflection in a high-dimensional space — not to a single point, but to every vector in the space.

That is what a Householder matrix does: it defines a mirror (a hyperplane) and reflects all vectors across it.

This simple physical analogy captures the essence of stability:

  • No delicate subtractions.
  • No growing round-off chains.
  • A single, well-conditioned geometric operation.

Orthogonality is never lost, because the entire transformation is built from pieces that are exactly orthogonal by construction.


Why engineers should love Householder reflections

If you spend enough time debugging numerical weirdness in production ML/AI systems, you eventually develop strong opinions about which operations “behave” and which cause trouble.

Householder reflections behave.

They deliver reliable orthogonality, stable decompositions, and well-conditioned solutions to large systems. Their implementation is efficient on modern hardware, and they parallelize well — making them ideal for GPUs and high-performance clusters.

In systems that rely heavily on least-squares or regression — recommendation models, denoising pipelines, kernel methods, and many ML models under the hood — Householder QR is the algorithm that keeps things sane.

And this takes us naturally to the next major application: solving least squares problems.


Where we go next

We now know how Householder QR builds a stable orthogonal basis. But QR is not just an elegant decomposition — it is the backbone of one of the most widely used tools in applied math: least squares.

Let’s move on to understand why solving Ax ≈ b requires more than algebra — it requires stability, structure, and the QR viewpoint.

2025-10-03

Shohei Shimoda

This is a write-up of what I have learned and know about the topic.