3.3 Conditioning of Problems vs Stability of Algorithms

When engineers debug numerical issues, the default instinct is to suspect a bug in code, a flaw in the algorithm, or a mistake in implementation. But in numerical computation, the enemy is often more subtle: the problem itself may be inherently sensitive to small changes.

This is the idea of conditioning—a property not of your code, but of the mathematical problem. And confusing conditioning with algorithmic stability is one of the most common and costly misunderstandings in engineering.

To build reliable systems, we need to know when the difficulty lies in the algorithm—and when the difficulty lies in the problem itself.


Conditioning: How Sensitive Is the Problem?

Consider solving a system Ax = b. Suppose you slightly perturb b to b + δb. How much does x change?

If small perturbations in input produce small perturbations in output, the problem is well-conditioned.
If small perturbations produce large changes in output, the problem is ill-conditioned.

This sensitivity is measured by the condition number, written κ(A). For a linear system it is κ(A) = ‖A‖ ‖A⁻¹‖, and intuitively:

κ(A) ≈ the largest factor by which a relative error in the input can be amplified in the output

A condition number near 1 means the problem is well-behaved and predictable. A condition number in the millions, or worse, the billions, means the problem is dangerously fragile.

And crucially: Conditioning is a property of the problem. No amount of clever coding can fix it.
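To see the condition number act as an amplification bound, here is a minimal sketch using NumPy. The 2×2 matrix and the perturbation size are hypothetical values chosen for illustration:

```python
import numpy as np

# A mildly ill-conditioned 2x2 system (illustrative values).
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])

kappa = np.linalg.cond(A)      # condition number in the 2-norm
x = np.linalg.solve(A, b)      # exact solution is [1, 1]

# Perturb b slightly and solve again.
db = np.array([0.0, 1e-6])
x_pert = np.linalg.solve(A, b + db)

rel_in = np.linalg.norm(db) / np.linalg.norm(b)
rel_out = np.linalg.norm(x_pert - x) / np.linalg.norm(x)

# For perturbations of b, the amplification is bounded by kappa:
print(f"kappa(A)      = {kappa:.1e}")
print(f"amplification = {rel_out / rel_in:.1e}")
```

Here a relative input change of about 10⁻⁷ moves the answer by about 1%: the amplification sits close to κ(A), exactly as the bound predicts.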


Ill-Conditioned Problems: When the Universe Is Against You

Ill-conditioning shows up everywhere:

  • Inverting nearly singular matrices
  • Solving least-squares problems with highly correlated features
  • Computing eigenvalues of matrices with close or repeated eigenvalues
  • Deep neural networks with extreme scale differences in weights
  • Optimization problems with flat or extremely steep curvature
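The first bullet is easy to demonstrate. The Hilbert matrix is a textbook example of a nearly singular matrix; this small sketch (NumPy only) shows its condition number exploding with size:

```python
import numpy as np

def hilbert(n):
    # H[i, j] = 1 / (i + j + 1): a classic ill-conditioned matrix.
    i = np.arange(n)
    return 1.0 / (1 + i[:, None] + i[None, :])

for n in (4, 8, 12):
    print(n, f"{np.linalg.cond(hilbert(n)):.1e}")
```

By n = 12 the condition number is of the order of 10¹⁶, roughly the reciprocal of double-precision machine epsilon, so no solver can be expected to return meaningful digits.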

Ill-conditioning does not mean your algorithm is wrong. It means the problem is whispering:

“Even if you solved me perfectly, I would still behave badly.”

Understanding this difference saves enormous time. Instead of rewriting the same solver four times or blaming hardware quirks, you recognize that the instability is not internal—it is intrinsic.


Stability: How Sensitive Is the Algorithm Itself?

If conditioning measures the sensitivity of the problem, stability measures the sensitivity of the algorithm.

A stable algorithm controls and limits error amplification. An unstable algorithm magnifies tiny floating-point errors into catastrophic results.

An algorithm is considered numerically stable if:

It produces the exact solution to a nearby problem.

This definition aligns with backward error analysis: a stable algorithm doesn’t promise perfection—it promises robustness.
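A classic illustration of stable vs unstable algorithms for the same well-conditioned problem is the quadratic formula. For x² + bx + c = 0 with b ≫ c, the textbook formula subtracts two nearly equal numbers and destroys the small root; rewriting it via the product of the roots is stable. The coefficient values below are chosen for illustration:

```python
import math

# Roots of x^2 + b*x + c = 0 with b >> c.
b, c = 1e8, 1.0
disc = math.sqrt(b * b - 4 * c)

naive_small = (-b + disc) / 2   # unstable: catastrophic cancellation
big = (-b - disc) / 2           # stable: no cancellation here
stable_small = c / big          # uses product of roots = c

print(naive_small)   # off by tens of percent
print(stable_small)  # ~ -1e-8, correct to machine precision
```

Both formulas are algebraically identical; only one of them is numerically stable. The problem itself is perfectly well-conditioned.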


Conditioning vs Stability: A Crucial Matrix

Putting the two concepts side by side:

Problem            Algorithm   Outcome
Well-conditioned   Stable      Excellent results
Well-conditioned   Unstable    Bad results (the algorithm is at fault)
Ill-conditioned    Stable      As good as the mathematics allows
Ill-conditioned    Unstable    Disaster

This table reveals the often-overlooked truth: Stability cannot fix ill-conditioning, but instability can ruin well-conditioned problems.

You must analyze both sides of the equation.


Examples: When Conditioning or Stability Matters Most

1. Solving Linear Systems: LU vs QR

LU decomposition is efficient and widely used—but only stable with pivoting. QR decomposition is slower but significantly more stable for many problems.

If the problem is poorly conditioned (e.g., a nearly singular matrix), switching solvers cannot fix the underlying issue.

2. Least Squares Problems

Solving (AᵀA)x = Aᵀb directly is notoriously unstable. Using QR avoids squaring the condition number and dramatically improves reliability.
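The squaring is easy to observe numerically. In this sketch the matrix is a hypothetical 100×3 design matrix whose first two columns are nearly collinear, mimicking highly correlated features:

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.standard_normal(100)
# Tall matrix with two highly correlated columns.
A = np.column_stack([t,
                     t + 1e-6 * rng.standard_normal(100),
                     rng.standard_normal(100)])

kA = np.linalg.cond(A)
kN = np.linalg.cond(A.T @ A)   # the normal equations square kappa
print(f"cond(A)    = {kA:.1e}")
print(f"cond(A^TA) = {kN:.1e}")
```

In the 2-norm, cond(AᵀA) = cond(A)² exactly, so a least-squares problem that QR can handle at κ ≈ 10⁶ becomes a κ ≈ 10¹² problem the moment you form the normal equations.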

3. Deep Learning

Exploding/vanishing gradients are often not the fault of the optimizer but a symptom of:

  • poorly conditioned loss landscapes, or
  • unstable update algorithms.

Normalization, initialization strategies, and optimizer choice all influence stability.
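The effect of initialization scale on signal propagation can be sketched in a few lines. The depth, width, and scale factors below are hypothetical choices; the layers are plain linear maps with no nonlinearity, which is enough to show the exponential growth or decay:

```python
import numpy as np

rng = np.random.default_rng(1)
d, depth = 256, 50

def forward_norm(scale):
    # Push a unit vector through `depth` random linear layers
    # and return the norm of the result.
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(depth):
        W = scale * rng.standard_normal((d, d))
        x = W @ x
    return np.linalg.norm(x)

print(forward_norm(0.5 / np.sqrt(d)))  # vanishes
print(forward_norm(2.0 / np.sqrt(d)))  # explodes
print(forward_norm(1.0 / np.sqrt(d)))  # stays O(1): Xavier-style scaling
```

Fifty layers are enough to shrink or blow up the signal by roughly fifteen orders of magnitude; only the variance-preserving scale keeps the forward pass (and hence the gradients) in a usable range.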


When Problems and Algorithms Align

Great numerical computing happens when:

  • the problem is well-conditioned (or transformed to be so), and
  • the algorithm is stable.

That combination produces results that are not only accurate, but trustworthy.

And this leads to an important engineering principle:

If results look unstable, check the conditioning before touching the algorithm.


Transition: From Stability to Real Implementation

Conditioning tells us when a problem is inherently sensitive. Stability tells us how well an algorithm withstands that sensitivity.

But there is still a gap between the elegant world of mathematical algorithms and the messy world of software systems.

On paper, algorithms operate in exact arithmetic. In practice, they run on hardware, execute in floating point, and accumulate approximation errors step by step.

To understand the difference between the algorithm we intend to run and the algorithm a machine actually runs, we now turn to:

3.4 Exact Algorithms vs Implemented Algorithms

2025-09-15

Shohei Shimoda

Here I have organized and written up what I have learned about this topic.