The Cambrian explosion in quantum error correction

Phantom codes, QLDPC codes, iceberg codes, cat qubits, neural decoders. Five years ago, surface codes were the only game in town. Now there's a genuine competition, and it might get us to fault tolerance faster than any single approach could.

Something unusual is happening in quantum error correction research. After a decade where the surface code dominated every roadmap and every funding proposal, the first months of 2026 have produced a flurry of fundamentally different approaches. Each has distinct advantages. Each is backed by serious teams. And each claims breakthroughs that would have seemed outlandish two years ago.

This isn't incremental progress on a single front. It's a Cambrian explosion of competing ideas. The diversity itself may be the most important part.

The surface code's long reign

To understand why the current moment matters, you need to understand how dominant the surface code has been. Proposed in 1997 and refined over the following decade, the surface code became the default assumption for fault-tolerant quantum computing because of one property: it has a relatively high error threshold. Physical qubits with error rates below roughly 1%, a level modern hardware can achieve, are enough for the code to suppress logical errors as it scales.
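The practical meaning of a threshold can be seen in the standard back-of-envelope scaling for surface codes: below threshold, every increase in code distance suppresses the logical error rate by another multiplicative factor. A minimal sketch, where the prefactor and the exact threshold value are illustrative assumptions rather than measured numbers:

```python
def logical_error_rate(p: float, d: int, p_th: float = 0.01, A: float = 0.1) -> float:
    """Approximate logical error rate for a distance-d surface code.

    Uses the common heuristic p_L ~ A * (p / p_th)^((d + 1) / 2),
    which is only meaningful well below threshold (p < p_th).
    The prefactor A and threshold p_th here are assumed values
    for illustration, not hardware measurements.
    """
    return A * (p / p_th) ** ((d + 1) / 2)

# With physical error rate 0.1% (10x below the ~1% threshold),
# each step up in distance buys roughly another factor of 10:
for d in (3, 5, 7):
    print(f"d={d}: p_L ~ {logical_error_rate(1e-3, d):.0e}")
```

The same formula also shows the flip side of the surface code's appeal: if hardware sits just barely below threshold, the suppression per distance step is weak, and the qubit overhead needed for useful logical error rates grows quickly.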