SUPERPOSITIONED

The Quantum Decade Ahead

A Bottom-Up Analysis of the Quantum Computing Industry

Saneel Sreeni

Q1 2026

A PERPETUAL DRAFT

A Note to the Reader

This report is written for a mixed audience: senior technologists who want precise specifications, and investors, policymakers, and strategic planners who need the big picture without a physics degree. To serve both groups, the report weaves together two layers: the main text provides full technical detail with specific numbers, citations, and analysis, while plain-language explanations are interspersed throughout to translate key concepts into everyday terms. If you already know what a qubit is or how error correction works, you can skim the simpler passages. If not, they are your on-ramp. All data, market figures, and stock prices in this report are accurate to the author's knowledge as of Q1 2026.

Key Concepts: A 60-Second Primer

Before diving in, here are the foundational ideas that appear on almost every page of this report:

Qubit — The basic unit of quantum information. A classical computer bit can be either 0 or 1. A qubit can be 0, 1, or a “superposition” of both simultaneously, which is what gives quantum computers their theoretical power. Think of a coin: a classical bit is a coin lying flat, showing heads or tails. A qubit is the coin mid-spin, in a blend of both states until you look at it.

Physical Qubit vs. Logical Qubit — A physical qubit is the actual hardware device (an ion, a superconducting circuit, a photon). A logical qubit is a collection of physical qubits working together to represent one reliable unit of information, using error correction to compensate for the noisiness of the individual physical qubits.

Imagine each physical qubit as an unreliable employee who sometimes makes typos. A logical qubit is a team of those employees proofreading each other's work. The team's output is far more accurate than any individual's, but you need several people to produce one clean page.

Gate Fidelity (“Nines”) — The accuracy of a quantum operation (a “gate”). 99% fidelity means 1 error per 100 operations. 99.9% means 1 error per 1,000. 99.99% means 1 error per 10,000. Each additional “nine” is an order-of-magnitude improvement. This report tracks fidelity “nines” because they are the single most predictive measure of when quantum computers become useful.

Imagine if every keystroke had a chance of being wrong: too many errors and you can't finish a sentence, but improve accuracy enough and you could write a whole novel with only minor typos. Quantum computers need to reach that "novel-writing" level of reliability, and each additional "nine" of accuracy gets them closer.
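For readers who want the arithmetic behind the analogy, a few lines of Python show how quickly the nines compound. This is a sketch, not a device model: the function name and the 1,000-gate circuit size are illustrative, and it assumes each gate fails independently, which real hardware (with correlated noise) does not strictly obey.

```python
def circuit_success_probability(gate_fidelity: float, n_gates: int) -> float:
    """Chance that every one of n_gates operations succeeds,
    assuming each fails independently with rate 1 - gate_fidelity."""
    return gate_fidelity ** n_gates

# A modest 1,000-gate circuit at each fidelity tier:
for fidelity in (0.99, 0.999, 0.9999):
    p = circuit_success_probability(fidelity, n_gates=1000)
    print(f"{fidelity} fidelity -> {p:.2%} chance of an error-free run")
```

At 99% fidelity, an error-free 1,000-gate run is essentially impossible (about 0.004%); at 99.9% it happens roughly a third of the time; at 99.99%, about nine times in ten. One extra nine turns an unusable machine into a workable one, which is why this report counts nines.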

Coherence Time — How long a qubit retains its quantum information before environmental noise destroys it. Measured in microseconds (μs) for superconducting qubits or seconds for trapped ions. Longer is better: if your computation takes longer than your qubits stay coherent, you lose your answer to noise.

Think of coherence time as how long you can balance a pen on your fingertip before it falls. The longer the balance, the more complex a calculation you can finish before your information "falls over."

Quantum Error Correction (QEC) — The technique of spreading quantum information across many physical qubits so that errors can be detected and fixed without destroying the computation. It is the single most important enabling technology for practical quantum computing.

Just as a scratched disc can still be read using redundant data, quantum error correction uses redundancy to recover information even when individual qubits glitch. The breakthrough was proving this actually works: adding more qubits to the system makes it more reliable, not less.

Fault-Tolerant Quantum Computing — A quantum computer that can correct errors faster than they accumulate, enabling arbitrarily long computations. "Fault-tolerant" does not mean error-free; it means the system can detect and fix errors in real time, so that the computation stays on track indefinitely. This is the critical threshold separating today's noisy, limited machines from the commercially useful quantum computers of the future. No system has achieved full fault tolerance yet.

Think of it like flying a plane that can fix its own engine problems mid-flight. Today's quantum computers are more like gliders—they work for short stretches but cannot sustain themselves. A fault-tolerant quantum computer would be a plane that never needs to land for repairs, no matter how long the journey.

Surface Code — The most widely studied error correction code for quantum computing. It arranges physical qubits in a 2D grid and is relatively simple to implement on superconducting chips. Its downside: it requires roughly 1,000 physical qubits per logical qubit at current noise levels—a huge overhead.
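The "roughly 1,000" figure can be reproduced from the surface code's geometry. A sketch under standard assumptions: a distance-d surface code patch uses d² data qubits plus d² − 1 measurement (ancilla) qubits, and published resource estimates at current noise levels commonly assume distances in the low twenties. The function name and the sample distances are illustrative.

```python
def surface_code_overhead(d: int) -> int:
    """Physical qubits per logical qubit for a distance-d surface code:
    d*d data qubits plus d*d - 1 measurement (ancilla) qubits."""
    return 2 * d * d - 1

for d in (3, 11, 17, 23):
    print(f"distance {d:2d}: {surface_code_overhead(d):4d} "
          f"physical qubits per logical qubit")
```

At distance 23 the overhead is 1,057 physical qubits per logical qubit, which is where the thousand-to-one figure comes from; better physical fidelity would permit smaller distances and smaller teams.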

Quantum Advantage — The point at which a quantum computer solves a specific problem faster, cheaper, or better than any classical computer can. “Quantum supremacy” (an older term) referred to solving any problem faster, even an artificial one. “Quantum advantage” increasingly refers to solving a useful problem faster.

Quantum supremacy is like proving you can run faster than a car on a tightrope—impressive, but not practical. Quantum advantage is proving you can deliver a package faster on a real road, in real traffic. The field is trying to move from the first to the second.

[Image: Close-up of Google's Sycamore quantum processor chip showing the grid of superconducting qubits]
A quantum processor chip. The quantum computing industry reached several inflection points in late 2024 and 2025 — milestones that form the basis of this report. Credit: Google/Wikimedia Commons (CC-BY 3.0)

Introduction: The Year Quantum Got Real

In a clean room in Broomfield, Colorado, ninety-eight barium ions hover in electromagnetic suspension above a chip the size of a thumbnail. Each ion is a qubit—a quantum bit—and together they constitute the computational heart of Quantinuum’s Helios, the commercial system with the highest reported two-qubit gate fidelity[9]: 99.921%. The ions are spaced mere microns apart, held in place by precisely tuned electric fields, shuttled through junctions and logic zones by a choreography of laser pulses that would have seemed impossible five years ago. Using the [[4,2,2]] Iceberg code—a distance-2 code that detects errors through post-selection (discarding faulty results) rather than actively correcting them—Helios entangled ninety-four error-detected logical qubits in a single GHZ state. Through code concatenation it achieved a stronger result as well: forty-eight fully error-corrected logical qubits at a 2:1 physical-to-logical encoding ratio, though still using relatively short-distance codes. As recently as 2023, most experts in the field considered numbers like these unattainable within this decade. Neither result yet constitutes the deep fault-tolerant error correction required at scale.

Quantinuum has built the most accurate commercial quantum computer using individual atoms floating above a tiny chip, controlled by lasers. It can spot errors but not fix them—like spell-check that highlights typos but requires you to throw out the page and start over. Another approach can actually fix errors, but needs far more hardware to do it.

A thousand miles away, in Google’s quantum computing facility in Santa Barbara, a different kind of breakthrough sits inside a dilution refrigerator cooled to fifteen millikelvins—colder than outer space. Google’s Willow chip, a 105-qubit superconducting processor, achieved something in December 2024 that the quantum computing community had been chasing for nearly thirty years: below-threshold error correction, the point at which adding more qubits to an error-correcting system makes it more reliable, not less. The fault-tolerance threshold was theorized in the late 1990s[2], building on Shor's pioneering error-correcting code from 1995 and Steane's independent work in 1996; Willow is the first system demonstrated to operate below it. Each time the Google team scaled the code from a 3×3 grid to 5×5 to 7×7, the logical error rate halved. The resulting logical qubit lasted 2.4 times longer than the best physical qubit on the chip. The paper was published in Nature[1]. It was, in the understated language of the field, a qualitative change.

For the first time, scientists proved that grouping noisy qubits together makes the group more reliable, not less. Before, adding more qubits was like adding cooks to a kitchen and getting worse food. Now the kitchen gets better with more cooks, as long as each cook is skilled enough—the single most important prerequisite for useful quantum computing.
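The scaling behavior behind Willow's result can be sketched in a few lines. Below threshold, growing the code distance d by two divides the logical error rate by a roughly constant suppression factor. The factor of 2.0 and the starting error rate here are illustrative stand-ins consistent with the halving described above, not Google's published measurements; the function name is invented for this sketch.

```python
def projected_logical_error(error_at_d3: float, distance: int,
                            suppression: float = 2.0) -> float:
    """Logical error rate at an odd code distance, assuming each step
    d -> d + 2 divides the error by `suppression`. A suppression
    factor above 1 is precisely the below-threshold regime."""
    steps = (distance - 3) // 2
    return error_at_d3 / suppression ** steps

# Extrapolating from an illustrative 0.3% logical error at distance 3:
for d in (3, 5, 7, 9, 11):
    print(f"d = {d:2d}: logical error ~ {projected_logical_error(3e-3, d):.2e}")
```

The compounding is the point: if the suppression factor holds, every two additional units of distance buy another halving, so modest hardware growth yields exponential reliability gains.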

And in Redmond, Washington, Microsoft unveiled Majorana 1[3] in February 2025—an eight-qubit chip built on a “topoconductor,” a new class of material engineered to host Majorana quasiparticles—exotic quantum states that, if they behave as theorized, would naturally resist the errors that plague other qubit types. If the physics holds, topological qubits could offer built-in error protection at a fraction of the overhead required by other architectures. Microsoft’s CEO called it “quantum’s transistor moment.” Physicists at the APS Global Physics Summit, however, were considerably more skeptical, and the scientific debate remains unresolved (see Section 1.1). It may be the most consequential open question in quantum physics today.

Microsoft is pursuing a radically different strategy: instead of correcting errors after they happen, build qubits that inherently resist errors. Think of it as the difference between editing a book after it's written versus using a typewriter that cannot make typos. If it works, it could leapfrog all other approaches—but the scientific community is deeply divided on whether it's been demonstrated.

These three events—Quantinuum's Helios, Google's Willow, Microsoft's Majorana 1—occurred within an eleven-month window spanning late 2024 through late 2025. Together with a cascade of supporting milestones—Princeton's transmon coherence time exceeding one millisecond[4] (fifteen times the industry standard), IBM’s demonstration of all hardware elements needed for fault-tolerant computing, QuEra’s techniques for reducing error correction overhead[6] by up to 100×, and IonQ's reported 12% speedup over classical HPC in a medical device simulation[5] (a contested, unverified result)—they constitute the most concentrated period of progress in the history of quantum computing.

The Thesis

Between 2024 and 2025, quantum computing crossed two thresholds that had stood for decades—below-threshold error correction and near-2:1 physical-to-logical qubit encoding via error detection—and produced a contested early signal of quantum advantage over classical high-performance computing. These results are individually modest in absolute terms, but they are not incremental: together they demonstrate that the fundamental physics works at small scale. The question is shifting from if to when and who—though significant open questions remain about whether the physics continues to cooperate at commercially relevant scale.

The bottom line: for decades, the question was "Does this actually work in practice, or only on paper?" Now it works at small scale. The remaining questions are whether it keeps working at much larger scale, how long it takes to go from "works in a lab" to "solves real-world problems," and whether regular computers will keep closing the gap.

This report is an attempt to answer those questions with precision. It is not a hype piece, and it is not a dismissal. It is a bottom-up accounting of where quantum computing actually stands—measured in gate fidelity nines, logical qubit counts, error correction overhead ratios, and dollars invested—and where the trendlines point. The analytical method borrows from Leopold Aschenbrenner’s Situational Awareness (June 2024)[7], which decomposed AI progress into measurable axes of improvement to make an extrapolative case about the trajectory of artificial intelligence. The quantum equivalent of “counting the OOMs” (orders of magnitude of effective compute) is counting the nines—tracking each additional nine of gate fidelity (99.9% → 99.99% → 99.999%) because each additional nine exponentially reduces the overhead required for error correction, and therefore exponentially increases the useful computational power of a quantum machine.

An influential AI report tracked one key metric—computing power—and extrapolated forward to predict progress. This report does the same for quantum computing, but the key metric is accuracy. Each improvement in accuracy exponentially reduces the resources needed for useful quantum computation.
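The exponential payoff from counting nines can be made concrete with integer arithmetic. A sketch under textbook assumptions: the logical error rate of a distance-d surface code scales as (p/p_th)^((d+1)/2), with physical error p and a threshold p_th taken here as 1% (10⁻²); the 10⁻¹² target and the function name are illustrative choices for this sketch, not figures from the report.

```python
def required_distance(nines: int, target_nines: int = 12,
                      threshold_nines: int = 2) -> int:
    """Smallest odd surface-code distance d reaching a logical error of
    10**-target_nines, given physical error 10**-nines and an assumed
    threshold of 10**-threshold_nines, via p_L ~ (p/p_th)**((d+1)/2)."""
    margin = nines - threshold_nines      # nines of headroom below threshold
    halves = -(-target_nines // margin)   # ceiling division gives (d+1)/2
    return 2 * halves - 1                 # always odd

for nines in (3, 4, 5):                   # 99.9%, 99.99%, 99.999% fidelity
    d = required_distance(nines)
    print(f"1e-{nines} physical error: distance {d}, "
          f"{2 * d * d - 1} physical qubits per logical qubit")
```

Under these assumptions, going from three nines to four cuts the required distance from 23 to 11 and the overhead from 1,057 to 241 physical qubits per logical qubit; a fifth nine cuts it to 97. Each nine shrinks the hardware bill severalfold, which is the whole argument for tracking this metric.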

Useful Quantum Computing = (Enough Logical Qubits) × (Low Enough Logical Error Rates) × (Fast Enough Clock Speeds) × (Algorithms That Exploit This)

Four things must come together: enough reliable computing units, low enough mistake rates, fast enough operation, and software that can harness the hardware. Missing any one means the computer can't do useful work. Progress on the first three is rapid; software is catching up. This report traces each of the four factors with data: where each stands today, its trendline, the binding constraints, and the extrapolation forward.

The Information Asymmetry

Most mainstream coverage of quantum computing oscillates between two poles: uncritical hype (“quantum will break all encryption tomorrow”) and dismissive skepticism (“quantum is always ten years away”). The reality, visible to those tracking the primary literature, is more nuanced and considerably more interesting. The hype is wrong because: current quantum computers cannot break any encryption in use today; the most optimistic estimates for quantum computers powerful enough to break current encryption (so-called "cryptographically relevant" machines) place them at least fifteen years out; many “quantum advantage” claims have been challenged or matched by improved classical algorithms; and quantum machine learning—the application category with the largest projected market value—remains largely theoretical.

The skepticism is wrong because: the pace of error correction progress in 2024–2025 exceeded what the field's own researchers predicted; the physical-to-logical qubit ratio for error detection (not correction) has reached ~2:1 in trapped-ion systems, while fault-tolerant error correction still requires ratios of hundreds-to-one or more; investment has more than doubled year-over-year (on an annualized run-rate basis); DARPA is spending real money[8] evaluating whether utility-scale quantum computing is achievable by 2033; and eleven companies across five different qubit modalities have advanced to the second stage of that evaluation.

The media says either "quantum will change everything next year" (wrong) or "quantum is vaporware" (also wrong). The truth: quantum computers can't do anything commercially useful today and won't break encryption any time soon. But recent progress has genuinely shocked even the optimists, and the government is investing real money to evaluate the technology. This report sits in the gap between hype and dismissal. It follows the data.

Notes

  1. Acharya, R. et al., 'Quantum error correction below the surface code threshold,' Nature 638, 920–926 (2025). [link]
  2. Shor, P., 'Scheme for reducing decoherence in quantum computer memory,' Physical Review A 52, R2493 (1995). The threshold theorem establishing that error correction improves with scale (below a noise threshold) was proven by Aharonov & Ben-Or (1997), Knill, Laflamme & Zurek (1998), and Kitaev (1997). Steane independently developed quantum error-correcting codes in 1996.
  3. Aghaee, M. et al., 'Interferometric single-shot parity measurement in InAs–Al hybrid devices,' Nature 638, 651–655 (2025). [link]
  4. Princeton University, transmon qubit coherence time exceeding 1 ms via tantalum-on-silicon fabrication (November 2025).
  5. IonQ/Ansys, reported 12% speedup on medical device simulation using 36-qubit Forte Enterprise system (March 2025).
  6. QuEra Computing, algorithmic fault-tolerance techniques for up to 100× error correction overhead reduction (2025).
  7. Aschenbrenner, L., Situational Awareness (June 2024).
  8. DARPA, Quantum Benchmarking Initiative (QBI) Stage A/B announcements (April–November 2025).
  9. Quantinuum, 'Introducing Helios: The Most Accurate Quantum Computer in the World,' quantinuum.com (November 2025).