Chapter 39: Open Problems in Quantum Computing

Quantum computing is a field defined by its open problems. Despite decades of progress, fundamental questions remain unanswered - questions about physics, engineering, complexity theory, and even what quantum computers are ultimately for. This chapter confronts these questions honestly. Understanding what we do not know is just as important as understanding what we do know, and the open problems of today define the research frontiers of tomorrow.

We organize the discussion around five themes: the reality of quantum advantage, the engineering challenge of scaling, the timeline to fault tolerance, connections to fundamental physics, and the human challenge of building a workforce for a technology that does not yet fully exist.

39.1 Is Quantum Advantage Real and Useful?

In 2019, Google's Sycamore processor performed a random circuit sampling task in 200 seconds that would have taken the best classical supercomputer an estimated 10,000 years. This was hailed as quantum supremacy - proof that a quantum computer could outperform any classical computer on some task. Since then, the bar has risen. In October 2025, Google reported that their Quantum Echoes algorithm, running on the Willow processor, completed a verifiable benchmark approximately 13,000 times faster than the world's fastest classical supercomputer.

But there is a crucial distinction between quantum supremacy (outperforming any classical computer on some task, however contrived) and useful quantum advantage (solving a practical problem faster end to end). Random circuit sampling has no known real-world application. The open question is:

Open Problem 1: Useful Quantum Advantage. Can a quantum computer solve a commercially or scientifically important problem significantly faster than the best classical alternative, including all overhead costs of error correction, compilation, and data loading?

As of early 2026, the honest answer is: not yet, but the gap is closing. Several groups have reported narrow advantages on specific problems:

  • IonQ and Ansys demonstrated a medical device simulation where a 36-qubit quantum computer outperformed classical HPC by approximately 12% - one of the first documented cases of practical quantum advantage, albeit on a narrow problem.
  • Q-CTRL achieved quantum advantage in GPS-denied navigation using quantum sensors, outperforming classical alternatives by 50x or more.
  • Multiple groups have shown quantum advantages in chemistry simulation for small molecules, though these have not yet scaled to molecules too large for classical methods.

The skeptic's response: these advantages are either on narrow problems, small instances, or sensing (not computing) tasks. No end-to-end quantum application has yet been demonstrated with a conclusive advantage on a problem of broad real-world consequence. The optimist's response: we are where classical computing was in the 1950s - the transistor existed, but no one had yet imagined the applications that would transform the world.

The Moving Target Problem

A recurring challenge is that classical algorithms keep improving. When a quantum algorithm claims an advantage, classical researchers are motivated to find better classical methods. This happened with quantum recommendation systems (dequantized by Tang), quantum simulation (classical tensor network methods keep advancing), and even random circuit sampling (IBM showed classical simulations were faster than initially claimed). The advantage target is always moving.


39.2 The Great Scaling Challenge

The most important open engineering problem in quantum computing is scaling. Current quantum processors have tens to a few thousand physical qubits. Running useful fault-tolerant algorithms (like breaking RSA-2048 or simulating industrially relevant molecules) requires millions of physical qubits. The gap between where we are and where we need to be spans roughly three orders of magnitude.

[Figure: Qubit Scaling Gap - Current vs Required]

The Error Rate Bottleneck

The fundamental issue is not just qubit count but error rates. Current physical qubits have error rates in the range of $10^{-3}$ to $10^{-4}$ per gate operation. Fault-tolerant computation requires logical error rates of $10^{-10}$ or better. Bridging this gap through quantum error correction requires enormous overhead: roughly 1,000 to 10,000 physical qubits per logical qubit with current codes and error rates.

For the surface code, the standard estimate is that the logical error rate $\epsilon_L$ is suppressed exponentially in the code distance $d$ whenever the physical error rate $p$ is below the code threshold $p_{\text{th}}$:

$$\epsilon_L \approx A \left(\frac{p}{p_{\text{th}}}\right)^{(d+1)/2}, \qquad \text{physical qubits per logical qubit} \approx 2d^2$$

where $A$ is a constant of order one. Reaching $\epsilon_L \approx 10^{-10}$ from $p \approx 10^{-3}$ requires a distance of roughly 20, which is where the overhead of about a thousand physical qubits per logical qubit comes from.

This is why "number of qubits" headlines can be misleading. A 1,000-qubit processor with high error rates may be less useful than a 100-qubit processor with lower error rates. The relevant metric is not raw qubit count but logical qubit count at a target logical error rate.
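As a rough sanity check on these numbers, here is a minimal sketch, assuming the standard surface-code scaling $\epsilon_L \approx (p/p_{\text{th}})^{(d+1)/2}$ with roughly $2d^2$ physical qubits per logical qubit; the threshold value, target rate, and constant factors are illustrative, not tied to any particular device:

```python
def surface_code_overhead(p_phys, p_th=1e-2, target_logical=1e-10):
    """Return (code distance, physical qubits per logical qubit) needed
    to push the logical error rate below target_logical, assuming
    eps_L ~ (p/p_th)^((d+1)/2) and ~2*d^2 physical qubits per logical."""
    if p_phys >= p_th:
        raise ValueError("physical error rate must be below threshold")
    ratio = p_phys / p_th
    d = 3  # surface-code distances are odd
    while ratio ** ((d + 1) / 2) > target_logical:
        d += 2
    return d, 2 * d * d  # data qubits plus measurement ancillas

# Today's hardware (p ~ 1e-3) vs. a 10x improvement (p ~ 1e-4):
d_now, cost_now = surface_code_overhead(1e-3)
d_better, cost_better = surface_code_overhead(1e-4)
```

The second call illustrates why per-qubit error rates matter more than raw counts: a tenfold improvement in physical fidelity cuts the required distance roughly in half and the per-logical-qubit overhead by several times.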

Beyond Qubit Count: The Full Stack Challenge

Scaling is not just about making more qubits. It requires simultaneous advances across the entire stack:

  • Classical control electronics: Each qubit requires precise control signals. Scaling to millions of qubits requires a revolution in cryogenic electronics and signal routing.
  • Interconnects: For modular architectures (linking multiple quantum chips), high-fidelity quantum interconnects are needed. Photonic interconnects and microwave-to-optical transducers are active research areas.
  • Cooling: Superconducting and topological qubits operate at millikelvin temperatures. Current dilution refrigerators cool small areas; million-qubit systems may require fundamentally new cooling approaches.
  • Compilation: Translating high-level quantum algorithms into hardware-native gate sequences that respect connectivity constraints and minimize depth is an NP-hard problem in general. Scalable approximate compilers are needed.
  • Real-time decoding: Quantum error correction requires decoding syndrome data faster than errors accumulate - within microseconds for superconducting qubits. This is a significant classical computing challenge at scale.
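To make the compilation bullet concrete, here is a minimal sketch of its routing subproblem: inserting SWAPs so that every two-qubit gate acts on adjacent wires of a linear-chain device. The function name and gate-tuple format are hypothetical illustrations, not any real compiler's API; production compilers use far more sophisticated heuristics (lookahead, layout search) because greedy routing like this can be badly suboptimal.

```python
def route_linear(gates, n_qubits):
    """gates: list of (a, b) two-qubit gates on logical qubits.
    Returns hardware operations ('SWAP', w1, w2) / ('CX', w1, w2)
    on physical wires 0..n_qubits-1 of a linear chain, such that
    every CX acts on adjacent wires."""
    pos = list(range(n_qubits))  # pos[logical qubit] = physical wire
    out = []
    for a, b in gates:
        # Greedily walk logical qubit a toward logical qubit b.
        while abs(pos[a] - pos[b]) > 1:
            step = 1 if pos[a] < pos[b] else -1
            neighbor_wire = pos[a] + step
            other = pos.index(neighbor_wire)  # logical qubit on that wire
            out.append(('SWAP', pos[a], neighbor_wire))
            pos[a], pos[other] = neighbor_wire, pos[a]
        out.append(('CX', pos[a], pos[b]))
    return out
```

For example, a CX between qubits 0 and 2 on a 3-wire chain forces one SWAP before the gate can be applied; on realistic connectivity graphs and circuits, minimizing such SWAP overhead is the NP-hard problem mentioned above.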

Open Problem 2: The Scaling Wall. Can any qubit technology scale to millions of physical qubits while maintaining or improving per-qubit error rates, within a system that is economically viable to build and operate?

39.3 When Will Fault-Tolerant QC Arrive?

The most common question from outside the field is: when will quantum computers actually work? The answer depends on what "work" means, but the most meaningful milestone is fault-tolerant quantum computing (FTQC) - a quantum computer that can run arbitrarily long computations with arbitrary precision, using quantum error correction to suppress errors below any desired threshold.

Industry Roadmaps

The major quantum computing companies have published roadmaps with specific milestones:

IBM has the most detailed public roadmap. By 2029, IBM aims to deliver Quantum Starling - a large-scale fault-tolerant quantum computer capable of running circuits with 100 million quantum gates on 200 logical qubits. The system is being built at IBM's facility in Poughkeepsie, New York. IBM has also targeted quantum advantage by end of 2026.

Quantinuum (the merged Honeywell Quantum Solutions and Cambridge Quantum Computing) has unveiled an accelerated roadmap targeting fully fault-tolerant, universal quantum computing by approximately 2030. Their "Apollo" system will scale from earlier-generation ion traps to thousands of physical qubits delivering hundreds of logical qubits. In September 2024, Microsoft and Quantinuum demonstrated 12 logical qubits on the H2 system. By late 2025, Quantinuum launched Helios with 98 physical qubits delivering 48 logical qubits at a nearly 2:1 encoding ratio.

QuEra (neutral atoms) marked 2025 as their "year of fault tolerance," demonstrating a 3,000-qubit array operating continuously for over two hours. Their researchers achieved below-threshold error correction performance using up to 96 logical qubits, and they plan to release a 30-logical-qubit machine with magic state distillation. QuEra raised over $230 million in 2025, led by Google Quantum AI and SoftBank, with strategic investment from NVIDIA.

Google demonstrated on their Willow processor that error rates decrease exponentially with code distance (the key threshold result), and is pursuing a roadmap toward a "useful, error-corrected quantum computer" within the decade.

Microsoft, as discussed in Chapter 37, is pursuing the topological approach with Majorana 1, targeting a fault-tolerant prototype through DARPA's US2QC program "in years, not decades."

A Realistic Timeline

Reading between the lines of these roadmaps, the consensus among experts (as of 2026) is roughly:

  • 2025-2027: Demonstrations of small-scale fault-tolerant systems (tens of logical qubits) that can execute simple algorithms with error correction. We are currently in this phase.
  • 2028-2032: Intermediate-scale fault-tolerant systems (hundreds of logical qubits) that can run quantum algorithms offering advantage on some practical problems. This is the "utility era."
  • 2033+: Large-scale fault-tolerant systems (thousands of logical qubits) that can tackle the flagship applications: breaking cryptography, simulating complex molecules, solving large optimization problems.

Caveat. Every quantum computing timeline in history has been optimistic. In 2000, experts predicted fault-tolerant QC by 2020. In 2015, predictions shifted to "5-10 years." In 2025, the target is again "5-10 years." This persistent optimism should be tempered with the understanding that unforeseen engineering challenges may extend these timelines. The physics works; the question is whether the engineering can be made to work at scale and at reasonable cost.

[Figure: Industry Roadmap - Fault-Tolerant QC Milestones]

39.4 QC and Fundamental Physics

Quantum computing is not just a technology project - it is deeply connected to fundamental questions about the nature of computation and physical reality.

The Extended Church-Turing Thesis

The classical Church-Turing thesis states that any effectively computable function can be computed by a Turing machine. The extended Church-Turing thesis adds that a Turing machine can efficiently simulate any physical process. If quantum computers provide exponential speedups for some problems (as we believe for factoring and simulation), the extended Church-Turing thesis is false. This would be a profound statement about the computational power embedded in the laws of physics.

Open Problem 3: Quantum Computational Supremacy (Formal). Is BQP (the class of problems efficiently solvable by quantum computers) strictly larger than BPP (the class efficiently solvable by classical computers)? Proving BQP $\neq$ BPP would be a major result in computational complexity theory. It remains unproven, though widely believed to be true.
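For context, the known unconditional containments sandwich BQP between classical randomized polynomial time and polynomial space; none of these inclusions is known to be strict:

$$\text{P} \subseteq \text{BPP} \subseteq \text{BQP} \subseteq \text{PP} \subseteq \text{PSPACE}$$

The first inclusion holds because a quantum computer can simulate classical randomness, and the last two because quantum amplitudes can be summed classically in polynomial space. This chain also explains the difficulty: since a proof of BPP $\neq$ BQP would immediately imply P $\neq$ PSPACE, it would require separating classes that complexity theorists have been unable to separate for decades.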

Quantum Gravity and Black Holes

Perhaps the most surprising open problem is the connection between quantum computing and quantum gravity. The AdS/CFT correspondence (a conjecture in theoretical physics relating gravity in anti-de Sitter space to a quantum field theory on its boundary) suggests that spacetime itself may be an emergent phenomenon arising from quantum entanglement. Tensor networks - the same mathematical objects used in quantum error correction and quantum simulation - appear naturally in the structure of this correspondence.

The black hole information paradox asks whether information that falls into a black hole is truly lost. Recent theoretical work (notably the "island formula" and related results) suggests that quantum error correction plays a fundamental role in how information is preserved in black hole physics. The Hayden-Preskill protocol shows that for an old black hole - one maximally entangled with Hawking radiation an outside observer has already collected - information thrown in can be rapidly reconstructed from just a few additional quanta of radiation, provided the observer can perform the decoding as a quantum computation.

These connections suggest that building quantum computers is not just an engineering goal but a way to probe the deepest structure of physical reality. A large-scale quantum computer capable of simulating quantum gravity could test ideas about the nature of spacetime, the emergence of gravity from entanglement, and the resolution of the information paradox.

Quantum Biology

Another frontier is quantum biology - the question of whether living systems exploit quantum effects for functional purposes. Quantum coherence has been observed in photosynthetic light-harvesting complexes, bird navigation may involve radical pair mechanisms (a quantum chemical effect), and enzyme catalysis may exploit quantum tunneling. These are all systems that quantum computers could simulate far more effectively than classical computers, potentially resolving decades-old debates.


39.5 The Talent Gap

The quantum computing industry faces a workforce challenge that could slow progress as much as any technical obstacle. Building quantum computers requires expertise spanning physics, engineering, computer science, and mathematics - a rare combination of skills that traditional educational programs do not produce in sufficient numbers.

The Numbers

Industry forecasts project approximately 250,000 quantum roles by 2030 and up to 840,000 by 2035. The current global quantum workforce is estimated at roughly 30,000 people. For every qualified quantum professional, there are approximately three open positions. The MIT Quantum Index Report 2025 found that the share of US job postings requiring quantum skills nearly tripled from 2011 to mid-2024.

Yet the pipeline is narrow. A survey found that only 12% of respondents had formal training in quantum-relevant areas. PhD programs in quantum information science produce a few hundred graduates per year globally. And many of those graduates are recruited by a small number of well-funded companies, leaving startups, national labs, and universities struggling to staff their programs.

Addressing the Gap

Several initiatives are working to close the talent gap:

  • Undergraduate programs: Universities are introducing quantum information science courses at the undergraduate level, reaching students before they specialize.
  • Bootcamps and certifications: IBM, Google, and others offer quantum computing certification programs accessible without a physics PhD.
  • Open-source community: Contributions to open-source quantum software (Qiskit, Cirq, PennyLane) provide an alternative path to building quantum expertise.
  • Cross-training: Classical software engineers, hardware engineers, and data scientists can transition into quantum roles with targeted training, bringing valuable skills from adjacent fields.

Open Problem 4: Scaling the Workforce. The quantum computing industry needs to grow its workforce by roughly 10x within a decade. This requires not just more PhD programs but new educational models that produce "quantum-ready" engineers, developers, and researchers at the undergraduate and professional levels. The field that solves its talent problem first may win the quantum race.