Chapter 35: Advanced Quantum Cryptography
In Chapter 8 we met the foundational quantum key distribution protocols - BB84 and E91 - which allow two parties to establish a shared secret key with security guaranteed by the laws of physics. But the world of quantum cryptography extends far beyond key distribution. This chapter explores four frontiers: device-independent cryptography, which provides security even when you do not trust your own hardware; quantum digital signatures and quantum money, which exploit the no-cloning theorem for authentication and unforgeable currency; quantum random number generation, which produces truly unpredictable randomness certified by quantum mechanics; and the post-quantum cryptography standards that will protect classical communication against future quantum computers.
We conclude with the practical challenge of migrating the world's cryptographic infrastructure to quantum-safe algorithms - a process that is already underway and will define cybersecurity for decades to come.
You should be familiar with quantum key distribution (Chapter 8), Bell states and Bell inequalities (Chapters 6-7), the no-cloning theorem (Chapter 6), and basic computational complexity (Chapter 1). Some familiarity with classical public-key cryptography (RSA, Diffie-Hellman) will help in Section 35.4.
35.1 Device-Independent Cryptography
Standard QKD protocols like BB84 assume that the devices (photon sources, detectors, beam splitters) behave exactly as specified. In practice, real devices have imperfections that an adversary could exploit. Side-channel attacks - timing variations, detector blinding, Trojan horse attacks on sources - have been demonstrated against commercial QKD systems. Device-independent (DI) cryptography eliminates this entire class of attack by deriving security from the observed statistics of measurement outcomes alone, with no assumptions about the internal workings of the devices.
The Power of Bell Inequality Violations
The foundation of device-independent security is the Bell inequality, specifically the CHSH (Clauser-Horne-Shimony-Holt) inequality. If Alice and Bob each make one of two possible measurements on their respective halves of a shared state, the CHSH quantity is defined as
$$S = |E(a_0, b_0) + E(a_0, b_1) + E(a_1, b_0) - E(a_1, b_1)|$$
where $E(a_i, b_j)$ is the correlation between Alice's measurement choice $a_i$ and Bob's choice $b_j$. Any local hidden-variable model predicts $S \leq 2$. Quantum mechanics allows a maximum of $S = 2\sqrt{2} \approx 2.828$ (the Tsirelson bound).
The crucial insight for cryptography: if Alice and Bob observe $S > 2$, their measurement outcomes contain genuine quantum randomness that no eavesdropper can predict, regardless of how the devices were manufactured. Even if Eve built the devices herself, the violation of the Bell inequality proves that the outcomes are not predetermined.
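To make this concrete, here is a minimal NumPy sketch (ours, not part of any protocol standard) that computes $S$ for the Bell state $|\Phi^+\rangle = \frac{1}{\sqrt{2}}(|00\rangle + |11\rangle)$, with spin measurements in the $x$-$z$ plane at the textbook angles that saturate the Tsirelson bound:

```python
import numpy as np

# Pauli operators and the Bell state |Phi+> = (|00> + |11>)/sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

def observable(theta):
    """Spin measurement along angle theta in the x-z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

def correlation(a, b):
    """E(a, b) = <Phi+| A(a) (x) B(b) |Phi+> = cos(a - b)."""
    op = np.kron(observable(a), observable(b))
    return float(phi_plus @ op @ phi_plus)

# Standard optimal settings for |Phi+>
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4

S = abs(correlation(a0, b0) + correlation(a0, b1)
        + correlation(a1, b0) - correlation(a1, b1))
print(f"S = {S:.4f}")   # ~2.8284 = 2*sqrt(2), the Tsirelson bound
```

Varying the four angles shows that no choice pushes $S$ above $2\sqrt{2}$, while many choices exceed the classical bound of 2.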
Device-independent security is the strongest form of quantum cryptographic security. It requires only two assumptions: (1) quantum mechanics is correct, and (2) Alice's and Bob's laboratories are shielded from information leakage. The devices themselves can be treated as untrusted black boxes.
Device-Independent QKD Protocol
A DI-QKD protocol operates as follows:
- In each round, a source (potentially untrusted) distributes an entangled pair to Alice and Bob.
- Alice randomly chooses measurement $a_0$ or $a_1$; Bob randomly chooses $b_0$ or $b_1$. They record their binary outcomes.
- After many rounds, they publicly announce a randomly chosen subset of their measurement choices and outcomes to estimate the CHSH value $S$.
- If $S > 2$ (with statistical significance), they proceed; otherwise, they abort.
- The remaining (unannounced) rounds where both used designated "key" settings form the raw key. Classical error correction and privacy amplification produce the final secret key.
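As a toy illustration of the test rounds, the Monte Carlo sketch below assumes an honest source emitting $|\Phi^+\rangle$ and samples $\pm 1$ outcomes with the quantum correlation $E(a, b) = \cos(a - b)$. It estimates $S$ from randomly chosen settings, just as Alice and Bob would from their announced subset; finite-statistics analysis, error correction, and privacy amplification are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Measurement angles: a0, a1 for Alice; b0, b1 for Bob (test settings).
A = [0.0, np.pi / 2]
B = [np.pi / 4, -np.pi / 4]

def sample_round(a, b):
    """Sample correlated +/-1 outcomes for |Phi+>: E(a, b) = cos(a - b)."""
    E = np.cos(a - b)
    x = rng.choice([-1, 1])                      # Alice's outcome (uniform)
    y = x if rng.random() < (1 + E) / 2 else -x  # Bob matches with prob (1+E)/2
    return x, y

n_rounds = 100_000
sums = np.zeros((2, 2))
counts = np.zeros((2, 2))
for _ in range(n_rounds):
    i, j = rng.integers(2), rng.integers(2)      # random setting choices
    x, y = sample_round(A[i], B[j])
    sums[i, j] += x * y
    counts[i, j] += 1

E_hat = sums / counts
S_hat = abs(E_hat[0, 0] + E_hat[0, 1] + E_hat[1, 0] - E_hat[1, 1])
print(f"Estimated S = {S_hat:.3f}")              # ~2.83 for an honest source
print("proceed" if S_hat > 2 else "abort")
```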
The key rate (bits of secret key per round) depends on the observed CHSH violation: higher $S$ values yield more key. At the Tsirelson bound $S = 2\sqrt{2}$, the outcomes are completely unpredictable to any eavesdropper and the key rate is maximal.
Experimental Progress
DI-QKD is extraordinarily demanding experimentally because it requires a loophole-free Bell test - closing both the locality loophole (Alice and Bob must be space-like separated during measurement) and the detection loophole (detection efficiency must exceed a threshold, typically around 82% for CHSH).
The first loophole-free Bell tests were achieved in 2015 by groups in Delft (electron spins in NV centers in diamond), at NIST, and in Vienna (the latter two using entangled photons). Since then, loophole-free Bell violations have been demonstrated with superconducting circuits (2023) and cavity-QED systems. DI-QKD itself has been demonstrated in proof-of-principle experiments, though key rates remain far below those of standard QKD. Practical DI-QKD at useful rates remains an active research goal.
Semi-Device-Independent and Measurement-Device-Independent QKD
Fully device-independent QKD is the gold standard but is experimentally challenging. Intermediate approaches offer practical advantages:
- Measurement-device-independent (MDI) QKD: Removes all assumptions about the measurement devices (detectors), which are the most vulnerable components. Alice and Bob both send quantum states to an untrusted relay (Charlie) who performs a Bell-state measurement. Security is guaranteed regardless of Charlie's honesty.
- Semi-device-independent QKD: Makes minimal assumptions (e.g., bounding the dimension of the transmitted quantum system) while relaxing the requirements of full DI-QKD.
35.2 Quantum Digital Signatures and Quantum Money
Classical digital signatures rely on computational hardness assumptions (factoring large integers, discrete logarithms) that a quantum computer could break. Quantum mechanics offers alternatives based on physical, rather than computational, security.
Quantum Digital Signatures
A quantum digital signature (QDS) scheme allows a signer to authenticate a message such that any recipient can verify the signature, but no one can forge it. The security rests on the no-cloning theorem: the signer distributes quantum "signature states" to the verifiers in advance. Because these states cannot be perfectly copied, they cannot be forged.
A simplified QDS protocol works as follows:
- Distribution phase: For each possible one-bit message (0 or 1), the signer prepares many copies of a quantum state chosen from a set of non-orthogonal states (e.g., $|0\rangle, |+\rangle$). She sends half to recipient Bob and half to recipient Charlie.
- Messaging phase: To sign a bit $b$, the signer announces $b$ along with the classical description of the states she used for that bit value.
- Verification phase: Recipients use their stored quantum states to check consistency with the announced description. Because non-orthogonal states cannot be perfectly distinguished or cloned, a forger who does not know the signer's choices cannot produce a valid signature.
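The following sketch (an idealized toy, assuming noiseless states and a forger who simply guesses each state) illustrates why forgery is detectable: a holder of the genuine states never fails the consistency check, while a guessing forger fails on roughly a quarter of positions, since a wrong guess occurs half the time and a wrong state survives the check only with probability $|\langle 0|+\rangle|^2 = 1/2$.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000  # signature states per message bit

# Signer's secret choices: 0 -> |0>, 1 -> |+> (non-orthogonal states).
signer = rng.integers(2, size=n)

def mismatch_fraction(held):
    """Fraction of held states that fail the consistency check.

    Measuring the correct state in its own basis always passes; holding
    the wrong state fails with probability 1/2, since |<0|+>|^2 = 1/2.
    """
    wrong = held != signer
    fails = wrong & (rng.random(n) < 0.5)
    return fails.mean()

forger_guess = rng.integers(2, size=n)  # a forger must guess every choice

print(f"honest mismatch rate: {mismatch_fraction(signer):.4f}")        # 0.0000
print(f"forger mismatch rate: {mismatch_fraction(forger_guess):.4f}")  # ~0.25
```

Real QDS protocols accept signatures below a small mismatch threshold (to tolerate noise) and reject above it, with the gap between 0 and ~25% providing the security margin.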
QDS schemes have been experimentally demonstrated over fiber networks spanning tens of kilometers.
Quantum Money
The concept of quantum money was proposed by Stephen Wiesner in the early 1970s (though not published until 1983), making it one of the earliest ideas in quantum information science - it predates quantum computing and even quantum key distribution.
The idea is elegant: a banknote consists of a classical serial number and a sequence of qubits prepared in states chosen from conjugate bases (e.g., $|0\rangle$, $|1\rangle$, $|+\rangle$, $|-\rangle$). The bank keeps a secret record mapping each serial number to the basis choices. To verify a bill, the bank measures each qubit in the correct basis. A genuine bill passes verification with certainty. A counterfeiter who does not know the bases cannot clone the qubits (no-cloning theorem) and any attempt to measure and re-prepare them will introduce errors detectable by the bank.
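A small simulation makes this concrete. Below we model only the simplest measure-and-resend attack (real security analyses must consider all strategies): the counterfeiter guesses a basis for each qubit, measures, and re-prepares. Each qubit then passes the bank's verification with probability $3/4$, so a 20-qubit bill is accepted with probability $(3/4)^{20} \approx 0.3\%$.

```python
import numpy as np

rng = np.random.default_rng(42)
n_qubits = 20        # qubits per banknote
n_trials = 100_000   # counterfeit attempts

# Bank's secret record: a random basis (Z or X) for each qubit.
bases = rng.integers(2, size=(n_trials, n_qubits))

# Measure-and-resend attack: the counterfeiter guesses a basis per qubit.
guesses = rng.integers(2, size=(n_trials, n_qubits))

# Right basis guess -> the re-prepared qubit always passes verification.
# Wrong basis -> the re-prepared qubit passes only with probability 1/2.
wrong = guesses != bases
passes = np.where(wrong, rng.random((n_trials, n_qubits)) < 0.5, True)
fake_accepted = passes.all(axis=1).mean()

print(f"simulated acceptance: {fake_accepted:.6f}")
print(f"theory (3/4)^{n_qubits}:  {(3 / 4) ** n_qubits:.6f}")  # ~0.0032
```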
The original scheme requires the bank itself to verify each bill (private verification). More recent "public-key quantum money" schemes aim to let anyone verify a bill without contacting the bank, though constructing provably secure public-key quantum money from standard assumptions remains an open problem.
Practical Challenges
Quantum money faces severe practical challenges. The qubits on the bill must maintain coherence for the lifetime of the currency - far beyond current quantum memory capabilities. Quantum digital signatures are more practical because the quantum states only need to survive long enough for the protocol to complete (minutes to hours, not months or years). Both technologies are primarily of theoretical interest today, but they illustrate the power of quantum mechanics for cryptographic tasks beyond key distribution.
35.3 Quantum Random Number Generation
Randomness is the lifeblood of cryptography. Every key generation protocol, every nonce, every initialization vector depends on access to unpredictable random bits. Classical random number generators are at best unpredictable in practice: software PRNGs are deterministic given their internal state, and even classical hardware noise sources follow deterministic (if chaotic) dynamics. True randomness - fundamentally unpredictable, not merely computationally unpredictable - requires a physical source. Quantum mechanics provides exactly this: the outcome of measuring a qubit in superposition is genuinely random, with no local hidden variable determining the result (as demonstrated by Bell inequality violations).
Basic QRNG: Hadamard-Measure
The simplest quantum random number generator applies a Hadamard gate to $|0\rangle$, creating the superposition $|+\rangle = \frac{1}{\sqrt{2}}(|0\rangle + |1\rangle)$, then measures. The outcome is 0 or 1 with exactly equal probability, and - crucially - this unpredictability is fundamental: no amount of information about the universe's prior state can predict the outcome.
Sampling four such qubits at a time yields a four-bit string per run. Over many runs, each of the 16 possible outcomes ($0000$ through $1111$) should appear with approximately equal probability ($1/16 = 6.25\%$). This flat distribution is the signature of genuine quantum randomness.
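A minimal simulation, assuming ideal gates and measurement (a classical pseudo-random generator stands in for the physical measurement when sampling):

```python
import numpy as np
from collections import Counter

# Four-qubit statevector after H on each qubit: uniform over 16 basis states.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])
state = np.array([1.0])
for _ in range(4):
    state = np.kron(state, H @ ket0)

probs = np.abs(state) ** 2                       # each exactly 1/16
rng = np.random.default_rng()                    # classical stand-in for nature
samples = rng.choice(16, size=16_000, p=probs)

for k, count in sorted(Counter(samples).items()):
    print(f"{int(k):04b}  {count / 16_000:6.2%}")  # each close to 6.25%
```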
Certifiable Randomness
A deeper question: how can you prove that the output of a QRNG is truly random? Perhaps the device is defective and produces biased output, or perhaps an adversary has tampered with it. Certifiable randomness uses Bell inequality violations to prove that the output contains genuine entropy, regardless of device imperfections.
The argument is elegant: if Alice and Bob's measurement outcomes violate a Bell inequality, those outcomes must contain randomness that cannot be predicted by any third party (even one who built the device). This randomness can be extracted using classical post-processing (randomness extractors). The amount of certifiable randomness is quantified by the min-entropy, which is a function of the observed Bell violation.
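One widely used bound, from the device-independent randomness work of Pironio et al. (Nature, 2010), certifies at least $1 - \log_2\!\left(1 + \sqrt{2 - S^2/4}\right)$ bits of min-entropy per round. The sketch below evaluates it: at $S = 2$ (achievable classically) it certifies nothing, and at the Tsirelson bound it certifies a full bit per round.

```python
import numpy as np

def min_entropy_bound(S):
    """Certified min-entropy per round as a function of the observed
    CHSH value S (bound from Pironio et al., Nature 2010)."""
    # Clamp at 0 to guard against floating-point slips past S^2/4 = 2.
    return 1 - np.log2(1 + np.sqrt(np.maximum(0.0, 2 - S**2 / 4)))

for S in (2.0, 2.4, 2 * np.sqrt(2)):
    print(f"S = {S:.3f}  ->  {min_entropy_bound(S):.3f} certified bits/round")
```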
Several companies (ID Quantique, Toshiba, KETS Quantum Security) sell commercial QRNG devices that produce random bits at rates of hundreds of megabits per second. These typically use photonic implementations: a single photon hitting a beam splitter and being detected on one of two paths. While commercial devices generally use simpler certification methods (not full Bell tests), they provide randomness quality far exceeding classical alternatives.
QRNG Circuit with Tunable Bias
In practice, a QRNG might not produce perfectly balanced output due to device imperfections. The simulation below shows how a rotation angle $\theta$ controls the bias. At $\theta = \pi/2$ (the Hadamard point), the output is perfectly balanced. Deviations from $\pi/2$ introduce bias that a randomness extractor must remove.
When $\theta = 0$, the output is always 0 (no randomness). When $\theta = \pi \approx 3.14$, the output is always 1. At $\theta = \pi/2 \approx 1.57$, you get a perfect 50/50 split - maximum entropy. In a real QRNG, calibration and post-processing ensure the final output is unbiased even if the raw source has slight imperfections.
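A minimal sketch of this tunable-bias source, assuming an ideal $R_y(\theta)$ rotation so that $P(1) = \sin^2(\theta/2)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a biased bit, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def qrng_stats(theta, shots=100_000, seed=0):
    """Simulate measuring R_y(theta)|0> = cos(t/2)|0> + sin(t/2)|1>."""
    p1 = np.sin(theta / 2) ** 2          # probability of outcome 1
    rng = np.random.default_rng(seed)
    bits = rng.random(shots) < p1
    p1_hat = bits.mean()
    return p1_hat, entropy(p1_hat)

for theta in (0.0, np.pi / 4, np.pi / 2, np.pi):
    p1, h = qrng_stats(theta)
    print(f"theta = {theta:.2f}  P(1) = {p1:.3f}  entropy = {h:.3f} bits")
```

The entropy column quantifies how much work a randomness extractor has to do: at $\theta = \pi/2$ the raw output already carries a full bit per sample, while biased settings carry less.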
35.4 Post-Quantum Cryptography Deep Dive
While quantum cryptography uses quantum mechanics for security, post-quantum cryptography (PQC) uses classical algorithms designed to be secure against quantum computers. The distinction is critical: PQC does not require quantum hardware. It runs on today's classical computers and networks, but its security is based on mathematical problems believed to be hard even for quantum computers.
The urgency is real. Shor's algorithm (Chapter 11) can factor integers and compute discrete logarithms in polynomial time on a quantum computer, breaking RSA, DSA, Diffie-Hellman, and elliptic-curve cryptography - the foundations of nearly all current public-key infrastructure. Although large-scale fault-tolerant quantum computers do not yet exist, the "harvest now, decrypt later" threat means that encrypted data captured today could be decrypted in the future.
The NIST Standardization Process
In 2016, the U.S. National Institute of Standards and Technology (NIST) launched a multi-year process to standardize post-quantum cryptographic algorithms. From 82 initial submissions, NIST evaluated candidates across multiple rounds, considering security, performance, and implementation characteristics. In August 2024, NIST published its first three post-quantum cryptography standards:
| Standard | Algorithm | Type | Hardness Basis | Use Case |
|---|---|---|---|---|
| FIPS 203 | ML-KEM (formerly CRYSTALS-Kyber) | Key Encapsulation | Module lattices (M-LWE) | Key exchange, encryption |
| FIPS 204 | ML-DSA (formerly CRYSTALS-Dilithium) | Digital Signature | Module lattices (M-LWE, M-SIS) | Authentication, code signing |
| FIPS 205 | SLH-DSA (formerly SPHINCS+) | Digital Signature | Hash functions | Conservative fallback signatures |
A fourth standard, FIPS 206 (FN-DSA, based on FALCON), was under development as of late 2024. Additionally, NIST selected HQC (a code-based key encapsulation mechanism) in March 2025 for future standardization, providing diversity in the cryptographic toolkit.
Lattice-Based Cryptography (ML-KEM and ML-DSA)
The two primary NIST standards are both built on lattice problems. A lattice is a regular grid of points in high-dimensional space, defined by a set of basis vectors. Lattice-based cryptography relies on two problems believed to be hard for both classical and quantum computers:
- Learning With Errors (LWE): Given a system of approximate linear equations over a finite field - $\mathbf{b} = \mathbf{A}\mathbf{s} + \mathbf{e}$, where $\mathbf{e}$ is a small error vector - recover the secret $\mathbf{s}$. The module variant (M-LWE) works over polynomial rings for efficiency. (A toy instance appears after this list.)
- Short Integer Solution (SIS): Find a short non-zero vector $\mathbf{x}$ such that $\mathbf{A}\mathbf{x} = \mathbf{0} \pmod{q}$. This is related to finding short vectors in lattices.
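A toy LWE instance (with parameters far too small to be secure, chosen only for illustration) shows the structure: without the error term the secret falls to Gaussian elimination, while with it every equation holds only approximately.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy LWE parameters -- far too small to be secure, for illustration only.
n, m, q = 8, 16, 97

s = rng.integers(q, size=n)                  # secret vector
A = rng.integers(q, size=(m, n))             # public random matrix
e = rng.integers(-1, 2, size=m)              # small error in {-1, 0, 1}
b = (A @ s + e) % q                          # noisy inner products

# Without the error, s would be recoverable by Gaussian elimination.
# With it, each equation is only *approximately* satisfied:
residual = (b - A @ s) % q
print(residual)   # entries in {0, 1, q-1}: the small error, nothing more
```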
ML-KEM (FIPS 203) is a key encapsulation mechanism. Alice generates a public-private key pair based on a lattice. Bob encrypts a random session key using Alice's public key, producing a ciphertext that only Alice can decrypt with her private key. The security relies on M-LWE: without the private key, recovering the session key requires solving a hard lattice problem.
Key sizes are larger than classical counterparts (public keys around 800-1,568 bytes depending on security level) but far smaller than those of other post-quantum KEM families such as code-based schemes. Encapsulation and decapsulation are fast - comparable to or faster than RSA.
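To see how a lattice KEM can work, here is a toy Regev-style scheme - the conceptual core on which ML-KEM builds (the real standard adds module structure over polynomial rings, compression, and the Fujisaki-Okamoto transform, and uses entirely different parameter sets). The parameters below are illustrative and insecure.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, q = 16, 32, 3329        # toy parameters; q happens to match ML-KEM's

# --- Key generation (Alice) ---
s = rng.integers(q, size=n)                   # private key
A = rng.integers(q, size=(m, n))
e = rng.integers(-1, 2, size=m)               # small noise
b = (A @ s + e) % q                           # public key is (A, b)

# --- Encryption of one bit (Bob, using only the public key) ---
def encrypt(bit):
    r = rng.integers(2, size=m)               # random 0/1 combination of rows
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q          # hide the bit at q/2 scale
    return u, v

# --- Decryption (Alice, using the private key) ---
def decrypt(u, v):
    d = (v - u @ s) % q                       # = bit*(q//2) + r.e (small)
    return int(q // 4 < d < 3 * q // 4)       # round to 0 or q/2

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("round-trip OK for both bits")
```

Without $\mathbf{s}$, recovering the bit from $(u, v)$ requires untangling the noisy combination $r \cdot (\mathbf{A}\mathbf{s} + \mathbf{e})$ - an LWE instance.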
ML-DSA (FIPS 204) is a digital signature scheme. The signer generates a key pair from a lattice. Signing a message requires finding a short lattice vector related to the message hash, which is easy with the private key but hard without it. Verification checks that the signature is a valid short vector, which is efficient. Signature sizes range from about 2.4 to 4.6 KB depending on security level.
Hash-Based Signatures (SLH-DSA)
SLH-DSA (FIPS 205, formerly SPHINCS+) takes a fundamentally different approach. Its security rests solely on the properties of cryptographic hash functions - specifically, the difficulty of finding preimages and second preimages. Hash functions are among the most well-studied cryptographic primitives, and their security against quantum attacks is well understood: Grover's algorithm provides at most a quadratic speedup, which is countered by doubling the hash output length.
SLH-DSA uses a hypertree structure: a tree of Merkle trees, where each leaf is a one-time signature (based on WOTS+, a Winternitz scheme). The hypertree allows signing a large number of messages with a single public key. The trade-off is size: signatures are 7-50 KB (much larger than lattice-based signatures), and signing is slower. However, SLH-DSA serves as a critical conservative backup - if lattice problems turn out to be easier than believed (due to new mathematical insights or quantum algorithms), hash-based signatures remain secure.
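The flavor of hash-based signing is easy to capture in code. Below is a toy Lamport one-time signature - a conceptual ancestor of the WOTS+ scheme used inside SLH-DSA. Revealing preimages means each key pair can safely sign only one message; the hypertree in SLH-DSA exists precisely to lift this restriction.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    """Private key: 2 x 256 random strings. Public key: their hashes."""
    sk = [[secrets.token_bytes(32) for _ in range(256)] for _ in range(2)]
    pk = [[H(x) for x in row] for row in sk]
    return sk, pk

def sign(message: bytes, sk):
    """Reveal one preimage per bit of the message digest."""
    digest = H(message)
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return [sk[bit][i] for i, bit in enumerate(bits)]

def verify(message: bytes, sig, pk) -> bool:
    digest = H(message)
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return all(H(s) == pk[bit][i] for i, (bit, s) in enumerate(zip(bits, sig)))

sk, pk = keygen()
sig = sign(b"firmware v2.1", sk)
print(verify(b"firmware v2.1", sig, pk))   # True
print(verify(b"firmware v2.2", sig, pk))   # False (with overwhelming prob.)
```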
Code-Based Cryptography
Code-based cryptography relies on the difficulty of decoding random linear error-correcting codes - a problem that has been studied since McEliece proposed his cryptosystem in 1978. NIST selected HQC (Hamming Quasi-Cyclic) in March 2025 for future standardization as a key encapsulation mechanism. HQC provides an alternative to lattice-based key exchange, ensuring that the post-quantum toolkit is not dependent on a single mathematical assumption.
NIST deliberately standardized algorithms based on different mathematical problems (lattices, hash functions, codes) so that a breakthrough against one family does not compromise the entire PQC infrastructure. This diversification strategy mirrors good engineering practice: never rely on a single point of failure.
Comparison of PQC Families
| Family | Hardness Assumption | Key Size | Signature/Ciphertext Size | Speed | Maturity |
|---|---|---|---|---|---|
| Lattice | LWE / SIS | Small-medium | Small-medium | Fast | NIST standardized (2024) |
| Hash-based | Hash preimage resistance | Small | Large | Moderate | NIST standardized (2024) |
| Code-based | Decoding random linear codes | Large | Medium | Fast | NIST selected (2025) |
| Isogeny-based | Isogenies of elliptic curves | Very small | Very small | Slow | SIKE broken (2022) |
| Multivariate | Solving multivariate polynomials | Large | Small | Variable | Research stage |
Post-quantum cryptography and quantum cryptography are not the same thing. PQC uses classical algorithms on classical computers to resist quantum attacks. Quantum cryptography (QKD, quantum money, etc.) uses quantum mechanics directly for security. Both will likely coexist in future security architectures: PQC for software/network compatibility, QKD for the highest-security links.
35.5 The Migration Challenge
Knowing which algorithms to use is only half the battle. Migrating the world's cryptographic infrastructure from classical to post-quantum algorithms is an enormous practical challenge - one that NIST has called "one of the most significant transitions in the history of cryptography."
The "Harvest Now, Decrypt Later" Threat
An adversary can record encrypted communications today and store them until a sufficiently powerful quantum computer becomes available, then decrypt them retroactively. For data that must remain confidential for decades (government secrets, medical records, financial data, intellectual property), the migration to PQC must happen before quantum computers arrive - not after.
This is not a theoretical concern. Intelligence agencies are widely believed to be stockpiling encrypted traffic today. NIST's draft IR 8547 (2024) recommends that organizations begin transitioning to PQC immediately, with a target of deprecating vulnerable algorithms by 2030 and disallowing them by 2035.
The urgency can be captured in a simple inequality, often attributed to Michele Mosca: if the time to migrate ($T_{\text{migrate}}$) plus the time data must remain secret ($T_{\text{secret}}$) exceeds the time until a cryptographically relevant quantum computer exists ($T_{\text{quantum}}$), then migration should already have begun. For long-lived secrets, $T_{\text{migrate}} + T_{\text{secret}} > T_{\text{quantum}}$ is plausibly already true.
Technical Challenges
The migration involves far more than swapping out algorithms:
- Larger key and signature sizes. PQC keys and signatures are significantly larger than their classical counterparts. ML-KEM public keys are 800-1,568 bytes vs. 32 bytes for X25519. ML-DSA signatures are 2.4-4.6 KB vs. 64 bytes for Ed25519. This affects bandwidth, storage, and protocol message sizes.
- Protocol modifications. TLS, SSH, IPsec, S/MIME, and other protocols must be updated to support PQC algorithm negotiation, larger handshake messages, and hybrid modes.
- Hybrid mode. During the transition, many deployments will use hybrid key exchange - combining a classical algorithm (e.g., X25519) with a PQC algorithm (e.g., ML-KEM) so that the connection is secure even if one algorithm is broken. Chrome, Firefox, and Cloudflare have already deployed hybrid key exchange (X25519 + ML-KEM) in TLS. A minimal sketch of the key-combination step appears after this list.
- Embedded and IoT devices. Devices with limited memory, processing power, and bandwidth (smart cards, sensors, industrial controllers) may struggle with the larger footprints of PQC algorithms.
- Cryptographic agility. Systems should be designed so that algorithms can be swapped without replacing the entire system - a property called "crypto agility" that many legacy systems lack.
- Validation and certification. Every implementation must be validated against the standards (NIST's CMVP program), tested for side-channel resistance, and certified for use in regulated environments.
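As referenced in the hybrid-mode item above, the essential trick is deriving one session key from both shared secrets, so that an attacker must break both algorithms to recover it. The sketch below shows the idea with a plain hash; real protocols such as hybrid TLS 1.3 instead fold the concatenated secrets into the handshake key schedule. The function name and values are hypothetical placeholders.

```python
import hashlib

def combine_shared_secrets(ss_classical: bytes, ss_pqc: bytes,
                           transcript: bytes) -> bytes:
    """Derive one session key from both shared secrets.

    The derived key stays secret as long as *either* input does:
    an attacker must recover both the classical (e.g. X25519) and the
    PQC (e.g. ML-KEM) secret. Binding in the handshake transcript ties
    the key to this particular exchange.
    """
    return hashlib.sha256(ss_classical + ss_pqc + transcript).digest()

# Hypothetical values standing in for the two key-exchange outputs.
key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32, b"handshake-transcript")
print(key.hex())
```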
The Road Ahead
The PQC migration will take years, if not decades, to complete fully. The recommended approach is phased:
- Inventory: Catalog all cryptographic assets - where RSA, ECDH, and ECDSA are used, which data has long-term confidentiality requirements.
- Prioritize: Migrate highest-risk systems first (key exchange for long-lived secrets, signatures on firmware updates).
- Hybrid deployment: Use hybrid classical + PQC schemes during the transition to ensure security against both classical and quantum adversaries.
- Full migration: Once PQC implementations are mature and validated, remove classical algorithms entirely.
Quantum key distribution provides information-theoretic security (Chapter 8) but requires dedicated quantum hardware and channels. PQC provides computational security but works on existing classical infrastructure. A defense-in-depth strategy uses both: PQC for general-purpose communication, and QKD for the most sensitive links where the investment in quantum infrastructure is justified.