# Thinking about Quantum x Blockchains
Note: This hasn't been reviewed by an expert, and is just from me skimming through papers. There is likely some error here in my very simplified mental models. However, I think this model is more complete than most non-academic content online about this topic.
## What are the powers of a quantum adversary?
- There are a couple of key algorithms here, notably Shor's and Grover's. The main thing they enable is **prime factorization and discrete logarithms**. They cannot help undo hashes (as far as we know).
- Specifically, given a public key, they can derive the private key. This is what breaks back-secrecy of any [deterministic signature scheme on ECDSA](https://docs.google.com/document/d/1Q9nUNGaeiKoZYAiN9ndh4iE9e-_ql-Rf7MESTo7UB8s/edit).
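To make the power above concrete: Shor's only quantum ingredient is order-finding, and the reduction from order-finding to factoring is purely classical. This sketch brute-forces the order classically (the step a quantum computer does exponentially faster); the numbers are toy-sized for illustration.

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n, found by classical brute force.
    This is the only step where Shor's algorithm needs a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_toy(n, a):
    """Classical skeleton of Shor's reduction: order-finding -> factors.
    Assumes gcd(a, n) == 1; returns None on an unlucky choice of base."""
    r = order(a, n)
    if r % 2 == 1:
        return None  # odd order: retry with another base
    y = pow(a, r // 2, n)
    factors = sorted({gcd(y - 1, n), gcd(y + 1, n)} - {1, n})
    return factors or None

print(shor_classical_toy(15, 7))  # order of 7 mod 15 is 4 -> factors 3 and 5
```

Nothing here helps against a hash: a hash has no periodic algebraic structure for order-finding to exploit, which is the intuition behind the claim above.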
## What happens to blockchains?
- For Bitcoin (and Ethereum), an address only reveals its public key once it has published at least one signature (usually `address = keccak_hash(pk)[0:40]`). Bitcoin is secure because UTXOs can simply act as one-time-use accounts that spend all their money at once: even if someone derives the secret key from a published signature, they will not be able to spend already-emptied UTXOs.
- Ethereum can easily transition to a secure keypair set: all accounts can sign the public key of their new account and submit it to, say, a migration smart contract, after which a hardfork moves everyone's ETH to the more secure keypair set. Smart contracts do not have public keys, only addresses (recall that even a quantum computer cannot undo that hash), so funds are safu.
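The address derivation above can be sketched in a few lines. One caveat: real Ethereum uses Keccak-256 (the pre-NIST variant of SHA-3), which is not in the Python standard library; standardized SHA3-256 is used here as a stand-in, and the public key bytes are hypothetical. The point being illustrated is only that an address is a hash truncation, so holding the address alone leaves a quantum attacker facing a preimage problem.

```python
import hashlib

def toy_address(pubkey_bytes: bytes) -> str:
    """Ethereum-style address: last 20 bytes of a hash of the public key.
    Real Ethereum uses Keccak-256; stdlib SHA3-256 is a stand-in here."""
    digest = hashlib.sha3_256(pubkey_bytes).hexdigest()
    return "0x" + digest[-40:]  # 40 hex chars == 20 bytes

# A hypothetical uncompressed public key (illustrative bytes only).
pk = bytes.fromhex("04" + "11" * 64)
# Shor's algorithm inverts pk -> sk, but gives no help inverting this hash.
print(toy_address(pk))
```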
## What parts of zero knowledge exactly are broken?
- tl;dr almost nothing.
- There is a key distinction between statistical and computational zero knowledge (and perfect zk, but that's impractical): statistical zero knowledge means that no verifier, even with unbounded compute, can distinguish between the distributions; computational means that no polynomial-time verifier can distinguish between them.
- Groth16 (and most proof systems we have in production right now) is statistically zk: [paper](https://eprint.iacr.org/2016/260.pdf). This means that even a quantum adversary with access to several past proofs cannot break zero knowledge or uncover your secret information.
- However, because they can take discrete logs, they can derive the toxic waste from just the public signals of any trusted setup ceremony. Thus, they can fake any ZK-SNARK -- we expect that any verifier currently deployed on-chain would have time to migrate to a quantum-resistant proof system before such an attack is live.
- Similarly, they can derive the discrete logs of the signals used to make IPA commitments hiding, and thus break hiding on IPA commitments. STARKs are still secure, though, since they rely on hashing.
- In fact, this can be generalized -- the reason quantum breaks soundness but not secrecy is that there is a fundamental tradeoff between zk and soundness of proofs: this fairly short [paper](https://www.cs.cmu.edu/~goyal/ConSZK.pdf) proves you can have either statistical zero knowledge or statistical soundness, but not both.
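The "soundness breaks, secrecy survives" pattern above can be made concrete with a toy Pedersen commitment. A Pedersen commitment stays perfectly hiding no matter the adversary's compute, but its binding (the soundness-flavored property) rests on a discrete log staying secret: once an adversary recovers the discrete log between the two generators, one commitment can be opened to two different messages. The group and numbers below are tiny and purely illustrative; real systems use ~256-bit elliptic-curve groups.

```python
# Toy Pedersen commitment over a small prime-order subgroup.
p = 1019           # prime modulus; g generates a subgroup of prime order q
q = 509
g = 4
s = 77             # the "toxic waste": in a real setup, log_g(h) is unknown
h = pow(g, s, p)

def commit(m, r):
    # C = g^m * h^r mod p: perfectly hiding, but only computationally binding
    return (pow(g, m, p) * pow(h, r, p)) % p

C = commit(42, 123)

# A quantum adversary runs Shor's algorithm to recover s = log_g(h).
# Since C = g^(m + s*r), any (m2, r2) with m2 + s*r2 == m + s*r (mod q)
# opens the same commitment -- so shift the randomness to fake a message:
m2 = 30
r2 = (123 + (42 - m2) * pow(s, -1, q)) % q
assert commit(m2, r2) == C  # same commitment, two openings: binding broken
print("equivocated opening:", m2, r2)
```

This mirrors the paper's dichotomy: the scheme's hiding is statistical (safe from quantum), so its binding can only be computational (broken by quantum).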
## What is going on with annealing vs qubit computers, the different quantum computing paradigms?
- There are two major quantum computing paradigms: quantum annealing and quantum computers.
- Quantum annealing involves analog superposition across all of the qubits, which slowly 'anneals' to an approximate solution.
- Pure quantum computers have superposition across only small sets of qubits, which make up quickly-changing discrete gates, but can thus calculate across all the qubits and apply intermediate error correction.
- It's a lot easier to get impressive-seeming qubit counts like 5000 on quantum annealing computers (DWAVE for instance), but they require far more bits for the same task, are usually less efficient, and are too noisy for complex calculations.
- Pure quantum computers are the ones where you've heard excitement over recently factored numbers like 15 and 35, and these have huge problems with noise (some think noise imposes an existential upper bound on the number of qubits).
## What do different algorithms like factorization, discrete log, or un-hashing look like on quantum computers?
- Annealing bounds:
	- Quantum annealing can minimize functions. For instance, one way to solve prime factorization is to minimize $(n - pq)^2$ over the bits of $p$ and $q$. This takes about $\frac14 \log^2(n)$ qubits to prime factorize $n$, but also takes time exponential in $n$, and is infeasible for large values: [2018 paper](https://arxiv.org/pdf/1804.02733.pdf).
	- For discrete log modulo $n$ (with $\log(n)$ bits), a [2021 paper](https://link.springer.com/chapter/10.1007/978-3-030-89432-0_8) shows about $2\log^2(n)$ qubits are needed on annealing-based systems, although they ran into practical connectivity issues past n = 6 bits.
	- In fact, it's likely that larger discrete logs are impossible on annealers: this [2013 paper](https://arxiv.org/pdf/1307.5893.pdf) shows that the Hamiltonian may make it impossible to convert physical qubits to logical qubits.
- Actual quantum computer bounds:
	- The bound for simple prime field discrete log is around $3n + 0.002n \log n$ qubits, where n is the number of bits (n = 256 for us): [2021 paper](https://arxiv.org/pdf/1905.09749.pdf) -- without considering noise overhead. With noise, they calculate that n = 2048-bit discrete log would take 20 million physical qubits.
	- Newer algorithms have shown that elliptic curve discrete log on a curve like secp256k1 is a bit harder, closer to $9n$ qubits: [2017 paper](https://eprint.iacr.org/2017/598). Past bounds closer to $6n$ don't explicitly describe how to do arithmetic on elliptic curves and merely provide a lower bound: [2008 paper](https://arxiv.org/pdf/quant-ph/0301141.pdf).
	- Again, these numbers count signal qubits without noise; error correction adds several orders of magnitude more qubits on top, so perhaps these initial estimates are not even relevant -- perhaps one should even omit the constant factors and use asymptotic notation here to better communicate that.
- Intuitively, why is a hash function hard for any quantum computer?
- If you write a hash function as a polynomial in the bits of the input, the resulting function has a degree that is far too high for a quantum adversary to reverse. Specifically, root finding on standard quantum computers takes $O(n \log(n))$ time on $\log (n)$ qubits, where n is the degree of the polynomial, [2015 paper](https://arxiv.org/pdf/1510.04452.pdf). While the qubit count may be within imagination, this time is absolutely infeasible (degrees of hash functions expressed as polynomials are on the order of ~$2^{16000}$ or more). Future specific quantum algorithms might provide small improvements, but we are still many orders of magnitude off.
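The annealing formulation from earlier (minimize $(n - pq)^2$ over the bits of $p$ and $q$) can be sketched with a classical brute-force minimizer. An annealer explores this same energy landscape physically instead of enumerating it; the function name and the 4-bit encoding below are just illustrative choices, not from the cited paper.

```python
from itertools import product

def factor_by_minimization(n, bits=4):
    """Enumerate the energy landscape E(p, q) = (n - p*q)^2, with p and q
    each encoded in `bits` binary variables -- the objective a quantum
    annealer would minimize. Returns the lowest-energy (E, p, q) found."""
    best = None
    for p_bits in product((0, 1), repeat=bits):
        for q_bits in product((0, 1), repeat=bits):
            p = sum(b << i for i, b in enumerate(p_bits))
            q = sum(b << i for i, b in enumerate(q_bits))
            if p < 2 or q < 2:
                continue  # exclude trivial factorizations
            energy = (n - p * q) ** 2
            if best is None or energy < best[0]:
                best = (energy, p, q)
    return best

print(factor_by_minimization(15))  # energy 0 means p*q == 15 exactly
```

Note the qubit count scales with the bits of $p$ and $q$ (cheap), but finding the global minimum is the hard part, matching the note above that the time cost is exponential.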
<!-- ## Is there promising research that would massively improve these estimates in the next n years?
- Unclear. There is work that tries to measure the qubit as it is collapsing, there is always work on better and smaller algorithms, and IBM is constantly pumping out interesting improvements. -->
## What is a reasonable timeline to expect ECDSA on secp256k1 to be broken?
- It seems that expert consensus varies from 2040 to never (if the theoretical noise problem is never overcome). Even if the noise problem is overcome, I still expect 2100 to be when we see this: it may take longer than expected to get there because of a "valley of death" -- a dearth of applications between a few dozen qubits and a few hundred thousand. There is utility on the small end for theoreticians and utility on the high end for cryptography, but very little intermediate use for qubit counts in the middle, which makes the ROI for funding much worse.
- IBM has been surprisingly accurate on [its timeline](https://research.ibm.com/blog/ibm-quantum-roadmap-2025) for qubit computers -- again, these are signal + noise qubits, so the actual signal qubit count is substantially less than the number you see, though the extent to which this is the case depends on the specific algorithm.
This is a very rapidly changing field, so these results will likely update year after year.
[Account](https://crypto.stackexchange.com/users/101665/john-targaryen?tab=activity) with Stack Exchange questions/comments.