So you wanna Post-Quantum Ethereum transaction signature

Thanks to Vitalik Buterin, Justin Drake, Renaud Dubois, Marius Van Der Wijden and Zhenfei Zhang for fruitful discussions.

Introduction

2024 will probably be remembered as one of the years marking the acceleration of the quantum computer menace. Google, under its CEO Sundar Pichai, finally unveiled its quantum chip, Willow, via a loud tweet!

Scott Aaronson, one of the most famous quantum experts in the world, has changed his message to people asking whether they should be worried about quantum computers. He shifted from saying

… Maybe, eventually, someone will need to start thinking about migrating from RSA, Diffie-Hellman, and elliptic curve cryptography to lattice-based crypto or other systems that could plausibly withstand quantum attacks,…

to

Yes, unequivocally, worry about this now. Have a plan.

Vitalik has already written about how to hard-fork to save most users’ funds in a quantum emergency. Also, a few days ago, he highlighted in a podcast the four main Ethereum components potentially vulnerable to quantum attacks. They are:

  1. Ethereum transaction signatures (notably using ECDSA)
  2. BLS signatures in consensus
  3. Data Availability Sampling (leveraging KZG commitments)
  4. Verkle trees (if shipped with Bandersnatch)

An attentive reader might have noticed that these four points have something in common—yes, it’s my beloved elliptic curves. Unfortunately, the discrete logarithm problem for elliptic curves (ECDLP) is broken by Shor’s Algorithm, a famous quantum algorithm.

In this short note, we are going to analyze a possible post-quantum replacement for the first point, namely a potential post-quantum Ethereum transaction signature.

Which PQ signature?

Now, a legitimate question is: which post-quantum (PQ) signatures should we use? Fortunately, if we had to choose right now, we wouldn’t need to overthink it too much. Zhenfei Zhang, a former Ethereum Foundation cryptographer, has already written about the NIST Post-Quantum Cryptography Standardization Process. If we analyze the three possible signature choices (two of which leverage lattice-based cryptography), it’s clear (at least for now) that Falcon appears to be the most promising candidate. The verifier’s workload should be roughly the same as for other lattice-based signature schemes (like Dilithium), i.e., dominated by an FFT, but Falcon has a smaller signature size.

Ship it!!!

Now that we’ve ‘settled’ on the signature to use, the next question is: how are we going to ship it? There is a big dichotomy here: one path implies a hard fork, and the other doesn’t. Let’s dig a bit deeper.

The Account Abstraction way

The first approach we will discuss, arguably the most elegant and promising, involves Account Abstraction (AA). It has been advocated by Justin Drake and Vitalik on various occasions.

For people not familiar with it, AA is a proposed improvement to make the Ethereum ecosystem more flexible and user-friendly by changing how transactions and accounts are managed. It shifts certain functionalities traditionally reserved for externally owned accounts (EOAs) into smart contracts, effectively “abstracting” the differences between EOAs and smart contract accounts.
Ethereum developers have introduced various proposals for implementing AA, including ERC-4337. This is a practical solution that achieves AA without requiring a consensus-layer upgrade. It uses a mechanism called User Operation objects and introduces a separate Bundler layer to handle transactions.

Adding Falcon as the Ethereum transaction signature in this scenario means coding a Falcon verifier contract that is responsible for verifying the validity of User Operation objects before they are executed by the Entry Point contract.
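To make the wiring concrete, here is a minimal sketch of such an account. Everything here is hypothetical: the IFalconVerifier interface, the contract names, and the simplified validateUserOp signature are placeholders (ERC-4337’s actual validateUserOp receives the full UserOperation struct and must also handle nonces and fee payment).

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Hypothetical interface: a real deployment would hide the full
// Falcon verification (NTT, norm check) behind something like this.
interface IFalconVerifier {
    function verify(
        bytes calldata publicKey,
        bytes32 messageHash,
        bytes calldata signature
    ) external view returns (bool);
}

// ERC-4337-style account that delegates signature checking to a
// Falcon verifier contract. Sketch only.
contract FalconAccount {
    address public immutable entryPoint;
    IFalconVerifier public immutable verifier;
    bytes public falconPublicKey;

    constructor(address _entryPoint, address _verifier, bytes memory _publicKey) {
        entryPoint = _entryPoint;
        verifier = IFalconVerifier(_verifier);
        falconPublicKey = _publicKey;
    }

    // Returns 0 if the signature is valid, 1 otherwise
    // (the ERC-4337 convention for signature validation failure).
    function validateUserOp(
        bytes32 userOpHash,
        bytes calldata signature
    ) external view returns (uint256 validationData) {
        require(msg.sender == entryPoint, "caller is not the EntryPoint");
        return verifier.verify(falconPublicKey, userOpHash, signature) ? 0 : 1;
    }
}
```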
Now, this may sound like all sunshine and rainbows, but there is at least one substantial underlying issue. Coding Falcon in Solidity might not be the best experience (and it’s probably quite gas-costly). On top of that, there are even nastier problems, such as the fact that Falcon deals with small (roughly 14-bit) values modulo q = 12289, while the EVM natively operates on 256-bit words. The latter is the kind of issue that could be addressed by adding SIMD and EVMMAX to the EVM.
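To illustrate the mismatch, here is a hedged sketch of the kind of bit-packing a Solidity Falcon verifier ends up doing: sixteen 16-bit coefficient slots per 256-bit word. The library name and layout are illustrative, not part of any existing implementation.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Illustrative only: pack sixteen 16-bit coefficient slots into one
// 256-bit EVM word. Each Falcon coefficient lives mod q = 12289, so
// it fits comfortably in 16 bits; the shift/mask gymnastics below
// are exactly what SIMD/EVMMAX-style extensions would make cheap.
library CoeffPacking {
    uint256 internal constant Q = 12289;

    // Read the i-th coefficient out of a packed word.
    function get(uint256 word, uint256 i) internal pure returns (uint256) {
        require(i < 16, "lane out of range");
        return (word >> (16 * i)) & 0xFFFF;
    }

    // Write coefficient c (reduced mod q) into lane i.
    function set(uint256 word, uint256 i, uint256 c) internal pure returns (uint256) {
        require(i < 16 && c < Q, "bad lane or coefficient");
        uint256 mask = uint256(0xFFFF) << (16 * i);
        return (word & ~mask) | (c << (16 * i));
    }
}
```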

  • Pros: It is an elegant and flexible solution.
  • Cons: It is costly in terms of gas consumption.

The hard fork way

The method we discuss here is probably the simplest technically. It is inspired by previous work done by Marius Van Der Wijden and essentially involves introducing a new transaction type signed with Falcon signatures instead of BLS signatures. The biggest problem here is that, by doing so, we are tightly bound (through a new EIP) to a single favored signature scheme.
So, to recap this approach:

  • Pros: Easy to code and fast.
  • Cons: Not future-proof.

Hybrid

A really tempting approach would be to take the best of the two methods above and combine them into a single one. In a nutshell, we could leverage AA in a similar way to RIP-7212, but of course, we would need a new RIP for Falcon. This might provide the time to experiment with the feature in rollups and determine if Falcon is truly the way to go. However, it is important to note that this approach does not solve the original problem of introducing a new signature scheme at the L1 level.
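As a sketch of what consuming such a RIP could look like from Solidity: the precompile address and the calldata/return conventions below are pure placeholders (RIP-7212 uses address 0x100 for P256VERIFY; a Falcon RIP would have to pin down its own address and encoding).

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Placeholder wrapper for a hypothetical FALCON_VERIFY precompile.
// Address and encoding are invented for illustration only.
library FalconPrecompile {
    address internal constant FALCON_VERIFY = address(0x101);

    function verify(
        bytes32 messageHash,
        bytes memory signature,
        bytes memory publicKey
    ) internal view returns (bool) {
        (bool ok, bytes memory out) = FALCON_VERIFY.staticcall(
            abi.encode(messageHash, signature, publicKey)
        );
        // Borrowing RIP-7212's convention: a 32-byte output equal
        // to 1 signals a valid signature.
        return ok && out.length == 32 && abi.decode(out, (uint256)) == 1;
    }
}
```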

  • Pros: Easy to code and fast.
  • Cons: Temporary (does not solve the L1 use case).

Conclusion

The rise of quantum computing demands urgent action to secure Ethereum, particularly its transaction signatures vulnerable to Shor’s Algorithm. Falcon, a lattice-based signature scheme, emerges as a strong candidate due to its efficiency and compact size. Deployment strategies, including Account Abstraction, hard forks, or a hybrid approach, each offer distinct benefits and trade-offs. A careful evaluation is essential to ensure Ethereum remains robust against quantum threats while maintaining scalability and usability.


What do you mean with ‘it is costly in terms of gas consumption’ in the second method (hard fork)?

In any case, replacing ECDSA with Falcon will work for simple transactions, I guess. BLS transactions are nice, but not having them is probably not the end of the world. Do you guys have any clue about how to replace BLS at the consensus level, though? That, to me, seems the hardest problem of the four you listed so far!


Extra data point: the Algorand blockchain has been using Falcon signatures since ~2022.


I am given to understand that Winternitz is being considered as a replacement for BLS in the Beam Chain.


What do you mean with ‘it is costly in terms of gas consumption’ in the second method (hard fork)?

Thanks for your comment, it was indeed a copy-paste issue. Fixed to:

  • Pros: Easy to code and fast.
  • Cons: Not future-proof.

how to replace BLS at the consensus level, though

Yes, but this is going to be discussed elsewhere.


That’s a concise and clear summary.

Another possibility would be to introduce NTT (the name cryptographers give to the FFT specialized to prime fields), rather than a specific algorithm, as a new EIP. The NTT is the core operation both for STARK verifiers and for lattice candidates; what is called “zk-friendly” is in fact the capacity to multiply polynomials efficiently. All the previous reasoning (progressive precompile, hard fork, or RIP) applies equally to this object.

Having a fast and generic NTT would benefit both ZKEVMs and lattice-based signatures.
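To make the object concrete, here is an unoptimized, in-EVM sketch of the operation such a precompile (whether progressive precompile, hard fork, or RIP) would accelerate: an in-place iterative Cooley–Tukey NTT over Z_q. The field q and root of unity omega are left as caller-supplied parameters rather than pinned to any particular scheme.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Reference NTT over Z_q: this loop nest (butterflies plus
// twiddle-factor multiplications) is the workload an NTT precompile
// would run natively. `omega` must be a primitive n-th root of
// unity mod q, with n = a.length a power of two.
library NTT {
    function modexp(uint256 base, uint256 exp, uint256 q) internal pure returns (uint256 r) {
        r = 1;
        base %= q;
        while (exp > 0) {
            if ((exp & 1) == 1) r = mulmod(r, base, q);
            base = mulmod(base, base, q);
            exp >>= 1;
        }
    }

    function ntt(uint256[] memory a, uint256 q, uint256 omega) internal pure returns (uint256[] memory) {
        uint256 n = a.length;
        // Bit-reversal permutation.
        uint256 j = 0;
        for (uint256 i = 1; i < n; i++) {
            uint256 bit = n >> 1;
            while ((j & bit) != 0) {
                j ^= bit;
                bit >>= 1;
            }
            j |= bit;
            if (i < j) (a[i], a[j]) = (a[j], a[i]);
        }
        // Butterfly passes.
        for (uint256 len = 2; len <= n; len <<= 1) {
            uint256 wLen = modexp(omega, n / len, q); // primitive len-th root
            for (uint256 start = 0; start < n; start += len) {
                uint256 w = 1;
                for (uint256 k = 0; k < len / 2; k++) {
                    uint256 u = a[start + k];
                    uint256 v = mulmod(a[start + k + len / 2], w, q);
                    a[start + k] = addmod(u, v, q);
                    a[start + k + len / 2] = addmod(u, q - v, q);
                    w = mulmod(w, wLen, q);
                }
            }
        }
        return a;
    }
}
```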


This is a nice option indeed. I like it. I wonder if EVMMAX could also be an alternative here.

If you were to build a contract to verify ECDSA with only bigint primitives, the gas price would be exorbitant too; the cost of verification is heavily discounted when it is provided natively, as with the ecrecover precompile.

Also, we did deploy a smart contract Lamport wallet using account abstraction; the verification cost is actually very decent for one-time hash-based schemes.
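For readers unfamiliar with the scheme, here is a hedged sketch of the core check such a wallet performs (names and layout are illustrative, not the deployed contract’s actual code): the public key is 256 hash pairs, and the signature reveals one preimage per bit of the message hash.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Minimal Lamport one-time signature check, to give an idea of why
// hash-based verification is cheap on-chain: 256 sha256 calls and
// comparisons, no big-integer math at all.
contract LamportVerifier {
    // publicKey[i][b] commits to the preimage revealed when bit i
    // of the message hash equals b.
    function verify(
        bytes32 messageHash,
        bytes32[2][256] calldata publicKey,
        bytes32[256] calldata signature
    ) external pure returns (bool) {
        for (uint256 i = 0; i < 256; i++) {
            uint256 bit = (uint256(messageHash) >> (255 - i)) & 1;
            if (sha256(abi.encodePacked(signature[i])) != publicKey[i][bit]) {
                return false;
            }
        }
        return true;
    }
}
```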


it’s clear (at least for now) that Falcon appears to be the most promising candidate

How is it clear? Why? To me Sphincs feels safer. Which parts of Ethereum are not suitable for 32 KB sigs (as opposed to 1 KB Falcon sigs)?


To me Sphincs feels safer.

At the end of the day, Falcon is also one of the standards. That said, especially if we follow the AA approach, we do not need to constrain ourselves to a single signature scheme.


Curious how easy it is to prove inside a ZK circuit, mainly because at some point we may want to snarkify the whole of Ethereum. And there, this might become our archenemy, as Keccak was 4 years ago when all the ZKEVM work started.

Also, thanks for the detailed post @asanso <3

Having a fast and generic NTT would benefit both ZKEVMs and lattice-based signatures.

Unsure this is the only bottleneck, although it’s a cool idea. The problem with these opcodes is that:

  1. They should exist within L2s as experimentation, not on mainnet IMO.
  2. These opcodes are hard to price, as they have variadic length and amortize in different ways depending on it. They also depend on the underlying hardware (namely, cores or parallelization capabilities).
  3. One could say: why not FFT? And then we yet again open Pandora’s box.

The SIMD EIP you referred to could actually provide a better speed-up. EVMMAX was built with 256-384-bit ECC constraints in mind. This SIMD proposal actually looks like a ‘RISC-VM-lite’ and might be a good compromise between eWASM/RISC-VM (high complexity but generic) and the specificity of EVMMAX.

It could be rewritten with ARM-like vectorized type notations, which are very intuitive: uintAxB handles B values of size A in parallel. Then any operation opAxB(u1, u2) is the parallel application of the operator “op” to B elements of size A.
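A hedged sketch of what that notation buys, emulated on today’s EVM with a SWAR trick (uint16x16 and add16x16 are the proposal’s hypothetical names, not existing opcodes): because reduced Falcon coefficients stay below 2^15, lane sums stay below 2^16 and no carry crosses a lane boundary.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Emulating a hypothetical uint16x16 (16 lanes of 16 bits in one
// 256-bit word). With every lane holding a reduced Falcon
// coefficient (< q = 12289 < 2^15), a single ADD acts as 16
// parallel additions; a native add16x16 opcode would make this
// pattern first-class instead of a trick.
library Uint16x16 {
    // Lane-wise addition; caller must keep each lane below 2^15.
    function add16x16(uint256 x, uint256 y) internal pure returns (uint256) {
        unchecked { return x + y; } // no carry crosses lane boundaries
    }

    // Extract lane i (0 = least significant 16 bits).
    function lane(uint256 v, uint256 i) internal pure returns (uint16) {
        return uint16(v >> (16 * i));
    }
}
```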


Random walks, if/where possible. Without reading the above context, just planting a seed in your minds.

Even though I agree we should be alert, there is no real urgency in making a decision. Quantum computers are still ten years away, as physicists have been saying ever since Shor published his algorithm back in 1994.

So instead of choosing between Falcon and Sphincs, maybe we should await the results of the additional NIST call for PQ signatures. I find some of the underlying security assumptions there more convincing; in any case, we would have a wider range of options to choose from.


I’ve gone back and forth on this. On one hand, we need to sample elements using a Gaussian distribution, but on the other, this sampling should only occur at witness generation time, so floating-point math shouldn’t be a problem. Actually, if we’re proving execution of an ecsign-type opcode in the EVM, we would have to prove Gaussian sampling :confused:

The lattice constructions should be really easy to prove provided we have wrong-field math. IIUC, they’re all purely (linear) algebraic.

Instead of NTT, I think we should expose accelerated finite-field math as opcodes. We can implement an efficient NTT as long as we have cheap reductions/representations. Most/all cryptographic constructions use modular math at their base.

But I think we can get pretty far without EVMMAX by compiling math directly to Yul, especially with small-field stuff. We can use lazy modular reduction plus the existing MOD opcode to get fairly cheap NTTs, as sketched below. We’ve got 256 bits on every integer for free; we may as well use them.
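A minimal illustration of the lazy-reduction point, assuming a Falcon-sized field (the library name is made up): each product of reduced coefficients is below 2^28, so a 256-bit accumulator can absorb an astronomically long sum before a single final reduction, instead of paying for mulmod/addmod on every term.

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Lazy modular reduction over a small field: accumulate raw
// products (each < 2^28 for q = 12289) and reduce once at the end.
// The 256-bit word gives enormous headroom before overflow.
library LazyDot {
    uint256 internal constant Q = 12289;

    // Dot product of two reduced coefficient vectors mod q,
    // performing exactly one modular reduction.
    function dot(uint256[] memory a, uint256[] memory b) internal pure returns (uint256) {
        require(a.length == b.length, "length mismatch");
        uint256 acc;
        unchecked {
            for (uint256 i = 0; i < a.length; i++) {
                acc += a[i] * b[i]; // safe: terms and sum fit in 256 bits
            }
        }
        return acc % Q;
    }
}
```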

That said, the problem we have with most of this isn’t math/opcode-related; it’s engineering. There’s no cross-platform frontend for expressing/compiling programs in this way. Expressing such programs in Solidity is impossible because control flow dominates the gas cost.


In January, we plan to release an open-source library that implements the verification algorithms of several post-quantum secure digital signature schemes in Solidity. In the accompanying paper, we evaluate the gas costs of these verification algorithms in two modes: on-chain verification, where the entire signature scheme is verified on-chain, and an optimistic mode called naysayer verification. In the latter, only a Merkle commitment \mathsf{com} to the signature \sigma is stored on-chain, and we assume there exists a data availability layer where the full signature is available. Subsequently, so-called naysayers can, before a pre-defined time-out, submit short on-chain proofs showing that the committed signature is incorrect. Proving the incorrectness of these large PQ signatures is typically much faster than verifying the entire signature. In the optimistic case, the on-chain smart contract does not do anything: it just stores the commitment \mathsf{com}, and that storage slot can be overwritten after the challenge period.
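To sketch that flow in Solidity (all names and both internal checks are placeholders; the actual library and paper will pin down the Merkle openings and the scheme-specific fraud checks):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

// Skeleton of the optimistic (naysayer) flow: store only the
// commitment, accept short fraud proofs during a challenge window,
// and treat the signature as valid once the window closes silently.
contract NaysayerSignature {
    bytes32 public com;       // Merkle commitment to the signature
    uint256 public deadline;  // end of the challenge period
    bool public challenged;

    constructor(bytes32 _com, uint256 challengePeriod) {
        com = _com;
        deadline = block.timestamp + challengePeriod;
    }

    // A naysayer opens one committed chunk of the signature and
    // shows it is inconsistent; both checks are stubbed here.
    function naysay(bytes32[] calldata merkleProof, bytes calldata chunk) external {
        require(block.timestamp < deadline, "challenge period over");
        require(_opensCommitment(merkleProof, chunk), "not in commitment");
        require(_chunkIsInvalid(chunk), "chunk verifies fine");
        challenged = true;
    }

    // Valid only if nobody successfully naysaid in time.
    function accepted() external view returns (bool) {
        return !challenged && block.timestamp >= deadline;
    }

    function _opensCommitment(bytes32[] calldata, bytes calldata) private view returns (bool) {
        // Placeholder: verify the Merkle path against `com`.
        return com != bytes32(0);
    }

    function _chunkIsInvalid(bytes calldata) private pure returns (bool) {
        // Placeholder: scheme-specific check, much cheaper than
        // verifying the whole PQ signature.
        return true;
    }
}
```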

Here are some preliminary results, just to give you an idea of the gas costs of verifying PQ signatures on-chain. MAYO is not ready yet, but it seems it will be less efficient than the hash-based signature schemes.
