The road to Post-Quantum Ethereum transactions is paved with Account Abstraction (AA)

Congrats @paulangusbark! Please, both of you, do share those contracts! It’d be nice to standardize those verification contracts in the long run, much as the community uses somewhat standardized OpenZeppelin contract snippets for many common tasks. Presumably PQ signature verification will become just as standard as an ERC20 interface.


Congrats,

It is great to have other implementations (targeting a different security level).

We are very interested in the contracts as well (mainly regarding the core NTT optimizations).

In case it went unnoticed: Falcon-512 and Dilithium have been available here for a few months.
FALCON verification takes as input a precomputed NTT representation of the public key, as mentioned above. FALCON keys are packed into uint256 words; Dilithium keys are stored in an external contract.

The NIST KATs pass successfully, providing confidence in the core part of the algorithm.

For both schemes, a keccak256-based version and a fully NIST-compliant version are provided, along with signers (and also a hardware signer app for Dilithium44). The 128-bit security level is targeted, as it is the common target for Ethereum.

Gas cost for FALCON512 using keccak256 is 2M; Dilithium costs 6.6M.

This can be used to experiment starting today, until precompiles are adopted.

EIP-8052 (DRAFT) diverges from the previous proposal by

  • splitting the FALCON computation into two parts, allowing a more ZK-friendly hash to be adopted for the hash2Point part of the signature, with the ZK endgame in mind
  • taking as input the NTT representation of the public key

This separation is not possible for DILITHIUM, so EIP-8051 simply sticks to the standard.
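For readers unfamiliar with the split: the hash2Point part is Falcon’s hashToPoint, an XOF over salt || message with rejection sampling into Z_q. A minimal Python sketch of that step, following the Falcon specification (not the contracts’ exact code, and using SHAKE256 where the keccak256 variant would substitute a Keccak-based XOF):

```python
import hashlib

Q = 12289   # Falcon modulus
N = 512     # Falcon-512 degree

def hash_to_point(salt: bytes, msg: bytes, n: int = N) -> list[int]:
    """Map salt || msg to a polynomial in Z_q[x]/(x^n + 1) via SHAKE256."""
    # Generously over-squeeze the XOF up front; real code squeezes incrementally.
    buf = hashlib.shake_256(salt + msg).digest(4 * n * 2)
    coeffs = []
    for i in range(0, len(buf), 2):
        t = int.from_bytes(buf[i:i + 2], "big")
        if t < 5 * Q:            # rejection step keeps the output uniform mod Q
            coeffs.append(t % Q)
            if len(coeffs) == n:
                return coeffs
    raise RuntimeError("XOF buffer exhausted (vanishingly unlikely)")
```

Splitting verification at this boundary means the XOF can later be swapped for a ZK-friendly hash without touching the NTT core.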


I’ve created a discord channel with instructions at discord.gg/PUFcQezy

The signature is the salt followed by the signature polynomial encoded as bytes mod q (I didn’t use the encoder/decoder from the Falcon implementation except for loading the public key), so 2068 bytes (no header).

A domain is set on wallet creation and is immutable; the value is used in the message hashing function. Admittedly, I asked ChatGPT to write me a function using just the 32-byte input and was never able to make anything better; I suspect what I have is loosely based on your code, except I added a domain value. I also calculate the point values as I iterate through the last iNTT transformation, to reduce the number of loops.

To save on gas, I calculate half of the norm from the submitted signature, then reuse that memory allotment for the other half of the signature and calculate the second half of the norm. I also used Solidity’s unchecked blocks during the NTT conversions.
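In sketch form, the norm computation looks like this (Python for clarity; the two-pass memory reuse is an EVM detail, and the squared-norm bound β² is the scheme parameter from the Falcon spec, not shown here):

```python
Q = 12289  # Falcon modulus

def centered(c: int) -> int:
    """Lift a coefficient from [0, Q) to the centered range (-Q//2, Q//2]."""
    return c - Q if c > Q // 2 else c

def half_sq_norm(coeffs: list[int]) -> int:
    """Squared Euclidean norm of one half of the signature vector (s1 or s2).
    The contract computes this twice -- once for the submitted half, once for
    the recovered half written into the same memory region -- and accepts iff
    half_sq_norm(s1) + half_sq_norm(s2) <= BETA_SQ."""
    return sum(centered(c) ** 2 for c in coeffs)
```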

I’ll publish it soon. My repository has other contracts I’m not ready to share but I’ll just weed out the relevant parts into a different repository and make that public.

If you are in London, I’ll be at the Ethereum London event at the Encode Club tomorrow.


Thanks for the detailed breakdown and for setting up the Discord channel.

The signature layout (salt || coeffs mod q, 2068 bytes) is a clean approach — avoiding the Falcon header compression makes the verification path much more straightforward for on-chain use. The merged iNTT + partial-norm computation is exactly the kind of optimization that matters at scale.

I’m working on similar verification infrastructure but from the ML-DSA-65 side (FIPS 204). My focus has been on post-quantum proof-of-control for validator recovery and verifiable randomness for L2 sequencers. The domain-separated message construction you described aligns closely with what I’ve been using (validator index + credentials + PQ key hash + domain), so I’m very interested in comparing how our hashing and replay-protection patterns match up.
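Roughly, the construction I’ve been using looks like this (field names, widths, and ordering here are illustrative sketches, not either of our exact schemes; `hashlib.sha3_256` stands in for Ethereum’s keccak256, which uses different padding):

```python
import hashlib

def pq_message_digest(domain: bytes, validator_index: int,
                      credentials: bytes, pq_key_hash: bytes,
                      nonce: int) -> bytes:
    """Hypothetical domain-separated message construction."""
    h = hashlib.sha3_256()
    h.update(domain)                             # immutable, set at creation
    h.update(validator_index.to_bytes(8, "big"))
    h.update(credentials)
    h.update(pq_key_hash)
    h.update(nonce.to_bytes(32, "big"))          # replay protection
    return h.digest()
```

The point of the comparison would be checking that we bind the same fields, in a fixed order, under a caller-specific domain.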

Once you publish the extracted repo, I’d be happy to:

  • contribute ML-DSA-65 test vectors in a compatible format
  • run side-by-side gas comparisons (Falcon-1024 vs ML-DSA-65)
  • help sanity-check domain separation patterns for cross-contract reuse

I’m not in London for the Encode event, but very interested in collaborating remotely. Let me know when the repo is public and I’ll start experimenting.

I’ve published it to GitHub - Cointrol-Limited/QuantumAccount: An implementation of an ERC4337 wallet that uses FIPS 206 (falcon-1024) for signature verification


Great to see the Falcon-1024 implementation! I’m working on a complementary approach using ML-DSA-65 (Dilithium, FIPS 204) for L2 sequencers and AA bundlers.

Your QuantumAccount architecture with Montgomery multiplication pre-transformation and Keccak-based hashing is an excellent reference point. The ~10M gas verification cost you achieved is impressive compared to the initial 40M implementation.

My dual-signature approach (ECDSA + ML-DSA-65) targets similar use cases but with different trade-offs:

  • Signature verification optimizations using NTT domain transformations

  • Hybrid classical/PQ migration path for existing infrastructure

  • Statistical validation (NIST/Dieharder/TestU01 BigCrush)

  • Target latency: ~14ms for off-chain VRF operations

I’d be very interested in:

  1. Gas cost comparison: ML-DSA-65 vs Falcon-1024 verification

  2. Signature size trade-offs: Dilithium (~2.4KB) vs Falcon (~1.3KB)

  3. Montgomery multiplication strategies for both schemes

  4. Hash function choices: Your Keccak approach vs standard SHAKE256

Happy to collaborate on cross-testing or benchmarking.

ML-DSA-65 Ethereum Verification: https://github.com/pipavlo82/ml-dsa-65-ethereum-verification


To clarify, I’m not formally trained in this space. I’m best described as a hobbyist mathematician.

As a slight caveat, I wrote a contract seven years ago that can arguably be used to measure randomness. The address is 0x16FA8DF7F16f9E41B7C5522Cc12a22053A2a776F

It compares the frequency of paired dice rolls against the expected spread of their histogram, assuming an approximately Gaussian distribution. I did it on how often seven was rolled, but it can just as easily be applied to rolls less than five, since that outcome has the same probability, just like rolls greater than nine. And the die doesn’t need six sides.

Technically, it cheats on the value of e, but it can be rewritten as a ratio of e and pi. It is a twist on Buffon’s experiment.

To expand on that: the contract measured the frequency of rolling 7 versus anything else. It has limitations, but I made every roll an event so they could be queried after the fact. There are thousands of rolls there as an initial data set. You can take a sample value, mod-12 it (assuming it’s 2 bytes), and use a second value at random to verify.
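For reference, the equivalence above is exact: with two fair dice, P(sum = 7) = P(sum < 5) = P(sum > 9) = 6/36. A quick enumeration:

```python
from itertools import product
from fractions import Fraction

def sum_prob(pred, sides: int = 6) -> Fraction:
    """Probability that the sum of two fair dice satisfies pred."""
    rolls = list(product(range(1, sides + 1), repeat=2))
    return Fraction(sum(pred(a + b) for a, b in rolls), len(rolls))

p_seven = sum_prob(lambda s: s == 7)   # 6 of 36 outcomes
p_low   = sum_prob(lambda s: s < 5)    # sums 2,3,4: 1+2+3 ways
p_high  = sum_prob(lambda s: s > 9)    # sums 10,11,12: 3+2+1 ways
```

Any of the three events can therefore feed the same histogram test, and as noted, the die doesn’t need six sides.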

That’s a solid approach. Using paired dice rolls with Gaussian distribution to test randomness deviation is mathematically sound - the histogram analysis gives you a measurable metric for quality.

The Buffon’s needle adaptation with e/π ratio is an interesting geometric probability approach, especially converting it to discrete outcomes instead of continuous positions.

The event-based storage is practical - having thousands of rolls queryable means you have an on-chain dataset for testing. The mod 12 + secondary verification approach could work for testing arbitrary byte streams.

I’ve cloned your QuantumAccount repo and I’m reviewing the Montgomery multiplication pre-transformation that brought gas from 40M to ~10M.

One potential application: ML-DSA-65 generates internal nonces during signing. Running those bytes through your mod 12 + histogram test might catch biases that standard NIST/Dieharder suites don’t detect, since your test uses different statistical assumptions.

I’m working on getting my ML-DSA-65 verification contract to comparable gas efficiency - currently exploring NTT domain optimizations. Will share test vectors once I have results worth comparing.

Your 2017 contract shows solid understanding of on-chain statistical verification - useful reference for this kind of work.


ML-DSA-65 NTT Progress Update (Solidity)

Quick update on the ML-DSA-65 verification work.

What’s done

Fully working NTT and INTT for ML-DSA-65 parameters

Correctness confirmed on roundtrip tests (NTT(INTT(x)) = x mod q)

Basis vector tests and randomized vectors pass

Matches the mathematical structure of the ML-DSA/Dilithium reference implementations
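A naive reference transform reproducing the roundtrip check (schoolbook O(n²) in Python, using the standard ML-DSA parameters q = 8380417, ζ = 1753; production code of course uses in-place butterflies):

```python
Q = 8380417   # ML-DSA modulus, q = 2^23 - 2^13 + 1
N = 256
PSI = 1753    # primitive 512th root of unity mod Q (Dilithium's zeta)

def ntt(a: list[int]) -> list[int]:
    """Negacyclic NTT: a_hat[j] = sum_i a[i] * PSI^((2j+1)*i) mod Q."""
    return [sum(a[i] * pow(PSI, (2 * j + 1) * i, Q) for i in range(N)) % Q
            for j in range(N)]

def intt(a_hat: list[int]) -> list[int]:
    """Inverse transform; N^{-1} mod Q rescales the sum."""
    n_inv = pow(N, -1, Q)
    psi_inv = pow(PSI, -1, Q)
    return [n_inv * sum(a_hat[j] * pow(psi_inv, (2 * j + 1) * i, Q)
                        for j in range(N)) % Q
            for i in range(N)]
```

This is only a correctness oracle for the roundtrip property NTT(INTT(x)) = x, not a model of the gas-optimized contract code.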

Current gas numbers

Parsing public key + signature: ~235k gas

One NTT→INTT cycle (256 coefficients): ~2.7M gas

Full ML-DSA-65 verification expected around 7–9M gas
(roughly in line with the Dilithium figure reported earlier in this thread at the same security level)

Montgomery multiplication results

Tested whether Montgomery reduction helps.
It doesn’t. For q ≈ 8.38M (a small NTT-friendly modulus), Montgomery math on the EVM is 2–6× more expensive than native mulmod.
So ML-DSA-65 stays with direct modular arithmetic.
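To illustrate the overhead, here is textbook Montgomery reduction for this modulus, sketched in Python: each `mont_mul` below performs several multiplications, shifts, and a conditional subtract, while a single EVM `mulmod` opcode does the whole job at once, which is consistent with the 2–6× figure above.

```python
Q = 8380417                      # ML-DSA modulus (q = 2^23 - 2^13 + 1)
R = 1 << 32                      # Montgomery radix
Q_INV = (-pow(Q, -1, R)) % R     # -q^{-1} mod R
R2 = R * R % Q                   # R^2 mod Q, converts into Montgomery form

def redc(t: int) -> int:
    """Montgomery reduction: returns t * R^{-1} mod Q for 0 <= t < Q * R."""
    m = (t * Q_INV) % R
    u = (t + m * Q) >> 32        # exact division by R
    return u - Q if u >= Q else u

def mont_mul(a: int, b: int) -> int:
    """a * b mod Q via Montgomery form (several ops vs one EVM mulmod)."""
    a_m = redc(a * R2)           # a * R mod Q
    b_m = redc(b * R2)           # b * R mod Q
    c_m = redc(a_m * b_m)        # a * b * R mod Q
    return redc(c_m)             # a * b mod Q
```

Montgomery form pays off when conversions amortize over many native-word multiplications; with the EVM’s 256-bit `mulmod` and a 23-bit modulus, there is nothing to amortize.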

Real optimization targets

These are the areas that actually move gas down:

Precomputing NTT(publicKey) (saves ~400k)

In-place NTT/INTT operations (~200k)

Reducing calldata→memory copying (~150–250k)

Memory layout cleanup and loop tightening (~50–100k)

A realistic target after optimization: 5–6M gas for full verification.

Next steps

Implement polynomial arithmetic

SHAKE256 challenge

Norm checks

Full verify() pipeline

Begin aligning with unified PQ verification interfaces (ML-DSA, Dilithium, Falcon)

Repo: GitHub - pipavlo82/ml-dsa-65-ethereum-verification

@seresistvanandras Completely agree — a shared, standardized layer for PQ
verification would benefit the ecosystem enormously.

I’ve been coordinating with @paulangusbark and @rdubois-crypto, and there’s
clear alignment on the need for:

• a unified IPQVerifier interface that works across Falcon, Dilithium and ML-DSA
• standardized calldata layouts per algorithm
• a NIST KAT–compatible test vector format
• a shared gas benchmarking methodology

Very similar to how OpenZeppelin helped solidify ERC patterns, having
composable PQ verification primitives would lower integration friction for
protocol teams, wallets and AA infrastructure.

Current status:
I already have an initial IPQVerifier interface draft and ML-DSA-65 test
vectors, and both @paulangusbark and @rdubois-crypto expressed interest in
aligning implementations.

Proposed next step:
I can prepare a standardization proposal for community review within 1–2 weeks,
covering:

  1. Abstract interface specification
  2. Per-algorithm calldata formats
  3. Reference test vector schema
  4. Gas benchmarking framework

Would you be willing to provide technical review before broader publication?
Having EF validation early would significantly strengthen the proposal.

Repo: GitHub - pipavlo82/ml-dsa-65-ethereum-verification

Well, for the IPQVerifier we planned to follow the OpenZeppelin ERC-7913 IVerifier; parameter verification shall be performed internally.

Concerning the Keccak hashing, a PRNG was designed by Zhenfei (FALCON co-author) during our collaboration with the EF. It is roughly a CTR-mode construction with Keccak as the central permutation.
If you need SHAKE, it is available in our repo; it is expensive, but that cost will vanish once the EIP is adopted.


@rdubois-crypto Got it, makes sense to just use ERC7913 IVerifier
directly. I’ll drop the separate IPQVerifier idea and keep the ML-DSA-65–specific bits internal so we don’t fragment interfaces.

Re: SHAKE256 – I do need FIPS-204 paths for ExpandA and poly_challenge, so I’d much rather reuse what you and Zhenfei already built than reinvent the Keccak/XOF stack. Could you point me to the contracts you consider “canonical” for:

  • the Keccak-CTR PRNG you mentioned

  • the SHAKE / XOF instance you expect to keep once the EIP lands

I’ll wire the ML-DSA-65 verifier directly to those so we stay aligned.

On the side I’m also working on a small “gas per secure bit” comparison (ECDSA baseline vs Falcon / Dilithium / ML-DSA-65, both L1 and a few L2s). Not trying to rank schemes, just to give protocol / wallet teams a normalized cost metric and some harder numbers to back the precompile discussion. Once the ML-DSA-65 pipeline settles and the numbers stabilize, I’ll share them here so they can feed into the broader standardization work.
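As a toy illustration of the metric, using only numbers already quoted in this thread (placeholders for the real benchmarks; the 3,000-gas ecrecover precompile serves as the classical baseline, and normalizing classical vs post-quantum security bits is itself debatable):

```python
# (gas cost, claimed security bits) -- rough figures from this thread only
SCHEMES = {
    "ecdsa-secp256k1 (ecrecover)":  (3_000, 128),
    "falcon-512 (keccak variant)":  (2_000_000, 128),
    "dilithium (keccak variant)":   (6_600_000, 128),
}

def gas_per_secure_bit(gas: int, bits: int) -> float:
    """Normalized cost metric: gas spent per claimed bit of security."""
    return gas / bits

for name, (gas, bits) in SCHEMES.items():
    print(f"{name}: {gas_per_secure_bit(gas, bits):,.1f} gas/bit")
```

The real comparison will use measured numbers per scheme and per chain; this just pins down the arithmetic.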

Repo for context: https://github.com/pipavlo82/ml-dsa-65-ethereum-verification