MEV Auctions Will Kill Ethereum

At SKALE we are using 2/3-N-threshold encryption to provably remove MEV.

A transaction is TE-encrypted on submission. It is then committed to the chain, decrypted by a committee and then processed.

5 Likes

I was recently reviewing the price-impact literature from economics for another reason, and there are plenty of articles about different market irregularities. Unfortunately, constant-function market makers exhibit almost all of them. For example, a 30-second search turns up this article among many others. First of all, our models have no transient price impacts. Moreover, we also have no temporary price impacts. And finally, none of our functions are convex, which is a real surprise to me, as I usually pick a random economics question when designing convex optimization problems for my final exams.

I am not saying that constant-function market makers are inherently a bad idea. I am just saying that we have at least a century of experience with how different price-impact models can safeguard against front-running, or at least ameliorate its undesired effects. And what I see here is the wonderful innovators of the Ethereum community reinventing that wheel little by little.

In summary, I totally disagree with the sentence, “we also observe front-running in traditional finance, so it is inevitable!” We might be able to cut MEV down to a negligible fraction before worrying about how to deal with the unavoidable remainder.

3 Likes

Ah so you worked on submarine sends. Nice one- it’s a great idea! It was the solution I was reaching for when I first looked at this stuff and I was very cheered when I found you’d done it. But yes, I get that people like to do things in one block, apparently whatever the cost.

On that point, do you or any of your geth guys know… how easy would it be to add a msg.IsFinalTxn global boolean to the EVM?

If you could tell that you are the final transaction for the executing contract address, you could trigger a simultaneous, MEV-free calculation for all transactions in the block without needing an n+1th block.

You could then fix all this stuff definitively at app level.
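To make the idea concrete, here is a rough Python sketch of what such an app-level fix could look like. Everything here is hypothetical — the `is_final_txn` flag stands in for the proposed `msg.IsFinalTxn`, and the class and method names are illustrative, not any existing contract. The AMM queues swaps during the block and clears them all at a single price when the final transaction fires, so intra-block ordering confers no advantage:

```python
# Hypothetical sketch: if the EVM exposed a msg.IsFinalTxn flag, an AMM could
# queue swaps during a block and settle them all at once on the last call,
# so intra-block ordering confers no advantage. All names are illustrative.

class BatchedCPAMM:
    def __init__(self, x, y):
        self.x, self.y = x, y        # constant-product reserves
        self.pending = []            # (trader, amount_x_in) queued this block

    def submit_swap(self, trader, amount_x_in, is_final_txn):
        self.pending.append((trader, amount_x_in))
        if is_final_txn:             # stand-in for the proposed msg.IsFinalTxn
            return self._settle()

    def _settle(self):
        # Net all queued buys into one trade against x*y=k, then fill everyone
        # pro rata at the same effective price: ordering gains nothing.
        total_in = sum(a for _, a in self.pending)
        k = self.x * self.y
        out = self.y - k / (self.x + total_in)
        fills = {t: out * a / total_in for t, a in self.pending}
        self.x += total_in
        self.y -= out
        self.pending = []
        return fills

amm = BatchedCPAMM(1_000.0, 1_000.0)
amm.submit_swap("alice", 10.0, is_final_txn=False)
fills = amm.submit_swap("bob", 30.0, is_final_txn=True)
# Both traders receive the same per-unit price regardless of submission order.
```

Since every trader in the batch pays the same clearing price, there is nothing to gain by front-running within the block.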

I will definitely check out the roast- thanks for the link.

Really, the more I think about this… once MEVA is released, all Binance Coin has to do is announce that they will not extract MEV from their users, and people will move over to them in droves.

They can run an explosive ad campaign using MEV-Inspect data to show what a better deal they are for their users- and they’ll be right! What a massive own goal for Ethereum.

Plus, what do you think the SEC is going to make of our legitimizing front-running? They’ve just about calmed down about ETH being a security. Why poke that particular hornet’s nest by building violations of their laws into the protocol?

Decentralization takes a lot of work- it’s expensive. As soon as a decentralized solution becomes less fair than a centralized competitor as well as more expensive, it’s all over.

Thanks for the links. Is the general argument that the constant product is inferior to say an order book because price impacts are by definition permanent with constant product because each trade changes the balance of the pool permanently?

I actually really like the ideas in Uniswap v3 in terms of concentrated liquidity and range trades- a sort of order book of constant products. It seems genuinely innovative. But again, they’ve done nothing to address MEV and the transaction rate will be far higher as people move their ranges around.

I think we have to make a distinction between white-hat arbing and black-hat arbing. White-hat would be where a price is moved back to fair value by arbers. That’s market efficiency. Black-hat would be sandwiching your victim’s order between trades such that you extract money from them due to an unfixed exploit in the network and completely irrespective of fair value. A lot of the literature won’t be considering this situation (yet) because it is so extreme and simply hasn’t existed risk-free until Ethereum- which does not reflect well on Ethereum.
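To see how extreme the black-hat case is, here is a toy fee-less constant-product pool in Python (numbers are made up, purely for illustration) showing a sandwich extracting value from the victim purely through ordering, irrespective of fair value:

```python
# A minimal fee-less constant-product (x*y=k) pool, showing how a sandwich
# extracts value from a victim purely through transaction ordering.

def swap_x_for_y(x, y, dx):
    """Trade dx of token X into the pool; return (new_x, new_y, dy_out)."""
    k = x * y
    new_x = x + dx
    return new_x, k / new_x, y - k / new_x

def swap_y_for_x(x, y, dy):
    """Trade dy of token Y into the pool; return (new_x, new_y, dx_out)."""
    k = x * y
    new_y = y + dy
    return k / new_y, new_y, x - k / new_y

x, y = 1_000.0, 1_000.0

# 1. Attacker front-runs: buys Y with 100 X, pushing the price up.
x, y, atk_y = swap_x_for_y(x, y, 100.0)
# 2. Victim's trade executes at the worsened price.
x, y, victim_y = swap_x_for_y(x, y, 100.0)
# 3. Attacker back-runs: sells the Y straight back for X.
x, y, atk_x_back = swap_y_for_x(x, y, atk_y)

profit = atk_x_back - 100.0
print(f"attacker profit: {profit:.2f} X, victim received: {victim_y:.2f} Y")
```

Without the sandwich the victim would have received about 90.9 Y for their 100 X; sandwiched, they receive roughly 75.8 Y, and the difference (minus price movement) lands in the attacker's pocket.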

4 Likes

I’m not suggesting removing permanent price impacts, but rather adding the other types of price impact as well. I am not an expert in this field; I am just trying to copy what is used in traditional markets. Still, here is my (admittedly limited) intuition about why it might help.

For a simple example, if you can reorder the transactions in an AMM, you can break an adversarial large order into smaller ones and interlace them with opposite matching orders to pay almost zero slippage. This slicing trick is not new; it also exists in traditional portfolio execution. Being able to reorder transactions is obviously an instance of MEV. But the fact that this reordering can bring slippage down to zero, because you pay no temporary or transient price impact at all, is currently exclusive to AMMs.
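Here is a tiny Python check of that last claim: in a fee-less constant-product pool the price is a pure function of the reserves, so a trade and its immediate reversal cost exactly nothing — there is no temporary or transient impact component to pay:

```python
# In a fee-less constant-product pool the price depends only on the reserves,
# so a round-trip trade is free: there is no temporary or transient price
# impact that a reorderer would have to pay.

def swap(a, b, da):
    """Trade da of the first asset into an a*b=k pool; return (new_a, new_b, db_out)."""
    k = a * b
    return a + da, k / (a + da), b - k / (a + da)

x0, y0 = 1_000.0, 1_000.0
x1, y1, dy = swap(x0, y0, 250.0)      # buy Y with 250 X
y2, x2, dx_back = swap(y1, x1, dy)    # sell the Y straight back

print(dx_back)  # exactly 250.0 — the round trip costs nothing
```

In a market with temporary or transient impact, the reversal would execute while the price is still displaced and the round trip would lose money, which is exactly the friction the slicing trick would then have to pay.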

If we had some form of those price impacts, the profit margin for this example of MEV would at least be reduced. That is my whole point! Yes, front-running exists in traditional finance as well, but here it seems far more profitable. We might want to cut down the profit margin with these simple measures before doing something more extraordinary like MEVA.

Interesting post! Just want to quickly disambiguate two concepts:

  • MEV Auctions #1 – Auction off the right to sequence transactions.
    • I’ll call these "Sequencer Auctions" for this response. This is the topic of the post you linked to.
  • MEV Auctions #2 – MEV-Geth, where traders can express more complex transaction inclusion preferences than a simple gas price auction (GPA).
    • I’ll call this MEV-Geth (just for this post). This is an awesome project by Flashbots and is being used by L1 PoW miners.

Sequencer Auctions are simply the auction mechanism for selecting who is the next sequencer. In fact there is a bit of a ‘sequencer auction’ in Ethereum L1 – it’s the PoW which determines the next block proposer. The insight with sequencer auctions is that instead of selecting block proposers by burning energy, sequencers can instead pay for the privilege of selecting contents of the next block.

Sequencer Auctions can and should be coupled with things like submarine sends/time-lock encryption for reducing total possible extractable MEV. However, for any remaining MEV (like slow market arbitrage), we can at least give the profits back to users instead of distributing them to random sequencers (aka miners).

As for MEV-Geth, MEV-Geth is software that any individual sequencer (or block producer) would run to extract a block’s MEV in a standard way that allows traders to be involved & express their preferences.


So one way to think about this is that there are 2 auctions that occur which are fundamental to the blockchain designs we’ve seen so far:

  1. the auction for being the next block producer (sequencer auction), and
  2. the auction for what gets included in the next block (GPA/MEV-Geth).

The term MEV Auction has unfortunately been used for both of these distinct functions. I guess that’s the downside of using super-general terms like MEV Auction! Whoops!

3 Likes

MEV evolution naturally leads to the block proposer and the MEV extractor becoming the same entity. Small ETH2 block proposers will not have the computational resources for MEV extraction. Ultimately the largest staking pool/MEV extractor will be the most profitable.

This, by the way, will burn huge amounts of electricity, since MEV extractors will run neural networks on graphics cards. The funny thing is that PoS may become as eco-unfriendly as PoW.

The largest MEV extractor will naturally be the most profitable one.

Then, for the largest MEV extractor, it does not make sense to run an auction, since the MEV extraction value will exceed the auction profit (because the largest extractor is the smartest and has the most computational resources).

Even if auctions are introduced, the block proposer/MEV extractor can trivially fake them and censor out undesired parties as people above mentioned.

This brings us back to threshold encryption on a blockchain as a way better MEV elimination mechanism.

Thanks Karl, that is very helpful to point out. I have conflated the two projects in my discussions and I will distinguish between them from now on where relevant.

I am extremely pleased to hear that we are attempting to reduce MEV in the sequencer and to only use auctions for the remainder. Are we doing this with the block producers too? Our aim must be to make MEV so low that it’s not worth bidding for in an auction.

Here are a few ideas to chew over:

Our problems with MEV are because Ethereum is not fully decentralized.

Block structure is fully decentralized. Blocks are proposed and validated by consensus across tens of thousands of nodes.

Block content is created by a centralized authority (miner/validator).

In short, block content is not trustless.

There is a historical reason for this. The Ethereum devs had their hands full in the run up to genesis. Creating the world’s first and best blockchain smart contract network was a massive deal and rightly took all of their stretched resources to complete.

As a result the consensus mechanism had to be largely borrowed from Bitcoin. In Bitcoin the transaction order within a block is irrelevant. Transaction censorship isn’t really a problem either, just an inconvenience. As the MEV analysis shows, this is very much not the case with smart contracts. It was understandable at the time but it’s 6 years later now and we’ve seen the harm it causes.

Addressing the hidden centralization in block content creation is where I feel our energies should be directed. I would love to see all these sharp minds getting stuck into this problem.

5 Likes

Words of wisdom!!

Blocks are proposed and validated by consensus across tens of thousands of nodes.

Oops … why do you think there are thousands of nodes? For PoW, pools control proposals (I think there are roughly 10–20 of them).

1 Like

The node count is >10000. I get that the proposer count is far less.

But my point is whatever deficiencies the structural layer of the consensus may or may not have, it’s a lot stronger than the content layer which is… non-existent.

This is a technical problem. We need developers to fix this problem, not the market. There’s a market for stolen credit card details. Perhaps it’s wonderfully efficient and a great example of the free interplay of supply and demand. But it never should have had the opportunity to exist because, like MEV, it is the product of an exploit that never should have happened.

Just saw this (it isn’t me btw)

2 Likes

It’s a good idea, but you can do it more simply and efficiently than that with a simultaneous constant product calculation. I’ll have more to say on application layer MEV fixes like this soon (which will include a model for this). For now I am concentrating on a content layer fix.

I have an admission to make to Flashbots @thegostep. I now understand that as a short term fix, MEV-Geth reduces gas prices and transaction bloat etc, and I agree that right now on mainnet it is net positive and a force for good. My apologies for lumping it in with eth2/rollups MEV auctions.

My fears lie in organized MEV auctions/extraction continuing into eth2 and rollups.

What I feel we must avoid is fostering the same culture of entitlement with validators/sequencers that we currently have with miners.

On Monday I will post my ideas for fixing MEV in the content layer. It is sadly too late to apply them to mainnet, because it is so against miners’ interests to adopt them that it would likely create a fork and destabilize the network.

But we get a clean slate with validators/sequencers…

We need to start putting out the idea that as a validator/sequencer you will not be entitled to (or even be able to) exploit users for MEV the way that miners currently can, and that this is for the long term good of the network. Because it is!

Traditional finance is never going to move over to Ethereum in a serious way while it is as exploitable as it is, and if they do, it will be for the wrong reasons- because they want to exploit it themselves!

Anyway more from me on Monday, in what will be a far more upbeat thread.

I am not suggesting copying every notion from traditional finance. I am talking about the specific notion of slippage, and slippage is not designed so that people can exploit other people. The reason we don’t have it in DeFi is not that we don’t want to exploit our users; it is that we want to provide a better UX for greedy users! I talked about one instance in my previous comment. Another is that in Uniswap v3, if all those greedy users concentrate their liquidity on a very short interval, a whale can buy the entire reserve of one token in a trading pair with almost zero slippage!
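A rough back-of-the-envelope sketch of that v3 scenario in Python (simplified sqrt-price math with illustrative numbers, not the production fixed-point formulas): when all liquidity is crowded into a 0.1%-wide range, buying the entire token0 reserve in that range moves the price by at most 0.1%, however large the notional is:

```python
import math

# Simplified Uniswap-v3-style math: liquidity L concentrated on [P, p_hi].
# The token0 available in the range is x = L * (1/sqrt(P) - 1/sqrt(p_hi)),
# and buying all of it moves the price exactly from P to p_hi. If the range
# is very tight, that move (the slippage) is tiny no matter how large x is.

def token0_in_range(L, P, p_hi):
    """Token0 held by in-range liquidity L between current price P and p_hi."""
    return L * (1.0 / math.sqrt(P) - 1.0 / math.sqrt(p_hi))

P = 100.0                 # current price (token1 per token0)
p_hi = P * 1.001          # LPs crowded into a 0.1%-wide range
L = 10_000_000.0          # large concentrated liquidity

x = token0_in_range(L, P, p_hi)
price_move = p_hi / P - 1.0
print(f"whale can buy {x:.1f} token0 while moving the price only {price_move:.3%}")
```

The point is that `x` scales linearly with `L` while the maximum price move stays pinned at the range width, so a whale draining the whole in-range reserve pays essentially no slippage.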

I’ve enjoyed reading the conversation here over the past few weeks and i agree with the threat that built-in extractable value poses. I’m surprised no one is talking about chainlink’s proposal to abstract sequencing into an oracle layer. Seems like a pretty novel approach to me, but i am justsomelurker, and now i will return to my shadows :slight_smile:

pg 48 a5511b75-559d-441c-8142-2b5226a9e332.pdf

1 Like

Thank you for emerging from the shadows to contribute a solution @justsomelurker. :wink:

The mempool-based FSS could be a good route. The oracle nodes should be incentivized to spread out geographically as much as possible so that no PoP has any particular advantage. I don’t understand how you force miners to respect the consensus, though.

One problem with fair ordering as you’ve described it which I think is missed in the academic literature is this:

Imagine a juicy Uniswap txn A enters the mempool that everybody wants to front-run (sadly it is also a year’s salary for the victim).

We have to assume oracle nodes are as self-interested as miners. Each oracle node adds a transaction in front of A to front-run it, and sends its view of the market.

Let’s keep it simple and have 4 oracle nodes that all want a piece of the action, so they each insert their own transaction (B, C, D, and E respectively) in front of A:

BA
CA
DA
EA

Now they don’t agree on what txn 1 should be, but they do all agree that A came in later and should be txn 2. Except that it didn’t, and it shouldn’t. It’s irrelevant to A’s bad outcome who is in front of him, only that someone is.
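Tallying the four views makes the failure mode explicit — a quick Python sketch of nothing more than the example above:

```python
from collections import Counter

# The four oracle views from the example: each node front-runs A with its
# own transaction, then reports its (self-serving) view of the order.
views = [["B", "A"], ["C", "A"], ["D", "A"], ["E", "A"]]

# Tally which transaction each node reports in each slot.
slot_votes = [Counter(view[i] for view in views) for i in range(2)]

print("slot 1 votes:", dict(slot_votes[0]))  # B/C/D/E: one vote each, no consensus
print("slot 2 votes:", dict(slot_votes[1]))  # A: unanimous

# There is no consensus at all on who goes first, yet unanimous (and wrong)
# consensus that the victim A goes second.
```

Any aggregation that trusts agreement will happily conclude A belongs in slot 2, even though in reality A arrived first and every vote placing it second is self-interested.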

If you try to fix it by weighting the consensus so that the more agreement you get about the position of a transaction the earlier it goes, you get crazy effects like a transaction that everyone agrees is last printing first in the block.

Don’t get me wrong. What you are proposing is light-years better than the literal worst-case scenario of total miner/validator/sequencer dominance leveraged by MEV auctions that we have now, and I would like to see it replace that in the absence of other options.

But content consensus must not be any more optional than structural consensus and averaging out different views of transaction order to achieve transaction fairness is more problematic than people realize. You’ll get my take on Monday.

2 Likes

Thank you for the reply. For the record i am not associated with the authors in any way, i just like lurking here, read the paper, and found it relevant to the discussion.

Let’s say there is a juicy tx from Uniswap. If an attacker wants to extract value by reordering txs, the attacker would need a statistically significant number of nodes to report identical orderings; otherwise it will not pass the aggregation validations, and the nodes will be recognized as outliers, subsequently booted, and their staked funds lost. With the number of oracles in a given oracle network, nefarious self-interested behavior is nearly impossible, plus there are large rewards for identifying bad actors.

Even if a majority of oracles were somehow able to identify one another and coordinate re-ordering for value extraction, the paper proposes that the total value required to bribe a majority of nodes (with an amount greater than the value staked) is significantly greater than the value available in the reordering.

Sorry if i’m missing something, thanks for the reply. Cheers.

At SKALE we are using 2/3-N-threshold encryption to provably remove MEV.

I feel like this is an underrated comment. @kladkogex I’d be curious for more details on how this works.

As I see it, the problems with MEV are mostly related to mempool transaction privacy. MEV searchers require opportunity protection from other searchers and users require exploitation protection from searchers. If transactions can be encrypted until they’re finalized, MEV would be limited to keeper-related transactions. This leaves searcher-operators who would be able to extract more value than a typical operator, but I’m not sure how you’d solve that or if you’d need to.

1 Like

I agree. Alex improves the MEV situation greatly, but the tx rate may go up due to statistical front-running battles.

Here are some very very early thoughts on an encrypted mempool version (Dark Alex) if you are interested.

Also @samueldashadrach and @Nickoshi you’ve had some good ideas on encrypted txs if you want to take a look. And anyone else…

Dark Alex - An Encrypted Content Layer Protocol (Under Construction)

Thank you Tristan :slight_smile:

We are working hard to make it easy to use. Basically you will mark one of your Solidity arguments as encrypted, and then it will be sent encrypted by the client, included into the block proposal in encrypted form, and only decrypted after the proposal is committed as a block.

The implementation on SKALE production network should be ready by this summer.

Started reading up on Alex and Dark Alex, and I have some thoughts I want to share:

  1. I don’t think shuffling is enough. MEV searchers need mempool privacy for keeper opportunities. They will get it one way or another, most likely through deals with large pools, which has a centralizing effect.
  2. If you have mempool privacy until the transaction order is established, I don’t think you even need shuffling. The one exception to this would be if we desired to stop proposers from placing their encrypted transactions at the beginning of a block, but I don’t think that’s a bad thing. I think it might be a good thing if we formalized a good keeper design as one that sent the incentives to the coinbase… but that’s a slightly different topic.
  3. Dark Alex suggests that encrypted transactions would hide gas prices. What if we only encrypt the sensitive parts of a transaction, and allow the encrypted transaction to be valid enough to waste gas even if it’s never decrypted?

With these points in mind, I think Dark Alex could be simplified. I’m just brainstorming here, but what if it worked more like this:

  • When creating a transaction, the sensitive parts are encrypted. The encryption key is chunked and split between some selection of validators using their pubkey such that it satisfies threshold encryption honesty assumptions. All of this is included, chunks and which validator they belong to, in one transaction and sent out.
  • Blocks are expanded to include both decrypted transactions and a new block draft with all encrypted transactions. It’s the block proposer’s responsibility to take the draft block from the last block that was added to the chain and decrypt it by requesting key chunks from all the validators who were selected in the transaction. Additionally, the proposer picks new encrypted transactions and creates a new draft block for the next block proposer to decrypt. From there, the block is formalized and the transaction order is attested to by comparing to the draft.

In short, the “shuffler” is replaced by encryption, the “picker” becomes the last block producer, the “printer” becomes the current block producer, and the transaction submitter chooses the “vaults” (maybe just default to the last N block proposers who didn’t miss, although I don’t think there’s an issue with a transaction submitter selecting them by custom means).
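For illustration, here is a toy Shamir secret-sharing sketch in Python of the key-chunking step — splitting a transaction's encryption key among validators so that any 2/3 can reconstruct it. This is illustrative only: a real deployment would use a DKG and proper threshold decryption rather than ever reassembling the key in one place.

```python
import random

# Toy Shamir secret sharing over a prime field: split a symmetric encryption
# key among n validators so that any t of them can recover it. Illustrative
# only; production threshold-encryption schemes never reconstruct the raw key.
P = 2**127 - 1  # a Mersenne prime, large enough for a ~128-bit key

def split_key(secret, n, t):
    """Return n shares (x, y); any t of them recover `secret`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(i, sum(c * pow(i, e, P) for e, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def recover_key(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = random.randrange(P)
shares = split_key(key, n=9, t=6)                   # 2/3 of 9 validators
assert recover_key(random.sample(shares, 6)) == key  # any 6 shares suffice
assert recover_key(shares[:5]) != key                # below threshold: garbage
```

The transaction would carry the shares (each one encrypted to its validator's pubkey), and the next proposer would collect a threshold of them to decrypt the draft block, matching the flow in the bullets above.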

1 Like

Yes!

We have an implementation of Threshold Encryption that is Solidity compatible.

It is in beta now, we are looking for people willing to contribute to the project (can issue SKL grants)

1 Like