On Block Sizes, Gas Limits and Scalability

Very nice summary and post, @Nero_eth.

I’m trying to get caught up on the recent discussions surrounding gas limits and fees.

For background, I’ve been working on creating a “CPI Index” for Ethereum fees. Part of this requires bucketing transactions into categories based on their method IDs, then tracking the frequency of those transaction types over time to compile a basket of typical transactions.

From there, we build a weighted basket representing a “normal” TX over time, in terms of gas usage and average gas cost.
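To make the bucketing step concrete, here is a minimal sketch of the weighting idea. The transactions, method IDs, and the specific index formula (count-weighted average fee per bucket) are illustrative assumptions, not our actual methodology:

```python
from collections import Counter, defaultdict

# Hypothetical sample transactions: (method_id, gas_used, gas_price_gwei).
# In practice these would come from a node or an indexer over a time window.
txs = [
    ("0xa9059cbb", 52_000, 20),   # ERC-20 transfer
    ("0xa9059cbb", 51_500, 25),
    ("0x38ed1739", 180_000, 30),  # DEX swap
    ("0x", 21_000, 18),           # plain ETH transfer
    ("0x", 21_000, 22),
]

def weighted_basket(txs):
    """Weight each method-ID bucket by its share of transaction count,
    then combine each bucket's average fee into one index value."""
    counts = Counter(m for m, _, _ in txs)
    total = sum(counts.values())
    fees = defaultdict(list)
    for m, gas, price in txs:
        fees[m].append(gas * price)  # gas * price = fee in gwei
    index = 0.0
    for m, n in counts.items():
        weight = n / total
        avg_fee = sum(fees[m]) / len(fees[m])
        index += weight * avg_fee
    return index

print(round(weighted_basket(txs)))  # → 1713500
```

Re-running this over successive time windows gives a fee series whose ratio to a base period plays the role a CPI plays for consumer prices.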

Our v1 surfaced some interesting insights, which I’ve written about on espresso(.)jlabsdigital(.)com (in a post called Ethereum’s Great Depression).

What becomes apparent when doing this is that gas limits tend to get raised (or blob capacity added) when congestion is high. This reduction in the cost to transact has economic spillover effects that, in turn, create deflationary conditions as that congestion subsides. I don’t believe there has been enough attention on how this has impacted Ethereum, especially as it looks to push some usage/users to L2s.

What I’m working up to here is a question: has there been exploration of making the gas target more dynamic, allowing it to drop below 15 million?

The goal would be to create better price stability across the ecosystem.

Additionally, there appears to be heightened price volatility to the upside for a variety of reasons. Recently we saw gas costs jump from sub-10 gwei to 112 gwei in a day. This is a lot of instability/volatility in the gas market, which hinders app development and revenue forecasting, and prevents builders from operating in a predictable environment.
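A jump of that size is consistent with the EIP-1559 base fee mechanics: under sustained demand, each full block compounds the base fee by up to 12.5%. A simplified sketch of the update rule (the spec additionally clamps the upward delta to at least 1 wei; the 10→112 gwei figures are just the ones from the example above):

```python
def next_base_fee(base_fee, gas_used, gas_target, denominator=8):
    """EIP-1559 base fee update: moves the fee toward usage relative to
    target, by at most 1/denominator (12.5%) per block."""
    delta = base_fee * (gas_used - gas_target) // (gas_target * denominator)
    return base_fee + delta

# Sustained full blocks (gas_used = 2 * target) compound the fee +12.5%/block.
fee = 10_000_000_000  # 10 gwei, in wei
target = 15_000_000
blocks = 0
while fee < 112_000_000_000:  # 112 gwei
    fee = next_base_fee(fee, 2 * target, target)
    blocks += 1
print(blocks)  # → 21 full blocks, ≈ 4.2 minutes at 12 s slots
```

So while a 10x move within a day is dramatic, the mechanism permits it within minutes of sustained congestion, which is exactly the volatility concern above.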

If the goal for Ethereum is to maximize usage and production on its platform, it needs greater price stability. Without it, you lose users and builders.

So my question is how to maximize this through the fee market, with the goal of creating a CPI index for Ethereum, and ultimately proposing a solution that also respects the need to combat DDoS attacks on the network.

I’m still catching up on the most recent discussions, so any help pointing me in the right direction would be greatly appreciated. Thanks

The reasoning is extremely similar to the argument for keeping Bitcoin’s block size limit at 1 MB: large corporations have access to far more capital than individuals or small businesses, and therefore can run more expensive, faster hardware that handles more data/transactions. This presents a centralization risk.
