Simplified Active Validator Cap and Rotation Proposal

I suppose it depends on how you are looking at variance. You appear to be looking at it from the point of view of all validators over a relatively short time period; I'm looking at it from the point of view of a single validator over its lifetime. So if there were a queue, then once a validator becomes active its variance would basically be 0. Ultimately, those who validate will be more interested in the variance of their own validator(s) than in that of the network as a whole.

Yep, but this is at least an informed decision. At the point in time someone considers becoming a validator, they can look at the active set, the queue, and the rate of exit, and make a call as to whether they want to enter the queue. The current proposal is a rewrite of the rewards for all validators, which would be tough to accept before at least the option of withdrawal is available.


I guess a better word than “variance” is “fairness”. Some validators getting much more and others getting much less due to random chance is bad; some validators getting more and others getting zero because of who came first is also bad (even worse, imo!).

The current proposal is a rewrite of the rewards for all validators, which would be tough to take before at least the option of withdrawal were available.

To be clear, I’m not proposing this before the option of withdrawals; in fact I don’t think the current roadmap (Altair, then full-speed-ahead to the merge, then withdrawals and other beacon chain hard forks) leaves open any possibility of doing this before withdrawals.

  1. Low long-term variance in validator income

Is this meant as a claim relative to the current model, or to a cap model without the dynasty mechanism? If the former, I don’t think the income variance will be lower than in the current model, considering your validator may be put to sleep for an unknown period of time. Maybe you should specify that it’s low income variance across validators, because the long-term income variance of any single validator is just as high as in a model without a cap.

Right, the claim is just that it doesn’t make variance much worse; the variance that exists in the current system still remains.

I think a queue could lead to various security issues, since such a quasi-permissioned system will enable stakers “on the inside” to exercise undue control over the chain. It can also lead to rent-seeking behavior.


Once the active validator set size exceeds MAX_VALIDATOR_COUNT, validator returns should start decreasing proportionately to 1/total_deposits and not 1/sqrt(total_deposits). But the functions to compute total_deposits -> validator_return_rate and total_deposits -> max_total_issuance remain continuous.

Perhaps you could clarify: does this mean there will be a hard cap on total issuance? There are two obvious options: (A) active validators are compensated in proportion to the rate at which they will be inactive, keeping 1/sqrt(total_deposits), or (B) they are not, and the proposal caps total issuance. I think it would be favorable to cap total issuance at the point where the total number of validators is sufficient for security, but only at that point. This lowers the “maximum inflation rate”, which adds additional trust to the long-term economic model of Ethereum.
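A minimal sketch of option (B) as quoted above: below the cap the per-deposit return rate decays as 1/sqrt(total_deposits); above it, as 1/total_deposits, with the two branches matched at the cap so the rate curve is continuous and total issuance stops growing. The `BASE` constant and cap value here are illustrative assumptions, not spec constants.

```python
import math

# Illustrative parameters, not actual spec constants.
MAX_TOTAL_DEPOSITS = 2**25  # ETH at the active-validator cap
BASE = 64                   # reward scaling constant, as in the issuance formula

def validator_return_rate(total_deposits: float) -> float:
    """Per-unit-deposit return rate: ~1/sqrt(D) below the cap,
    ~1/D above it, continuous at the cap."""
    if total_deposits <= MAX_TOTAL_DEPOSITS:
        return BASE / math.sqrt(total_deposits)
    # Match the rate at the cap, then decay as 1/D so that
    # rate * D (total issuance) is flat beyond the cap.
    rate_at_cap = BASE / math.sqrt(MAX_TOTAL_DEPOSITS)
    return rate_at_cap * MAX_TOTAL_DEPOSITS / total_deposits
```

With this shape, total issuance `validator_return_rate(D) * D` grows as sqrt(D) up to the cap and is constant afterwards, which is exactly the hard-cap-on-issuance reading of option (B).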

I should note something in relation to my previous comment to jgm. When I read the words “validator cap” in the headline before reading the proposal, I felt uneasy because it sounded like the proposal may contain this type of queueing system that is undesirable due to aforementioned reasons. I was pleased when I read the proposal that it did not. It also seems the best to do this after withdrawals, as you all agree.


To note as well that variance is important because of its centralising effects. In a system where the rotation is long (you could be asleep months before your validator comes awake), entities who control many validators have stable income/much lower variance, which assuming they reinvest the income into turning on more validators, only increases their weight over time. Same with an activation queue: insiders who are earning rewards will congest the queue much faster than outsiders looking to join in. So I like much better a system that shuffles validators quickly enough.

Yes, and that hard cap would be the issuance level at the cap. You can compute the issuance as \frac{64 \cdot \sqrt{\text{deposits}} \cdot 31556925}{384 \cdot \sqrt{10^9}}; if deposits are capped at 2^{25} ETH, that comes out to \approx 963426 ETH per year of issuance.
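The formula above is easy to check numerically (31556925 is seconds per year, 384 is seconds per epoch, and the \sqrt{10^9} factor comes from the Gwei denomination of balances):

```python
import math

def annual_issuance_eth(deposits_eth: float) -> float:
    """Yearly issuance in ETH, per the formula in the post above:
    64 * sqrt(deposits) * seconds_per_year / (seconds_per_epoch * sqrt(1e9))."""
    return 64 * math.sqrt(deposits_eth) * 31556925 / (384 * math.sqrt(10**9))

# At the 2^25 ETH deposit cap this gives roughly 963k ETH per year.
print(round(annual_issuance_eth(2**25)))
```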

I agree with your points that this also provides more guarantees about ETH’s long-run supply, which is good. It does mean that validators potentially have lower incomes, but they get compensated by (i) lower operating expenses and (ii) the possibility that when asleep they’ll be able to exit faster.


Thanks. Is it feasible to set the hard cap on the number of active validators as a proportion of circulating supply instead of using a fixed power of two? Such a design aligns somewhat with the idea that the security budget should be adapted to the value of what it protects.

If the circulating supply of Ether were to decrease through the deflationary mechanism of EIP-1559, downward pressure on the amount of staked Ether could be acceptable. For example, it would take only 23 years at a deflation rate of 3% for the circulating supply of Ether to be halved. A hard cap initially set to roughly 1/3 of all Ether would at that point represent 2/3 of the circulating supply, which may be unnecessarily high. I use “may” here because, if the total value of tokens secured by Ether were to rise significantly relative to the market cap of Ethereum, it could still be desirable from a security perspective for a rising proportion of it to be used for staking (this gets complex).
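The “23 years at 3% deflation” figure follows from solving (1 - d)^t = 0.5 for t; a quick sketch:

```python
import math

def years_to_halve(annual_deflation: float) -> float:
    """Years for the supply to halve at a constant annual deflation
    rate d, i.e. the t solving (1 - d)**t = 0.5."""
    return math.log(0.5) / math.log(1 - annual_deflation)

print(round(years_to_halve(0.03), 1))  # ~22.8 years, i.e. roughly 23
```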

If the circulating supply of Ethereum were to increase, it could only do so at a maximum of 1%, and likely much less. This means the hardware requirements could only increase very slowly. In this case there is of course also the possibility of still hard-capping the number of active validators but compensating proportionally with higher rewards until 1% inflation is reached; option (A) in my previous comment. This, however, breaks property 3 of the proposal.

A hard cap initially set to roughly 1/3 of all Ether would at that point represent 2/3 of the circulating supply, which may be unnecessarily high

I don’t think this is much of a concern; remember that even in this proposal, it’s theoretically possible for all the ETH to be locked up, it’s just that the number of validators awake is bounded to ~1M. So from a macroeconomic point of view this proposal doesn’t really give us any properties that were not there before.

After thinking about it a little more, I agree that the active validator cap can and should be hardcoded. The reason is that the number of validators specifies a sufficient level of decentralization, which does not directly depend on the circulating supply (though it perhaps depends log-linearly on the size of the user base). Rather, it is the required amount of Ether per validator that could in the future be changed to adapt to the circulating supply.

Just to briefly encapsulate this tangent point:

To my understanding, as the protocol is currently written, the circulating supply of Ether will eventually reach an equilibrium at which total staking rewards equal burned Ether. This equilibrium is likely to be at a significantly lower supply than the current supply (at that point, staking rewards, as a proportion of the total circulating supply, would be higher than they are today). Some parameters are tuned based on the current circulating supply and will not automatically adapt to such a lower circulating supply. One such parameter is the required amount of Ether per validator.

@vbuterin where does 2^19 come from? Is there a particular target “resource” package (memory, CPU, I/O, networking) we would like to adhere to?

I guess 2**19 is just roughly the level of validating ETH that we expect; the goal is to make the maximum load close to the expected load.

I don’t think there’s necessarily a specific “target” where we’re okay going up to it but not beyond. It’s more that the lower the resource requirements, the better, and pushing them even lower unlocks further benefits. What we do know is that fixed load is better for users than variable load, because in the variable-load case users would have to buy hardware to cover an extreme case that in practice may never be reached.

@vbuterin thanks for the explanation.

I think it’s important to pay special attention to the implied social contract between the Ethereum community, and stakers.

I’m concerned many stakers may be thinking of the rewards formulas as ‘set in stone’, and basing their investment decisions on that idea. I don’t recall the original rewards documentation warning stakers about possible future changes to the rewards formula, so they may not be expecting one.

I want to avoid a situation where people feel treated unfairly, and thus decide to exit their staking position based on emotional rather than economic factors.

The analogy would be a computer game where a player put in considerable time and resources to optimize a specific aspect of their character, only to find things suddenly changed one day :wink:

I think it would be good to have more proactive communication and engagement with the staking community at large, with ample notice and discussion before any changes are made.


Thank you for making this point about engaging community well in advance of such changes.

I just heard Justin Drake talk about this in an interview post Shapella launch. That triggered a discussion in the Rocketpool community, which led to this comment.

Currently there are around 500,000 validators, but only about 2,000 RP node operators.

If there is a cap on validators set at 2^19 = 524,288, then we are already at this limit, and unfortunately the validator seats are predominantly owned by large staking farms. This seems to be in stark contrast with ETH’s mission to decentralize at the social layer (layer 0).

Yes, there is the correlation penalty, but the broader community does not seem aware of it, and unless there are actual disastrous events that hurt delegating stakers, this penalty alone is not enough to incentivize solo staking.

IMHO, before setting any limits on the number of validators, there has to be a much stronger incentive for solo staking. Something along the lines of proof of unique location or ZK KYC.

On a broader scope, for a time there was a public campaign to improve objective decentralization metrics such as the Nakamoto coefficient, the Gini coefficient, and total network resilience. Is anyone still tracking these for ETH?


@vbuterin, do you have an estimate of whether the maxEB upgrade (EIP-7251), which increases the amount of ETH each validator can stake, will reduce the need for a future hard limit on the total amount of ETH that can be staked, similar to the concept you originally brainstormed above?


That depends on how much consolidation people opt for. At full consolidation (2048 ETH), the ~1M validator set could shrink to ~15,625 validators. This assumes every validator is staking exactly 32 ETH, so 32 ETH * 1,000,000 = 32,000,000 ETH, and 32,000,000 ETH / 2048 ETH = 15,625 validators.

@vbuterin has mentioned the possibility of a higher 4096 ETH limit for the maximum effective balance elsewhere, so that number could shrink further. The question is whether people will opt to fully consolidate, consolidate smaller amounts (e.g., merging four 32 ETH validators into one 128 ETH validator), or consolidate at all.
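The arithmetic above generalizes straightforwardly; a quick sketch of the validator count at various consolidation levels (the 4096 ETH value is the hypothetical higher limit mentioned, not an adopted parameter):

```python
import math

TOTAL_STAKE_ETH = 32 * 1_000_000  # ~1M validators at 32 ETH each

def consolidated_count(max_effective_balance_eth: int) -> int:
    """Validator count if all stake were consolidated into validators
    at the given max effective balance."""
    return math.ceil(TOTAL_STAKE_ETH / max_effective_balance_eth)

for meb in (32, 128, 2048, 4096):
    print(meb, consolidated_count(meb))
```

This gives 1,000,000 at no consolidation, 250,000 at 128 ETH, 15,625 at full 2048 ETH consolidation, and ~7,813 under the hypothetical 4096 ETH limit.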

But with enough consolidation, the need for an active validator cap diminishes. AFAICT, EIP-7251 was proposed as an alternative to validator-set capping + rotation and to reducing staking rewards to disincentivize staking (both of which are controversial and have edge cases). I published an analysis of EIP-7251 if you’re interested in reading through!


Much appreciated for the clear outline of potential outcomes post-EIP-7251 @emmanuel-awosika! It seems we will get quick feedback, in the form of the reduction in validator count, soon after 7251 goes live.


Yes, I believe so! I’m really hoping there’s more awareness of the importance of consolidation, though. If not enough operators consolidate their validators, the contracting effect on the validator set will be limited.
