[Bitcoin-development] Proposal: A measured response to save Bitcoin Core

Matt Whitlock bip at mattwhitlock.name
Sun May 31 00:29:25 UTC 2015


Greg, Pieter, Jeff, and Wladimir,

I'll try to be brief to respect your time.

1. I don't want to see Bitcoin die.

2. As has been discussed on this list and elsewhere: Bitcoin could die due to economic and/or game-theoretic complications arising from raising the block size limit, but it could also die due to usability complications arising from NOT raising the block size limit. Strong, personally held opinions by various members of this community notwithstanding, it is not clear which of these scenarios is more likely.

3. What *is* clear at this point is that Gavin will move ahead with his proposal, regardless of whether the remainder of the Bitcoin Core committers agree with him. If he has to commit his changes to Bitcoin XT and then rally the miners to switch, then that's what he'll do. He believes that he is working in the best interests of Bitcoin (as I would hope we all do), and so I do not fault him for his intentions. However, I think his proposal is too risky.

4. I also think that ignoring the immediate problem is too risky. If allowing significantly larger blocks will cause a serious problem for Bitcoin (which is a possibility that we cannot rule out, as we lack omniscience), then NOT making any change to Bitcoin Core will virtually *assure* that we cause exactly this problem, as the popular (non-technical) consensus appears to be in favor of Bitcoin XT and a larger block size limit. If we do nothing, then there's a very real chance that Bitcoin XT takes over, for better or worse.

5. I'd like to propose a way that we can have our cake and eat it too. My proposal attempts to satisfy both those who want larger blocks AND those who want to be extremely cautious about changing the fundamental economic parameters of Bitcoin.

6. Something I've never understood about the proposal from Gavin (et al.) is why it calls for a massive step right up front. Assuming we accept his argument that we're critically close to running out of capacity, I still must ask: why do we need a 20x increase all at once?

7. It's not a given that blocks will immediately expand to meet the hard limit. In fact, there are strong and compelling arguments why this will NOT happen. But in any software system, if a given scenario is *possible*, then one MUST assume that it will happen and must have a plan to handle it.

8. My primary objection is not to raising the block size limit; my objection is to raising it *suddenly*. You can argue that, because we'll have plenty of time before March 2016, it's not "sudden," but, whether we do it now or a year from now or a decade from now, a step function is, by definition, sudden.

9. My proposal is that we raise the block size limit *gradually*, using an approximately smooth function, without a step discontinuity. We can employ a linear growth function to adjust the block size limit *smoothly* from 1 MB to 20 MB over the course of several years, beginning next March.
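For concreteness, here is a minimal sketch, in C++ since that is what Bitcoin Core is written in, of what such a rule could look like. Everything specific in it is my own assumption for illustration: the function name, keying the ramp to the block timestamp rather than the block height, the March 2016 activation time, and the roughly four-year ramp. It is not the pull request I'm offering to write, just the shape of the idea:

    // Illustrative sketch only -- the name, dates, and timestamp-based
    // interpolation below are assumptions, not an actual Bitcoin Core rule.
    #include <cstdint>

    static const uint64_t START_SIZE = 1000000;    // 1 MB limit today
    static const uint64_t END_SIZE   = 20000000;   // 20 MB eventual limit
    static const int64_t  START_TIME = 1456790400; // 2016-03-01 00:00 UTC (assumed activation)
    static const int64_t  END_TIME   = START_TIME + 4 * 365 * 24 * 3600; // ~4-year ramp (assumed)

    // Maximum allowed block size for a block bearing the given timestamp,
    // interpolated linearly between the two endpoints.
    uint64_t MaxBlockSize(int64_t nBlockTime)
    {
        if (nBlockTime <= START_TIME)
            return START_SIZE;
        if (nBlockTime >= END_TIME)
            return END_SIZE;
        return START_SIZE + (END_SIZE - START_SIZE)
                          * uint64_t(nBlockTime - START_TIME)
                          / uint64_t(END_TIME - START_TIME);
    }

A height-based schedule would work equally well; the essential point is only that the limit grows by a tiny amount per block instead of jumping 20x in one step.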

10. This is the difference between cannonballing into the deep end of the pool and walking gingerly down the steps into the shallow end. Both get you to the eventual goal, but one is reckless while the other is measured and deliberate. If larger blocks do enable some problem, then I'd prefer to see that problem crop up gradually rather than all at once. If it's gradual, then we'll have time to discuss and fix it without panicking.

11. I am offering to implement this proposal and submit a pull request to Bitcoin Core. However, if another dev who is more familiar with the internals would like to step forward, then that would be superior.

Respectfully submitted,
Matt Whitlock
