[bitcoin-dev] Superluminal communication and the consensus block size limit

Jorge Timón jtimon at jtimon.cc
Wed Aug 5 23:24:03 UTC 2015


There is a common meme that block propagation time is the only metric
that matters when it comes to valuing the block size maximum consensus
rule's usefulness in limiting mining centralization.
Here is an extremely optimistic thought experiment for those who think
that is the case:

Imagine that superluminal communication has somehow been invented and
the validity of mined blocks can be checked in constant time thanks to
some sort of snarks magic. This doesn't mean that block propagation is
O(1) with respect to time, because each node still needs to repeat that
cheap validation before relaying the block.
Still, this is the best situation we can imagine with respect to block
propagation, right?
No, wait, due to some technical or economic miracle this
superluminal communication is free for everyone:
better-than-physically-possible (our understanding of physics changes
with time as well, right?) communication and infinite bandwidth for
everyone.

At this point, does the consensus block size maximum still help limit
mining centralization, or can we just remove it entirely?
The answer is yes: it can still help limit mining centralization.

Let's imagine that these amazing advancements have happened in less
than 22 years and we have only had 6 more subsidy halvings, that's only
7 halvings in total, so the subsidy is still as high as 50 * (0.5 ^ 7)
= 0.390625 btc/block.
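
A quick sanity check of that number (a trivial Python sketch of the
arithmetic above, nothing more):

    # Subsidy after 7 halvings in total (the one we have already had
    # plus 6 more), starting from the original 50 btc/block.
    initial_subsidy = 50.0
    halvings = 7
    print(initial_subsidy * 0.5 ** halvings)  # 0.390625 btc/block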

Although the orphan-block-probability cost for a miner to include an
extra transaction has been minimized as far as possible, it is still
not null.
Let's assume that while all these technical miracles were happening,
those 22 more years were enough for miners to realize this fact: they
have removed the special-cased free-transaction policy code that
currently comes with Bitcoin Core (or it has been removed from Bitcoin
Core) and they don't mine transactions with fees lower than 1 satoshi
anymore.
<sarcasm>I hope this last assumption doesn't turn out to be more wild
than superluminal communication...</sarcasm>
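
To make that minimum-fee assumption concrete, it amounts to something
as simple as this sketch (illustrative Python, not Bitcoin Core's
actual policy code; the names are made up):

    # Hypothetical miner policy: no free-transaction special case,
    # and nothing paying less than 1 satoshi of fee gets mined.
    MIN_FEE_SATOSHIS = 1

    def should_mine(tx_fee_satoshis):
        # A transaction is only considered for a block template if it
        # pays at least the minimum fee; there is no priority exception.
        return tx_fee_satoshis >= MIN_FEE_SATOSHIS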

But there must be a physical limit somewhere: in our example, miners
will have different CPU constraints (to simplify further,
genetically-engineered and super-fast memory also grows in the streets
everywhere after an accident in a Monsanto lab; or better,
downloadmoreram.com actually works and I just hadn't tried it from
Windows or Mac).

Miner A is able to process 100 M tx/block while miner B is only able
to process 10 M tx/block.

Will miner B be able to remain competitive against miner A?

The answer is: it depends on the consensus maximum block size.
How so? Let's imagine that it has been completely removed.

Assuming a fee of 1 satoshi per transaction and no shortage of
unconfirmed transactions, miner A's block reward will be 0.390625 + 1
= 1.390625 btc vs miner B's 0.390625 + 0.1 = 0.490625 btc.
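
Spelled out in satoshis (1 satoshi = 0.00000001 btc, so 100 M
transactions pay exactly 1 btc in fees and 10 M pay 0.1 btc), here is a
minimal Python sketch of that calculation; the names are just for
illustration:

    SUBSIDY_SAT = 39_062_500      # 0.390625 btc after 7 halvings
    FEE_SAT = 1                   # 1 satoshi of fee per transaction

    def block_reward_btc(txs_per_block):
        return (SUBSIDY_SAT + txs_per_block * FEE_SAT) / 100_000_000

    print(block_reward_btc(100_000_000))  # miner A: 1.390625 btc
    print(block_reward_btc(10_000_000))   # miner B: 0.490625 btc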

Difficulty will tend to increase until the cost to produce a block
(including interest on all the capital needed, paid or not) is equal
to 1.390625 btc, and therefore miner B will stop mining or go bankrupt.
But maybe 100 M and 10 M were too high numbers. What about 10 M and 1
M? Still, 0.400625 btc can't compete with 0.490625 btc.
You think 10x is too much of a difference? Fine, 2 M vs 1 M: still,
0.400625 btc can't compete with 0.410625 btc.
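
The same arithmetic for those smaller gaps (again just an illustrative
Python sketch):

    SUBSIDY_SAT = 39_062_500   # 0.390625 btc, expressed in satoshis
    for a_txs, b_txs in [(10_000_000, 1_000_000), (2_000_000, 1_000_000)]:
        a = (SUBSIDY_SAT + a_txs) / 100_000_000   # 1 satoshi fee per tx
        b = (SUBSIDY_SAT + b_txs) / 100_000_000
        print(a, "vs", b)
    # 0.490625 vs 0.400625
    # 0.410625 vs 0.400625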

In summary, there will always be some physical limitation that may
benefit big mining players, so the block size maximum will always be
useful to limit mining centralization.
In other words (and I don't intend this to sound rude), if you want to
eventually remove the block size maximum consensus rule entirely, I
will never be able to agree with you: not even in your wildest dreams.

