[bitcoin-dev] Scaling Bitcoin conference micro-report

Mark Friedenbach mark at friedenbach.org
Fri Sep 18 05:55:55 UTC 2015


Correction of a correction, in-line:

On Wed, Sep 16, 2015 at 5:51 PM, Matt Corallo via bitcoin-dev <
bitcoin-dev at lists.linuxfoundation.org> wrote:

> > - Many interested or at least willing to accept a "short term bump", a
> > hard fork to modify block size limit regime to be cost-based via
> > "net-utxo" rather than a simple static hard limit.  2-4-8 and 17%/year
> > were debated and seemed "in range" with what might work as a short term
> > bump - net after applying the new cost metric.
>
> I would be careful to point out that hard numbers were deliberately NOT
> discussed. Though some general things were thrown out, they were not
> extensively discussed nor agreed to. I personally think 2-4 is "in
> range", though 8 maybe not so much. Of course it depends on exactly how
> the non-blocksize limit accounting/adjusting is done.
>
> Still, the "greatest common denominator" agreement did not seem to be
> agreeing to an increase which continues over time, but which instead
> limits itself to a set, smooth increase for X time and then requires a
> second hardfork if there is agreement on a need for more blocksize at
> that point.
>

Perhaps it is accurate to say that there wasn't consensus at all except
that (1) we think we can work together on resolving this impasse (yay!),
and (2) it is conceivable that changing from block size to some other
metric might provide the basis for a compromise on near-term numbers.

As an example, I do not think the net-UTXO metric provides any benefit with
respect to scalability, and in some ways makes the situation worse (even
though it helpfully solves an unrelated problem of spammy dust outputs).
But there are other possible metrics, and I maintain hope that data will
show the benefit of another metric, or of other metrics combined with
net-UTXO, in a way that allows us to reach consensus.
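
To make this concrete, here is a rough sketch of what a net-UTXO style
cost limit could look like. This is purely illustrative; the constant and
the exact formula are invented for the example, not anything proposed or
agreed to at the workshop:

    # Hypothetical "net-UTXO" cost metric (illustrative only).  A block is
    # charged for its byte size plus a penalty for net growth of the UTXO
    # set, so transactions that consume more outputs than they create are
    # relatively cheaper.

    UTXO_COST = 50  # made-up weight, in byte-equivalents, per net new UTXO

    def block_cost(block_size_bytes, utxos_created, utxos_spent):
        net_utxo_growth = utxos_created - utxos_spent
        # Negative growth (shrinking the UTXO set) earns a discount.
        return block_size_bytes + UTXO_COST * net_utxo_growth

    # Consensus rule: block_cost(...) <= MAX_BLOCK_COST, where
    # MAX_BLOCK_COST replaces today's static byte-size limit.

A rule like this prices dust outputs directly, which is the unrelated
benefit mentioned above, but by itself it does little for propagation or
validation cost.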

As a further example, I am also quite concerned about 2-4-8 MB with either
block size or net-UTXO as the base metric. As you say, it depends on how
the non-block-size limit accounting/adjusting is done... But if a metric
were chosen that addressed my concerns (worst-case propagation and
validation time), then I could be in favor of an initial bump that allowed
a larger number of typical transactions in a block.
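
To illustrate the distinction, a metric aimed at my concerns would weight
the things that actually dominate worst-case propagation and validation
time rather than raw size. Again, this is only a sketch with made-up
constants and names, not a concrete proposal:

    # Hypothetical validation-cost metric (illustrative only).  Signature
    # operations and bytes hashed for signature checks dominate worst-case
    # validation time, so they are weighted far more heavily than raw size.

    BYTE_COST  = 1    # made-up cost units per byte of transaction data
    SIGOP_COST = 50   # made-up cost units per signature operation
    HASH_COST  = 1    # made-up cost units per byte hashed for sig checks

    def tx_cost(size_bytes, sigops, bytes_hashed):
        return (BYTE_COST * size_bytes
                + SIGOP_COST * sigops
                + HASH_COST * bytes_hashed)

    def block_cost(tx_costs):
        # tx_costs: iterable of (size_bytes, sigops, bytes_hashed) tuples
        return sum(tx_cost(*t) for t in tx_costs)

Under a rule like that, the effective limit for typical transactions could
be raised without raising the worst case, which is the kind of initial
bump I could support.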

But where I really must disagree is on the requirement for a second hard
fork. I will go on record as being definitively against this. While being
conservative with respect to exponentials, I would very much like to make
sure that there is a long-term growth curve as part of any proposal. I am
willing to accept a hard fork later if the adopted plan proves too
conservative, but I do not want to be kicking the can down the road to a
scheduled second hard fork that absolutely must occur. That, I feel, could
be a more dangerous outcome than an exponential that outlasts conservative
historical trends.

I commend Jeff for writing a Chatham House Rule summary of the outcome of
some hallway conversations that occurred. On the whole I think his summary
does represent the majority view of the opinions expressed by core
developers at the workshop. I will caution, though, that on nearly every
issue there were those who expressed disagreement but did not press the
point, and those who said nothing and whose opinions went unpolled.
Nevertheless this summary is informative, as it feeds forward into the
design of the proposals that will be made prior to the Hong Kong workshop
in December, so that they have a higher likelihood of success.