[bitcoin-dev] Hardfork to fix difficulty drop algorithm

Bob McElrath bob_bitcoin at mcelrath.org
Wed Mar 9 20:21:36 UTC 2016


Dave Hudson [dave at hashingit.com] wrote:
> A damping-based design would seem like the obvious choice (I can think of a
> few variations on a theme here, but most are found in the realms of control
> theory somewhere).  The problem, though, is working out a timeframe
> over which to run the derivative calculations.

From a measurement theory perspective this is straightforward.  Each block is a
measurement, and error propagation can be performed to derive an error on the
derivatives.
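
Concretely, the kind of error propagation I have in mind looks something like
the sketch below (my own illustration, using simple windowed rate estimates and
a finite-difference derivative; none of the names refer to existing code):

    import math

    def rate_estimate(block_intervals):
        """Estimate the block-arrival rate (proportional to hashrate at fixed
        difficulty) from a list of inter-block times, with its 1-sigma error.
        For n exponential waiting times the relative error is ~1/sqrt(n)."""
        n = len(block_intervals)
        rate = n / sum(block_intervals)
        return rate, rate / math.sqrt(n)

    def rate_derivative(earlier_window, later_window, dt):
        """Finite-difference estimate of d(rate)/dt between two windows whose
        centres are dt seconds apart, with both measurement errors propagated
        in quadrature."""
        r1, e1 = rate_estimate(earlier_window)
        r2, e2 = rate_estimate(later_window)
        return (r2 - r1) / dt, math.sqrt(e1 ** 2 + e2 ** 2) / dt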

The statistical theory of Bitcoin's block timing is known as a Poisson Point
Process: https://en.wikipedia.org/wiki/Poisson_point_process or temporal point
process.  If you google those plus "estimation" you'll find a metric shit-ton of
literature on how to handle this.
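
As a minimal sketch of what that model says (my own toy example, not drawn from
any particular reference): block arrivals are exponential waiting times, and the
maximum-likelihood estimate of the rate is simply events per unit observation
time.

    import random

    def simulate_block_arrivals(rate_per_sec, duration_sec):
        """Homogeneous Poisson point process: independent exponential
        inter-arrival times with mean 1/rate."""
        t, arrivals = 0.0, []
        while True:
            t += random.expovariate(rate_per_sec)
            if t > duration_sec:
                return arrivals
            arrivals.append(t)

    # At constant hashrate matched to the difficulty, blocks arrive at
    # 1/600 per second; over two weeks we expect about 2016 of them.
    two_weeks = 14 * 24 * 3600.0
    arrivals = simulate_block_arrivals(1.0 / 600.0, two_weeks)
    rate_hat = len(arrivals) / two_weeks      # maximum-likelihood rate estimate
    print(len(arrivals), rate_hat * 600.0)    # second number scatters around 1.0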

> The problem is the measurement of the hashrate, which is pretty inaccurate at
> best because even 2016 events isn't really enough (with a completely constant
> hash rate running indefinitely we'd see difficulty swings of up to +/- 5% even
> with the current algorithm).  In order to meaningfully react to a major loss
> of hashing we'd still need to be considering a window of probably 2 weeks.

You don't want to assume it's constant in order to get a better measurement.
The assumption is clearly false.  But, errors can be calculated, and retargeting
can take errors into account, because no matter what we'll always be dealing
with a finite sample.
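
As a back-of-the-envelope check on the figure quoted above: at constant
hashrate the 2016-block window length is a sum of 2016 independent exponential
intervals, so the relative scatter of the resulting difficulty correction is
1/sqrt(2016).

    import math

    n = 2016                        # blocks per retarget window
    rel_sigma = 1.0 / math.sqrt(n)  # relative std of the window length
    print("1 sigma: %.2f%%   2 sigma: %.2f%%" % (100 * rel_sigma, 200 * rel_sigma))
    # -> 1 sigma: 2.23%   2 sigma: 4.45%, consistent with the "+/- 5%" above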

Personally I don't think difficulty target variations are such a big deal, as
long as the algorithm ensures that, over any long time interval, the average
block time is 10 min.  Bitcoin's current algorithm fails here: with increasing
hashrate (as we have), it issues coins faster than its assumed schedule.
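
For example, a toy simulation (my own sketch: a made-up 5% hashrate growth per
retarget window and no 4x adjustment clamp) of the current retarget rule shows
the realised mean block time sitting below the 600 s target:

    import random

    def mean_block_time(windows=50, growth_per_window=0.05, blocks=2016,
                        target=600.0):
        """Bitcoin-style retargeting every 2016 blocks while the hashrate grows
        by a fixed fraction per window.  Because each retarget only looks
        backwards, difficulty lags the hashrate and the realised mean block
        time stays below the target."""
        hashrate, difficulty = 1.0, 1.0     # arbitrary units, initially matched
        intervals = []
        per_block_growth = (1.0 + growth_per_window) ** (1.0 / blocks)
        for _ in range(windows):
            window_time = 0.0
            for _ in range(blocks):
                # block interval is exponential, mean target*difficulty/hashrate
                dt = random.expovariate(hashrate / difficulty) * target
                window_time += dt
                intervals.append(dt)
                hashrate *= per_block_growth   # smooth exponential growth
            difficulty *= blocks * target / window_time   # standard retarget
        return sum(intervals) / len(intervals)

    print("mean block time: %.1f s" % mean_block_time())  # consistently < 600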

--
Cheers, Bob McElrath

"For every complex problem, there is a solution that is simple, neat, and wrong."
    -- H. L. Mencken 


