[Bitcoin-ml] Difficulty adjustment

Tom Zander tomz at freedommail.ch
Fri Sep 1 10:12:23 UTC 2017

It's hard to comment on a proposal without seeing the actual algorithm. The 
output is interesting, but I'd like to see the actual code that generated it.

As far as I can tell, the *only* goal of this project was to stabilize the 
block time in case of severe changes in hashrate.

This ignores many other requirements; one of them is that miners need 
stability as well. They need to know how much electricity will be needed to 
earn a certain amount of block rewards.
The original Satoshi algorithm kept this stable over a 2-week window; it 
looks like your idea (again, I can't say without the actual algorithm) can't 
guarantee more than an hour or so.
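For reference, the original retargeting rule Tom refers to can be sketched roughly as follows. This is a simplified illustration: the real implementation operates on compact-encoded 256-bit targets, but the 2-week timespan, 2016-block interval, and factor-of-4 clamp are the actual consensus parameters.

```python
# Simplified sketch of Bitcoin's original difficulty retargeting.
TARGET_TIMESPAN = 14 * 24 * 60 * 60   # two weeks, in seconds
TARGET_SPACING = 10 * 60              # ten minutes per block
INTERVAL = TARGET_TIMESPAN // TARGET_SPACING  # 2016 blocks

def retarget(old_target, actual_timespan):
    """Return the new target given the time the last 2016 blocks took.

    A larger target means lower difficulty. The adjustment is clamped
    to a factor of 4 in either direction, as in Bitcoin's consensus code.
    """
    actual_timespan = max(TARGET_TIMESPAN // 4,
                          min(actual_timespan, TARGET_TIMESPAN * 4))
    return old_target * actual_timespan // TARGET_TIMESPAN
```

Because the adjustment happens only once per 2016 blocks, miners get the multi-week cost predictability Tom describes, at the price of reacting slowly to hashrate swings.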

I'll wait for an actual proposal before forming a real opinion, but I want to 
point out that I would be against a proposal that throws out the stability 
of everything but the block times.
This opens us up to attacks; for instance, DDoSing a competing mining pool 
off the net would give you an immediate and measurable increase in profits.

On top of that, the “death spiral” that people predicted for BTC has not 
happened; maybe the original adjustment interval of 2016 blocks wasn't so bad?

On Friday, 25 August 2017 21:25:05 CEST Amaury Séchet wrote:
> We can use this to define a base target by computing the work done over
> the past 2016 blocks and the time it took. Because individual block time
> variations do not matter much over 2016 blocks, this gives a very stable
> output. However, it is unable to adjust quickly when required. So we also
> want to compute a fast target over a much smaller timeframe. I found 130
> minutes to work well, with a minimum of 5 blocks to make sure we aren't
> undersampling. Because we weighted the average based on block time, it is
> also useful not to choose a specific number of blocks but rather to focus
> on a timeframe.
> When the fast target is within 25% of the base target, we can simply
> assume it is variance and just use the base target. When it isn't, we are
> either facing a large hashrate change or an especially high variance. In
> both cases, it is worth using the fast target to ensure more stability in
> block issuance. For instance, if variance causes several blocks to come
> very slowly, then it is worth dropping the difficulty temporarily to make
> sure a backlog doesn't form.
> This gives us a difficulty function that is very stable when the hashrate
> is stable, but that is able to nudge left and right to compensate for
> variance, and that is also able to adjust very quickly to large hashrate
> changes. I ran various simulations and it looks pretty good.
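As I read the quoted description, the selection between the two targets could be sketched like this. This is purely a hypothetical illustration, since the actual code was never posted in this thread: the function name, the integer-target representation, and the exact form of the 25% comparison are my assumptions.

```python
# Hypothetical sketch of the two-target selection rule described above.
# base_target: computed from work and elapsed time over the past 2016 blocks.
# fast_target: computed over a ~130-minute window with at least 5 blocks.
# (How those two are computed is not shown; only the selection step is.)

def next_target(base_target, fast_target):
    # If the fast target is within 25% of the base target, treat the
    # deviation as ordinary variance and keep the stable base target.
    if abs(fast_target - base_target) <= base_target // 4:
        return base_target
    # Otherwise assume a large hashrate change (or extreme variance)
    # and follow the fast target so block issuance stays on schedule.
    return fast_target
```

Under this reading, difficulty stays pinned to the slow-moving base target in normal conditions, which is exactly the behavior Tom is questioning: any short window that trips the 25% threshold moves the difficulty immediately.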

Tom Zander
Blog: https://zander.github.io
Vlog: https://vimeo.com/channels/tomscryptochannel
