[Bitcoin-development] comments on BIP 100

Eric Lombrozo elombrozo at gmail.com
Mon Jun 15 00:53:05 UTC 2015


I think the whole complexity talk is missing the bigger issue.

Sure, per-block validation scales linearly (or quasilinearly…there’s an O(log n) term in there somewhere, but it’s probably dominated by linear factors at current levels…asymptotic limits don’t always apply very well to finite systems). And there’s an O(n^2) bandwidth issue.
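
To put rough numbers on that bandwidth issue, here’s a toy cost model (a sketch with made-up illustrative parameters, not real measurements): if every one of m validating nodes must receive every transaction, aggregate traffic is O(n*m), and with every user validating (m = n) it becomes O(n^2):

    # Toy model: aggregate bytes relayed if every one of m validating
    # nodes must receive every transaction. Parameters are illustrative
    # (2 transactions per user, ~250 bytes each), not real-world data.
    def total_bandwidth(users, tx_per_user, tx_size_bytes, nodes):
        txs = users * tx_per_user
        return txs * tx_size_bytes * nodes  # O(n * m)

    # With m = n (every user validates), cost grows quadratically:
    for n in (1_000, 10_000, 100_000):
        print(n, total_bandwidth(n, 2, 250, n))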

The real issue, though, is validation cost. The n in O(n) here does not represent block size - it represents the size of the entire block chain, which every new validator must synchronize! It means we have no way to construct short proofs (or at least arguments that are computationally *hard* to forge) without requiring the validator to maintain the complete system state. And currently, there is no mechanism for directly compensating validators.
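
To illustrate what a short proof could look like: if block headers committed to a Merkle root over the UTXO set (a hypothetical commitment - nothing of the sort is deployed today), a verifier could check a single output with O(log n) hashes instead of replaying the whole chain. A minimal sketch, with verify_utxo_proof a made-up helper:

    import hashlib

    def dsha256(data):
        # Double SHA-256, as used in Bitcoin's Merkle trees.
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def verify_utxo_proof(leaf, branch, committed_root):
        # branch: list of (sibling_hash, node_is_right) pairs from the
        # leaf up to the (hypothetical) committed UTXO-set root. Proof
        # size is O(log n) in the UTXO count, independent of chain
        # length - no full system state needed to verify.
        node = dsha256(leaf)
        for sibling, node_is_right in branch:
            node = dsha256(sibling + node if node_is_right else node + sibling)
        return node == committed_root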

A full validator that goes offline even for a short period of time takes a while to fully catch up to the rest of the network - and starting up a new validator from scratch will continue to be painful…even for those of us who’ve turned this into routine by now, let alone for new nontechnical users.

Satoshi’s SPV is not a real solution - it’s a mere suggestion that wasn’t fully thought out at the time of the Bitcoin white paper. Besides lacking a good validation security model, practical implementations of it weaken privacy and complicate client implementations substantially…and the worst part is, it still doesn’t scale all that well. The client still has to query every single block (even if filtered) back to its wallet’s first transaction (which cannot be determined without a blockchain scan anyway).
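
Concretely, consider BIP 37 bloom filtering, one such practical implementation. The standard bloom-filter sizing formulas it builds on (a quick sketch below) make the privacy/bandwidth tension explicit: a tight false-positive rate tells the remote peer almost exactly which transactions are yours, while a loose one wastes bandwidth on decoy matches:

    import math

    # Bloom filter size (bits) and hash-function count for n elements
    # at false-positive rate p, per the standard formulas BIP 37 uses.
    def bloom_params(n_elements, fp_rate):
        n_bits = math.ceil(-n_elements * math.log(fp_rate) / math.log(2) ** 2)
        n_hashes = max(1, round(n_bits / n_elements * math.log(2)))
        return n_bits, n_hashes

    # Smaller p = fewer decoys (worse privacy) and a bigger filter:
    for p in (0.01, 0.001, 0.0001):
        print(p, bloom_params(20, p))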

So yes, we will most certainly need algorithmic improvements!

- Eric Lombrozo


> On Jun 14, 2015, at 4:58 PM, Adam Back <adam at cypherspace.org> wrote:
> 
> Hi Mike
> 
> On 15 June 2015 at 00:23, Mike Hearn <mike at plan99.net> wrote:
>>> One thing that is concerning is that few in industry seem inclined to
>>> take any development initiatives or even integrate a library.
>> 
>> Um, you mean except all the people who have built more scalable wallets over
>> the past few years, which is the only reason anyone can even use Bitcoin
>> from their phone?
> 
> No slight intended, obviously, to people who write actual client code.
> 
> That was probably insufficiently specific; let me rephrase: I am
> referring to the trend that much of the industry is built on web 2.0
> technology, using bitcoin via a library in a web scripting language,
> often with consensus bugs, and outsourcing validation rather than
> running their own full node, so that the service offered to their
> users isn't even SPV-secure against the operator.  It is also heavily
> based on a third-party custody model that is the root cause of the
> repeated wallet breaches.  Some of these companies have a noted
> tendency not to upgrade or fix code.
> 
> So I mean this not to call out specific companies, but in the sense
> that if we're technologists we should be trying to move the
> technology forward, not just changing parameters which run into an
> O(n^2) scaling wall or break decentralisation security.  And we
> shouldn't take the above state of affairs as an immutable reality.
> It cannot persist if Bitcoin is to reach maturity in either scale or
> security.
> 
>> I still think you guys don't recognise what you are actually asking for here
>> - scrapping virtually the entire existing investment in software, wallets
>> and tools.
> 
> As I said, I don't think we can expect Bitcoin to scale with no
> further algorithmic improvements.  Algorithmic improvements take
> code.  There is reasonable scope to build in an incrementally
> deployable way, there's plenty of time for people to code, test and
> opt in to things, and the sky is not falling.  Companies do care
> about scaling, and can invest in the integration and coding implied
> to improve their products' scalability; they have an economic
> incentive to do it, and there is no scalable and safe way to do it
> without this work.
> 
>> Computational complexity for the entire network is O(nm) where
>> n=transactions and m=fully validating nodes. There is no fixed
>> relationship between those two variables.
> 
> I am referring to global bandwidth O(n^2) with n=users, or O(n)
> per-user bandwidth cost to the system; O(nm) is accurate, but the
> node count m is an internal system concept.  Suffice it to say the
> network does not scale O(1) in per-user cost.
> 
>> "Exposing the companies to back-pressure" sounds quite nice and gentle. Let
>> me rephrase it to be equivalent but perhaps more direct: you mean "breaking
>> the current software ecosystem to force people into a new, fictional system
>> that bears little resemblance to the Bitcoin we use today, whether they want
>> that or not".
>> 
>> As nothing that has been proposed so far (Lightning, merge mined chains,
>> extension blocks etc) has much chance of actual deployment any time soon,
>> that leaves raising the block size limit as the only possible path left.
> 
> A hard-fork takes a long time to deploy due to the non-upgrade risk,
> and people are working on things in the meantime.  There can be a
> case for some increase to create breathing room to work on scaling
> and decentralising tech; I just mean to say that if we do it in
> isolation, we're not focussing on the big picture.
> 
> Adam
> 
