Hacker News

> At the core of the debate is whether the Bitcoin blockchain should be a settlement layer that supports a number of new blockchains that can be scaled to achieve various goals or whether the Bitcoin blockchain itself should evolve in a way that it can scale to achieve those various goals.

No, among the developers actually working on Bitcoin that is not what the debate is about at all.

Bitcoin is a decentralized ledger, and indeed it can be argued that this is the only property of Bitcoin which is interesting/useful. Why? Because all properties we care about (availability, uncensorability, unseizability, etc.) derive from decentralization[0]. And at the end of the day we can do everything Bitcoin does faster, better, and cheaper on some alternative consensus system (see: Stellar, Open-Transactions, Liquid) that does not have this decentralization property. Decentralization is expensive. It requires a dynamic-membership, multi-party block signing algorithm, which at the moment means proof of work. And proof of work costs hundreds of millions of dollars per year to maintain, and throttles the available bandwidth due to the adversarial assumption and the existence of selfish mining[1].
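To make "decentralization is expensive" concrete, here is a minimal sketch of the proof-of-work check: a block "signature" is any header whose double-SHA256 hash falls below a difficulty target, and anyone willing to burn hash power can produce one. The header bytes and target are illustrative assumptions, not Bitcoin's actual serialization.

```python
import hashlib

# Minimal sketch: proof of work as a dynamic-membership block signature.
# Header layout and target are simplified for illustration.

def pow_is_valid(header: bytes, target: int) -> bool:
    """Valid if the double-SHA256 of the header, as an integer, is below target."""
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "little") < target

def mine(prefix: bytes, target: int) -> int:
    """Brute-force a nonce; this search is the expensive part."""
    nonce = 0
    while True:
        if pow_is_valid(prefix + nonce.to_bytes(4, "little"), target):
            return nonce
        nonce += 1

# An easy target so the demo finishes quickly (real targets are vastly smaller).
TARGET = 2**245
nonce = mine(b"example-header", TARGET)
assert pow_is_valid(b"example-header" + nonce.to_bytes(4, "little"), TARGET)
```

Anyone can join or leave the set of signers at will; the only membership cost is hash power, which is exactly why the scheme is both decentralized and expensive.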

The question is not whether Bitcoin should be a store of value or a medium of exchange. That implies we have some choice in the matter. The question is what level of on-chain utility does Bitcoin actually support under untrusted, adversarial conditions, without losing all properties derived from decentralization. This is an empirical question. The available bandwidth is something that can be determined from the performance of the code in the real world extrapolated to various adversarial simulations.

We had two Scaling Bitcoin workshops last year that gave us a data-driven answer: 3-4MB per block, tops. There are potentially ways this number can be improved (see: weak blocks), and those are being worked on but are still some time from showing results. There are also some assumptions underlying this number, e.g. that we change the validation cost metric, which none of the existing proposals do in a smart way. But the scientific process is telling us right now that, with the tools available to us, we can increase the worst-case block size to 3-4MB with a better metric without the decentralization story becoming unacceptably worse.

That is the plan of Bitcoin Core. The deployment of segregated witness will allow up to 2MB blocks under typical conditions, and 3-4MB under worst-case adversarial conditions. It will exhaust the available capacity for growth in the Bitcoin network at this time. Meanwhile, work progresses on IBLT, weak blocks, Bitcoin-NG, fraud proofs and probabilistic validation, and other related technologies that might provide an answer for the next increase a year or two later. I'm hopeful we may even be able to get an order of magnitude improvement from that one, but we'll see.
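The "2MB typical, 3-4MB worst-case" numbers fall out of the segregated witness weight rule (BIP 141): block weight = 3 × base size + total size, capped at 4,000,000 weight units. A quick sketch of the arithmetic; the witness fractions used for "typical" traffic are my own illustrative assumptions:

```python
# Sketch of the BIP 141 block weight rule; witness fractions are assumptions.
MAX_BLOCK_WEIGHT = 4_000_000  # weight units

def block_weight(base_size: int, witness_size: int) -> int:
    """weight = 3 * base_size + total_size (all sizes in bytes)."""
    return 3 * base_size + (base_size + witness_size)

def max_total_size(witness_fraction: float) -> float:
    """Largest block in bytes that fits the cap, if `witness_fraction`
    of the serialized bytes are witness data:
    weight = 3*(1 - w)*S + S = (4 - 3w)*S <= MAX_BLOCK_WEIGHT."""
    return MAX_BLOCK_WEIGHT / (4 - 3 * witness_fraction)

# A legacy block with no witness data hits the cap at exactly 1MB:
assert block_weight(1_000_000, 0) == MAX_BLOCK_WEIGHT
# If roughly half to two-thirds of the bytes are witness data (an assumption
# about typical transaction mixes), blocks land around 1.6-2MB:
print(round(max_total_size(0.5)))    # 1600000
print(round(max_total_size(2 / 3)))  # 2000000
# An adversarial all-witness block approaches the 4MB worst case:
print(round(max_total_size(1.0)))    # 4000000
```

So the worst case is bounded by construction, while the size seen under normal traffic depends on how much of each transaction is witness data.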

No one I'm aware of is pushing for smaller blocks because Bitcoin should be a store of value and a settlement layer. If I had magic pixie dust I'd want 1GB blocks and everything on-chain too. But we live in the real world and are stuck in a situation where Bitcoin loses all of its unique properties if we scale much further than where we are now. And so we must ask the question: what will Bitcoin become, since it can't scale on-chain? How can we live with that outcome? The idea of a settlement layer and of off-chain but trustless payment networks like Lightning arises naturally from that thinking. The Lightning Network[2] is a way that we can have our cake and eat it too: Bitcoin remains small and decentralized, but everyone still has access to bitcoin payments. Lightning can potentially scale to global usage with a small-block chain as the settlement layer.

[0]: http://bluematt.bitcoin.ninja/2015/01/14/decentralization/

[1]: http://hackingdistributed.com/2013/11/04/bitcoin-is-broken/

[2]: http://lightning.network/



This is very interesting, and yet another take on "what the debate is actually about"! At this point, as a complete outsider to Bitcoin, I must say I am a bit confused. Here are the camps I am seeing:

Hearn seems to be saying that Bitcoin right now is not really decentralized - aka the Chinese firewall problem and so much power already centralized in a few people's hands. He also seems to be saying that the motivations of the Bitcoin Core developers are based not on good-faith technical disagreements but on bad-faith political disagreements.

Fred Wilson seems to be saying that he concurs with Hearn's concerns about Bitcoin already being centralized, but would characterize the Bitcoin Core developers as having good-faith political disagreements, and he advocates for "hard fork(s)" as a way to resolve those good-faith political disagreements.

You seem to be saying that, in fact, Bitcoin Core developers have good-faith technical reasons - nay, empirically validated reasons - to believe Bitcoin has no real choice and that it can never become a Visa-like transactional currency without destroying the decentralization qualities that make it so interesting. I assume you also believe that Hearn's qualms about the current centralization are wrong in some way.

So this outsider is left with very conflicting stories about what is going on, with little hope of judging between them. Very interesting, though.

Can you tell me why the decentralized ledger can't just be made more granular, so that increasing the block size might kick out current under-powered nodes, but those nodes could be turned into virtual nodes, where under-powered machines pool resources to establish a virtual node and thus maintain decentralization?


> Hearn seems to be saying that Bitcoin right now is not really decentralized - aka the Chinese firewall problem and so much power already centralized in a few people's hands.

This is true, and people are working to fix it on many levels. For example, separating transaction selection from the coinbase reward (what p2pool does, but with better scaling properties). Also, large chip manufacturers like BitFury selling all-in-one mining setups rather than running the miners themselves; they are presently doing mostly in-house mining, but are moving in the right direction. Also, smart-property miners, so that it doesn't matter if the hardware centralizes to where power is cheapest. There are also protocol-level changes along the lines of Bitcoin-NG, or non-outsourceable puzzles, which remove a good deal of the mining centralization pressures.

So what you might not have gotten from the Hearn article is that this is a known problem; the situation is scarily bad, but work is progressing on this front.

> He also seems to be saying that the motivations of the Bitcoin Core developers are based not on good-faith technical disagreements but on bad-faith political disagreements.

I don't know what to say to that. We had two open workshops last year for discussing the scaling problem, which Hearn did not participate in. We have a plan for increasing the capacity of the system which also references what we do genuinely believe to be the technical hurdles to scaling. I charitably hope that Hearn is simply ignorant on this matter.

> I assume you also believe that Hearn's qualms about the current centralization are wrong in some way.

No, the present state of mining centralization is concerning. But I'm not willing to throw in the towel on that just yet.

> Can you tell me why the decentralized ledger can't just be made more granular, so that increasing the block size might kick out current under-powered nodes, but those nodes could be turned into virtual nodes, where under-powered machines pool resources to establish a virtual node and thus maintain decentralization?

Decentralization in Bitcoin is about mining decentralization. Everything else barely matters. A cabal of miners can censor transactions costlessly. A cabal of miners can require extra-protocol information to de-anonymize bitcoin holders. A cabal of miners can keep anyone else from mining bitcoin, thereby closing off competitors. The threat to Bitcoin's decentralization is centralization of control over the mining hashpower, whether in the form of pools, external coercion on pools, or companies having exclusive or back-door access to mining hardware (e.g. 21co).

Outside of the back-doors, what causes mining centralization is a combination of validation cost, latency, and external factors like electrical power subsidies.

But, having said that, what you propose is known in the community as 'probabilistic validation': only checking part of the contents of a block and being able to relay small fraud proofs when such validation fails. This requires the ability to create such fraud proofs, which we will get with segregated witness (being worked on by Bitcoin Core right now), and which with a lot of work will eventually get us to the point where validation cost is not a concern, so long as there is a large, healthy network of probabilistically validating full nodes.

But we'd still have latency and real world power economies to worry about.
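To illustrate why partial checking can still protect the network, here is a toy sketch of the probabilistic-validation idea. The sampling scheme, function names, and numbers are my own assumptions for illustration, not any actual proposal's design:

```python
import random

# Toy sketch: each node validates only a random sample of a block's
# transactions; finding one invalid transaction yields a compact fraud
# proof that convinces everyone the block is bad. Illustrative names only.

def sampled_validation(block_txs, is_valid, sample_size, rng):
    """Check a random subset; return the offending tx (a 'fraud proof') or None."""
    for tx in rng.sample(block_txs, min(sample_size, len(block_txs))):
        if not is_valid(tx):
            return tx       # small proof that the whole block is invalid
    return None             # sample looked fine (no absolute guarantee)

def detection_probability(n_txs, n_bad, sample_size, n_nodes):
    """Chance that at least one of n_nodes independent samplers hits
    one of n_bad invalid transactions (hypergeometric miss rate)."""
    p_miss = 1.0
    for i in range(sample_size):
        p_miss *= (n_txs - n_bad - i) / (n_txs - i)
    return 1 - p_miss ** n_nodes

# One bad tx hidden among 4000, each node checks just 100, 1000 nodes sampling:
print(detection_probability(4000, 1, 100, 1000))  # effectively 1.0
```

The point is that per-node validation cost stays small while the network as a whole still catches invalid blocks with overwhelming probability, provided compact fraud proofs exist to propagate the verdict.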



