Schnelli explained this issue during his recent talk in Zurich:
“There are consequences with 2-megabyte blocks. Chinese miners — they are now [for] 2-megabyte blocks, but maybe it will turn out to be a problem for them . . . Every second really counts . . . When you mine a block that is no longer valid and you don’t get the information that a new block is here, you’re wasting lots of energy. If it’s just ten seconds you mine on the wrong block, you lose energy, and you lose coins in the end. That’s why, with Chinese miners [especially], every second counts, and [with] 2-megabyte [blocks], it’s twice the bandwidth you need.”
A key point to understand about bitcoin mining is that every second of hashing affects a miner’s ability to turn a profit. New blocks do not reach all nodes on the network instantaneously, which means miners are, at least some of the time, wasting resources by building on a block that is no longer the most recent. After all, a miner can only build on top of someone else’s block after learning that the block exists.
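The cost of that delay can be roughed out with a simple model. If blocks arrive as a Poisson process with a 600-second average interval (a standard simplification, not a figure from Schnelli's talk), the chance that a miner's work during a propagation delay is wasted is roughly 1 − e^(−delay/600):

```python
import math

AVG_BLOCK_INTERVAL = 600.0  # seconds; Bitcoin's 10-minute target

def stale_work_probability(delay_seconds: float) -> float:
    """Probability that someone else finds a block during the delay,
    so the work done in that window was on a stale tip
    (Poisson block-arrival model)."""
    return 1.0 - math.exp(-delay_seconds / AVG_BLOCK_INTERVAL)

# Even short delays translate into a measurable fraction of wasted work.
for delay in (1, 10, 30):
    print(f"{delay:>3}s delay -> {stale_work_probability(delay):.2%} wasted work")
```

Under this sketch, a 10-second delay costs a miner on the order of a couple of percent of its work, which is why Schnelli says every second counts.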
Todd noted that losses are lower in reality due to Blockstream Core Tech Engineer and Bitcoin Core Contributor Matt Corallo’s Bitcoin Relay Network, and it should also be pointed out that Pieter Wuille’s work was testing 20-megabyte blocks.
It should be noted that Schnelli has decided not to take an official, public stance on the block size debate.
Because of the way the Great Firewall works, miners in China often learn about new blocks before miners in other countries (especially those across the world in the United States). Since China also currently holds a majority of the network’s hashing power, miners outside China lose a bit of revenue: on average, they hear about new blocks later than miners inside China, and therefore waste more resources on blocks that have already been found.
One of the last points made by Schnelli at Bitcoin Meetup Switzerland is that the issue of scalability is not as simple as some have made it out to be.
This is not the first time a Bitcoin Core contributor has talked about the issue of block propagation in terms of the mining process. Multiple developers discussed this problem in interviews during the lead-up to Scaling Bitcoin Montreal.
At the Bitcoin Foundation’s DevCore Workshop back in October, Bitcoin Core Developer Gregory Maxwell explained that the second-to-last mining pool to learn about a new block is currently dealing with a 5 percent orphan rate.
What’s wrong with increasing the block size limit? This is the question that a portion of the Bitcoin community has been asking almost nonstop since the controversy around this possible alteration to the protocol went into hyperdrive last year.
In the past, Bitcoin Core Contributor Peter Todd has also discussed this issue. During his presentation at Scaling Bitcoin Montreal, Todd explained how slow block propagation becomes even more problematic when the Great Firewall of China is factored into the equation.
“We’ve done various simulation results. A big one that works out very well is Pieter Wuille’s work where we’ve gone and shown that — and he actually used realistic mining and latency networks with this where when you look at the situation in China, for the amount of time it takes data to propagate over the Great Firewall of China and their relative hashing power percentage — people who are not part of that group are earning something about like eight percent less revenue.”
Scaling Bitcoin Is Not Simple
“I don’t want to say I’m looking behind every curtain, but if you don’t really go down to the technical fundamentals it’s easy to say, ‘Increase the block size.’ Sure. Sounds nice. Everybody can understand it. But there are better solutions that maybe take more energy to think about.”
Two possible solutions recently brought up by Bitcoin Classic Developer Gavin Andresen on this issue are UDP broadcast of block headers and validationless mining. Bitcoin Security Consultant Sergio Lerner recently wrote a blog post on the latter of the two options.
On a related note, there is a theorized attack on Bitcoin mining, known as selfish mining, in which a miner withholds a block it has found in order to gain a head start on finding the next one.
It should also be mentioned that, as Blockchain Capital Managing Partner Brock Pierce recently pointed out, China’s control over the majority of hashing power may not last forever.
There are a few proposed solutions that could solve the issue of slow block propagation on the Bitcoin network. Bitcoin Core’s current roadmap includes two such solutions: invertible bloom lookup tables (IBLTs) and weak blocks. According to the Bitcoin Core website, these two features can offer a 90 percent reduction in critical bandwidth when relaying blocks, which should allow for a safer increase of the block size limit.
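The intuition behind such savings is that peers typically already hold most of a block’s transactions in their mempools, so a relayer only needs to send short identifiers plus whatever transactions the peer is missing. A minimal sketch of that arithmetic (the sizes and hit rate below are illustrative assumptions, not measurements from Bitcoin Core):

```python
def relay_bytes_naive(n_txs: int, avg_tx_bytes: int) -> int:
    """Full-block relay: every transaction is retransmitted in full."""
    return n_txs * avg_tx_bytes

def relay_bytes_sketch(n_txs: int, avg_tx_bytes: int,
                       short_id_bytes: int, mempool_hit_rate: float) -> int:
    """Identifier-based relay: short IDs for all txs, full data only
    for the txs the peer does not already have."""
    missing = int(n_txs * (1.0 - mempool_hit_rate))
    return n_txs * short_id_bytes + missing * avg_tx_bytes

naive = relay_bytes_naive(2000, 500)             # ~1 MB block of 2,000 txs
sketch = relay_bytes_sketch(2000, 500, 6, 0.99)  # peer already has 99% of txs
print(f"reduction: {1 - sketch / naive:.0%}")
```

With those assumed numbers, identifier-based relay cuts the critical-path bandwidth by well over 90 percent, which is the rough scale of savings the Bitcoin Core roadmap cites for IBLTs and weak blocks.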
There are also other proposed solutions for this issue, but the point is that plenty of smart people are working on potential fixes. Based on Bitcoin Core’s roadmap, it appears that IBLTs and weak blocks are the most likely solutions to get implemented first.
One of the original founders of Bitcoin Classic, Jonathan Toomim, also presented on the issues related to block propagation with bigger blocks at Scaling Bitcoin Hong Kong. His testing focused on the now-withdrawn BIP 101 proposal, and he concluded that an increase to 8 megabytes would not be appropriate. During his tests, he found it took anywhere from 15 to 150 seconds to send block data to a peer on the other side of the Great Firewall of China.
Like many other developers involved with Bitcoin Core, Schnelli views Segregated Witness (SegWit) as a viable alternative to simply increasing the block size limit. Bitcoin Core Contributor Eric Lombrozo recently outlined five benefits of the SegWit proposal at Blockchain Agenda San Diego.
Todd pointed to some past research to illustrate his point during his Scaling Bitcoin talk:
In a recent appearance at Bitcoin Meetup Switzerland, Bitcoin Core Contributor Jonas Schnelli covered at least one possible issue with raising the block size limit too quickly: the effect larger blocks have on wasted resources for miners.
The point here is that large miners have an advantage over small miners because of the time it takes for new blocks to propagate. Increasing the block size limit would lengthen propagation times across the network, thus increasing this advantage.
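To see why the advantage scales with propagation delay, note that a miner always learns about its own blocks instantly, so only blocks found by others expose it to stale work. In a rough model (the shares and delays below are illustrative assumptions, not figures from the article), a miner's expected wasted-work rate is (1 − own hashrate share) times the per-delay stale probability, so doubling the delay roughly doubles the gap between large and small miners:

```python
import math

AVG_BLOCK_INTERVAL = 600.0  # seconds; Bitcoin's 10-minute target

def wasted_work_rate(own_share: float, delay_seconds: float) -> float:
    """Expected fraction of work wasted on stale tips: only blocks
    found by *other* miners (probability 1 - own_share) arrive late."""
    stale_prob = 1.0 - math.exp(-delay_seconds / AVG_BLOCK_INTERVAL)
    return (1.0 - own_share) * stale_prob

for delay in (6, 12):  # e.g. propagation delay doubling with block size
    big = wasted_work_rate(0.30, delay)    # hypothetical 30% pool
    small = wasted_work_rate(0.01, delay)  # hypothetical 1% miner
    print(f"{delay}s delay: big pool wastes {big:.2%}, small miner {small:.2%}")
```

In this sketch the small miner always wastes more work than the large pool, and the absolute gap between them grows as the delay grows, which is the dynamic critics of larger blocks worry about.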