32.5 terawatt-hours a year: a dismal environmental footprint: the search for ...

BitTorrent Creator Bram Cohen: Bitcoin Miners are Butthurt Over SegWit - CoinJournal

BitTorrent Creator Bram Cohen: Bitcoin Miners are Butthurt Over SegWit - CoinJournal submitted by Ocarding to Bitcoin [link] [comments]

BitTorrent Creator Bram Cohen: Bitcoin Miners are Butthurt Over SegWit - CoinJournal

This is the best tl;dr I could make, original reduced by 70%. (I'm a bot)
During a conversation with Muneeb Ali of Blockstack at Blockstack Summit 2017, BitTorrent inventor Bram Cohen shared some of his thoughts on the current state of the Bitcoin development process.
Cohen's response illustrated a preference for backwards compatible changes, which generally come in the form of soft forks in Bitcoin.
In Cohen's view, the current issues with the development of the Bitcoin protocol are "pretty easy to identify."
Here, it appeared Cohen was referring to bitcoin miners not updating their software to activate SegWit until many months after it had been included in a version of Bitcoin Core.
In Cohen's view, SegWit was a remarkably good proposal from the contributors to Bitcoin Core.
In Cohen's view, the "vitriol" that is thrown at many of the contributors to Bitcoin Core is unwarranted because they've been doing normal development and "getting stuff done." He noted that the criticism of Bitcoin Core contributors seems to stem from those developers not wanting to implement bad ideas.
Post found in /Bitcoin, /btc, /BitcoinAll, /Crypto_Currency_News, /CryptoCurrency and /BitcoinMining.
submitted by autotldr to autotldr [link] [comments]

BitTorrent Inventor Bram Cohen Will Start His Own Cryptocurrency

BitTorrent Inventor Bram Cohen Will Start His Own Cryptocurrency submitted by gulfbitcoin to Bitcoin [link] [comments]

Increasing Decentralization Through Built-in Pooling?

I've read a number of discussions lately about the decentralization of Bitcoin and Monero, along with the danger of having a few large mining pools. This led me to research some of the various ways other protocols are thinking about this problem and I came across one that I found quite interesting.
"We're going to have a system where there are pools but they're only responsible for signing coinbase rewards and the individual miners in the pool are still in charge of making actual blocks, so you get the positive benefits of pooling (smoothing of rewards) without the negatives (pools having too much political power).
It's advantageous to use the built-in pooling protocol instead of hacking something with the keys because the built-in protocol allows [miners] to work even when the pool is offline." -Bram Cohen
This solution gives all transaction fees to the block creator but shares the coinbase reward with other members of the pool. These systems are built into the Chia Network protocol with "non-outsourceability" (I don't know what this is) in order to foster a multitude of pools.
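A minimal sketch of the reward split described above, with hypothetical names and illustrative numbers (the actual Chia protocol is unreleased, so this only shows the incentive structure, not any real API):

```python
# Hypothetical sketch of the pooling scheme described above: the block
# creator keeps all transaction fees, while the coinbase reward is
# shared among pool members. All values are illustrative.

COINBASE_REWARD = 2.0  # illustrative subsidy per block

def split_rewards(tx_fees, pool_members, creator):
    """Return each participant's payout for one block.

    tx_fees      -- total transaction fees in the block (go to creator)
    pool_members -- dict mapping member -> share of pool (sums to 1.0)
    creator      -- the member who actually made the block
    """
    payouts = {m: COINBASE_REWARD * share for m, share in pool_members.items()}
    payouts[creator] = payouts.get(creator, 0.0) + tx_fees
    return payouts

payouts = split_rewards(
    tx_fees=0.5,
    pool_members={"alice": 0.25, "bob": 0.75},
    creator="alice",
)
# alice gets her pool share plus all fees; bob gets only his pool share
```

The point of the split is that members still get reward smoothing from the pool, but because each member builds their own blocks, the pool operator never decides which transactions are included.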
All of this is theoretical right now since Chia hasn't released any code or a test net, but I was curious what the XMR community thought and whether this might be something worth investigating further?
submitted by EWBears to Monero [link] [comments]

Long live decentralized bitcoin(!) A reading list

Newbs might not know this, but bitcoin recently came out of an intense internal drama. Between July 2015 and August 2017 bitcoin was attacked by external forces who were hoping to destroy the very properties that made bitcoin valuable in the first place. This culminated in the creation of segwit and the UASF (user activated soft fork) movement. The UASF was successful: segwit was added to bitcoin, and with that the anti-decentralization side left bitcoin altogether and created their own altcoin called bcash. Bitcoin's price was $2500 at the time; soon after segwit was activated, the price doubled to $5000 and continued rising to a top of $20000 before correcting to where we are today.
During this drama, I took time away from writing open source code to help educate and argue on reddit, twitter and other social media. I came up with a reading list for quickly copypasting things. It may be interesting today for newbs, or for anyone who wants a history lesson on what exactly happened during those two years when bitcoin's very existence as a decentralized low-trust currency was questioned. Now that the fight has essentially been won, I try not to comment on reddit much anymore. There's nothing left to do except wait for Lightning and similar tech to become mature (or better yet, help code and test it).
In this thread you can learn about block sizes, latency, decentralization, segwit, ASICBOOST, lightning network and all the other issues that were debated endlessly for over two years. So when someone tries to get you to invest in bcash, remind them of the time they supported Bitcoin Unlimited.
For more threads like this see UASF

Summary / The fundamental tradeoff

A trip to the moon requires a rocket with multiple stages by gmaxwell (must read) https://www.reddit.com/Bitcoin/comments/438hx0/a_trip_to_the_moon_requires_a_rocket_with/
Bram Cohen, creator of bittorrent, argues against a hard fork to a larger block size https://medium.com/@bramcohen/bitcoin-s-ironic-crisis-32226a85e39f#.558vetum4
gmaxwell's summary of the debate https://bitcointalk.org/index.php?topic=1343716.msg13701818#msg13701818
Core devs please explain your vision (see luke's post which also argues that blocks are already too big) https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/
Mod of btc speaking against a hard fork https://www.reddit.com/btc/comments/57hd14/core_reaction_to_viabtc_this_week/d8scokm/
It's becoming clear to me that a lot of people don't understand how fragile bitcoin is https://www.reddit.com/Bitcoin/comments/59kflj/its_becoming_clear_to_me_that_a_lot_of_people/
Blockchain space must be costly, it can never be free https://www.reddit.com/Bitcoin/comments/4og24h/i_just_attended_the_distributed_trade_conference/
Charlie Lee with a nice analogy about the fundamental tradeoff https://medium.com/@SatoshiLite/eating-the-bitcoin-cake-fc2b4ebfb85e#.444vr8shw
gmaxwell on the tradeoffs https://bitcointalk.org/index.php?topic=1520693.msg15303746#msg15303746
jratcliff on the layering https://www.reddit.com/btc/comments/59upyh/segwit_the_poison_pill_for_bitcoin/d9bstuw/

Scaling on-chain will destroy bitcoin's decentralization

Peter Todd: How a floating blocksize limit inevitably leads towards centralization [Feb 2013] https://bitcointalk.org/index.php?topic=144895.0 mailing list https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2013-February/002176.html with discussion on reddit in Aug 2015 https://www.reddit.com/Bitcoin/comments/3hnvi8/just_a_little_history_lesson_for_everyone_new_the/
Nick Szabo's blog post on what makes bitcoin so special http://unenumerated.blogspot.com/2017/02/money-blockchains-and-social-scalability.html
There is academic research showing that even small (2MB) increases to the blocksize result in drastic node dropoff due to the non-linear increase in RAM needed. http://bravenewcoin.com/assets/Whitepapers/block-size-1.1.1.pdf
Reddit summary of the above link. In this table, you can see it estimates an immediate 40% drop in node count with a 2MB upgrade, and 50% over 6 months. At 4MB it becomes 75% immediately and 80% over 6 months; at 8MB, 90% and 95%. https://www.reddit.com/Bitcoin/comments/5qw2wa/a_future_led_by_bitcoin_unlimited_is_a/dd442pw/
Larger block sizes make centralization pressures worse (mathematical) https://petertodd.org/2016/block-publication-incentives-for-miners
Talk at scalingbitcoin montreal, initial blockchain synchronization puts serious constraints on any increase in the block size https://www.youtube.com/watch?v=TgjrS-BPWDQ&t=2h02m06s with transcript https://scalingbitcoin.org/transcript/montreal2015/block-synchronization-time
Bitcoin's P2P Network: The Soft Underbelly of Bitcoin https://www.youtube.com/watch?v=Y6kibPzbrIc someone's notes: https://gist.github.com/romyilano/5e22394857a39889a1e5 reddit discussion https://www.reddit.com/Bitcoin/comments/4py5df/so_f2pool_antpool_btcc_pool_are_actually_one_pool/
In adversarial environments blockchains dont scale https://scalingbitcoin.org/transcript/hongkong2015/in-adversarial-environments-blockchains-dont-scale
Why miners will not voluntarily individually produce smaller blocks https://scalingbitcoin.org/transcript/hongkong2015/why-miners-will-not-voluntarily-individually-produce-smaller-blocks
Hal Finney: bitcoin's blockchain can only be a settlement layer (mostly interesting because it's hal finney and its in 2010) https://www.reddit.com/Bitcoin/comments/3sb5nj/most_bitcoin_transactions_will_occur_between/
petertodd's 2013 video explaining this https://www.youtube.com/watch?v=cZp7UGgBR0I
luke-jr's summary https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/dficjhj/
Another jratcliff thread https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/

Full blocks are not a disaster

Blocks must be always full, there must always be a backlog https://medium.com/@bergealex4/bitcoin-is-unstable-without-the-block-size-size-limit-70db07070a54#.kh2vi86lr
Same as above, the mining gap means there must always be a backlog talk: https://www.youtube.com/watch?time_continue=2453&v=iKDC2DpzNbw transcript: https://scalingbitcoin.org/transcript/montreal2015/security-of-diminishing-block-subsidy
Backlogs arent that bad https://www.reddit.com/Bitcoin/comments/49p011/was_the_fee_event_really_so_bad_my_mind_is/
Examples where scarce block space causes people to use precious resources more efficiently https://www.reddit.com/Bitcoin/comments/4kxxvj/i_just_singlehandedly_increased_bitcoin_network/
Full blocks are fine https://www.reddit.com/Bitcoin/comments/5uld1a/misconception_full_blocks_mean_bitcoin_is_failing/
High miner fees imply a sustainable future for bitcoin https://www.reddit.com/BitcoinMarkets/comments/680tvf/fundamentals_friday_week_of_friday_april_28_2017/dgwmhl7/
gmaxwell on why full blocks are good https://www.reddit.com/Bitcoin/comments/6b57ca/full_blocks_good_or_bad/dhjxwbz/
The whole idea of the mempool being "filled" is wrong headed. The mempool doesn't "clog" or get stuck, or anything like that. https://www.reddit.com/Bitcoin/comments/7cusnx/to_the_people_still_doubting_that_this_congestion/dpssokf/


What is segwit

luke-jr's longer summary https://www.reddit.com/Bitcoin/comments/6033h7/today_is_exactly_4_months_since_the_segwit_voting/df3tgwg/?context=1
Charlie Shrem's on upgrading to segwit https://twitter.com/CharlieShrem/status/842711238853513220
Original segwit talk at scalingbitcoin hong kong + transcript https://youtu.be/zchzn7aPQjI?t=110
Segwit is not too complex https://www.reddit.com/btc/comments/57vjin/segwit_is_not_great/d8vos33/
Segwit does not make it possible for miners to steal coins, contrary to what some people say https://www.reddit.com/btc/comments/5e6bt0/concerns_with_segwit_and_anyone_can_spend/daa5jat/?context=1
Segwit is required for a useful lightning network: it's now known that without a malleability fix, useful indefinite channels are not really possible.
Clearing up SegWit Lies and Myths: https://achow101.com/2016/04/Segwit-FUD-Clearup
Segwit is bigger blocks https://www.reddit.com/Bitcoin/comments/5pb8vs/misinformation_is_working_54_incorrectly_believe/dcpz3en/
Typical usage results in segwit allowing capacity equivalent to 2mb blocks https://www.reddit.com/Bitcoin/comments/69i2md/observe_for_yourself_segwit_allows_2_mb_blocks_in/
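The "bigger blocks" claim follows from segwit's weight formula in BIP141: weight = 3 × base_size + total_size, capped at 4,000,000 weight units. A quick sketch (the witness-to-base ratio used below is an illustrative assumption, not a measured figure):

```python
MAX_BLOCK_WEIGHT = 4_000_000  # consensus cap from BIP141

def block_weight(base_size, total_size):
    """BIP141 weight: base_size is bytes without witness data,
    total_size is bytes including it."""
    return 3 * base_size + total_size

# A purely legacy 1,000,000-byte block (no witness data) has weight
# 3*1M + 1M = 4M, exactly hitting the cap -- old capacity is preserved.
assert block_weight(1_000_000, 1_000_000) == MAX_BLOCK_WEIGHT

# If witness data were about half of a block's bytes (illustrative),
# a block of total size S has base ~S/2 and weight 3*(S/2) + S = 2.5*S,
# so the cap allows S = 4,000,000 / 2.5 = 1,600,000 bytes (~1.6 MB);
# heavier witness usage (e.g. multisig) pushes this toward 2 MB.
assert block_weight(800_000, 1_600_000) == MAX_BLOCK_WEIGHT
```

This is why segwit is a capacity increase deployed as a soft fork: non-witness bytes are weighted four times heavier, so legacy blocks still fit the old limit while witness-heavy blocks can be larger.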

Why is segwit being blocked

Jihan Wu (head of largest bitcoin mining group) is blocking segwit because of perceived loss of income https://www.reddit.com/Bitcoin/comments/60mb9e/complete_high_quality_translation_of_jihans/ or because he wants his mining enterprise to have control over bitcoin https://www.reddit.com/Bitcoin/comments/6jdyk8/direct_report_of_jihan_wus_real_reason_fo
Witness discount creates aligned incentives https://segwit.org/why-a-discount-factor-of-4-why-not-2-or-8-bbcebe91721e#.h36odthq0 https://medium.com/@SegWit.co/what-is-behind-the-segwit-discount-988f29dc1edf#.sr91dg406

Segwit is being blocked because it breaks ASICBOOST, a patented optimization used by bitmain ASIC manufacturer

Details and discovery by gmaxwell https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-April/013996.html
Reddit thread with discussion https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/
Simplified explanation by jonny1000 https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/
Evidence https://www.reddit.com/Bitcoin/comments/63yo27/some_circumstantial_evidence_supporting_the_claim/
Bitmain admits their chips have asicboost but they say they never used it on the network (haha a likely story) https://blog.bitmain.com/en/regarding-recent-allegations-smear-campaigns/
Worth $100m per year to them (also in gmaxwell's original email) https://twitter.com/petertoddbtc/status/849798529929424898
Other calculations show less https://medium.com/@vcorem/the-real-savings-from-asicboost-to-bitmaintech-ff265c2d305b
This also blocks all these other cool updates, not just segwit https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/dfw0ej3/
Summary of bad consequences of asicboost https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/dg4hyqk/?context=1
Luke's summary of the entire situation https://www.reddit.com/Bitcoin/comments/6ego3s/why_is_killing_asicboost_not_a_priority/diagkkb/?context=1
Price goes up because now segwit looks more likely https://twitter.com/TuurDemeester/status/849846845425799168
Asicboost discovery made the price rise https://twitter.com/TuurDemeester/status/851520094677200901
A pool was caught red-handed doing asicboost; by this time it seemed fairly certain that segwit would get activated, so it didn't produce as much interest as earlier https://www.reddit.com/Bitcoin/comments/6p7lr5/1hash_pool_has_mined_2_invalid_blocks/ and https://www.reddit.com/Bitcoin/comments/6p95dl/interesting_1hash_pool_mined_some_invalid_blocks/ and https://twitter.com/petertoddbtc/status/889475196322811904
This btc user is outraged at the entire forum because they support Bitmain and ASICBOOST https://www.reddit.com/btc/comments/67t43y/dragons_den_planned_smear_campaign_of_bitmain/dgtg9l2/
Antbleed, turns out Bitmain can shut down all its ASICs by remote control: http://www.antbleed.com/

What if segwit never activates

What if segwit never activates? https://www.reddit.com/Bitcoin/comments/6ab8js/transaction_fees_are_now_making_btc_like_the_banks/dhdq3id/ with https://www.reddit.com/Bitcoin/comments/5ksu3o/blinded_bearer_certificates/ and https://www.reddit.com/Bitcoin/comments/4xy0fm/scaling_quickly/

The lightning network

bitcoinmagazine's series on what lightning is and how it works https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-building-a-bidirectional-payment-channel-1464710791/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-creating-the-network-1465326903/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-completing-the-puzzle-and-closing-the-channel-1466178980/
The Lightning Network ELIDHDICACS (Explain Like I Don’t Have Degrees in Cryptography and Computer Science) https://letstalkbitcoin.com/blog/post/the-lightning-network-elidhdicacs
Lightning will increase fees for miners, not lower them https://medium.com/lightning-resources/the-lightning-paradox-f15ce0e8e374#.erfgunumh
Cost-benefit analysis of lightning from the point of view of miners https://medium.com/@rusty_lightning/miners-and-bitcoin-lightning-a133cd550310#.x42rovlg8
Routing blog post by rusty https://medium.com/@rusty_lightning/routing-dijkstra-bellman-ford-and-bfg-7715840f004 and reddit comments https://www.reddit.com/Bitcoin/comments/4lzkz1/rusty_russell_on_lightning_routing_routing/
Lightning protocol rfc https://github.com/lightningnetwork/lightning-rfc
Blog post with screenshots of ln being used on testnet https://medium.com/@btc_coach/lightning-network-in-action-b18a035c955d video https://www.youtube.com/watch?v=mxGiMu4V7ns
Video of sending and receiving ln on testnet https://twitter.com/alexbosworth/status/844030573131706368
Lightning tradeoffs http://www.coindesk.com/lightning-technical-challenges-bitcoin-scalability/
Beer sold for testnet lightning https://www.reddit.com/Bitcoin/comments/62uw23/lightning_network_is_working_room77_is_accepting/ and https://twitter.com/MrHodl/status/848265171269283845
Lightning will result in far fewer coins being stored on third parties because it supports instant transactions https://medium.com/@thecryptoconomy/the-barely-discussed-incredible-benefit-of-the-lightning-network-4ce82c75eb58
jgarzik argues strongly against LN, he owns a coin tracking startup https://twitter.com/petertoddbtc/status/860826532650123264 https://twitter.com/Beautyon_/status/886128801926795264
luke's great debunking / answer of some misinformation questions https://www.reddit.com/Bitcoin/comments/6st4eq/questions_about_lightning_network/dlfap0u/
Lightning centralization doesnt happen https://www.reddit.com/Bitcoin/comments/6vzau5/reminder_bitcoins_key_strength_is_in_being/dm4ou3v/?context=1
roasbeef on hubs and charging fees https://twitter.com/roasbeef/status/930209165728825344 and https://twitter.com/roasbeef/status/930210145790976000

Immutability / Being a swiss bank in your pocket / Why doing a hard fork (especially without consensus) is damaging

A downside of hard forks is damaging bitcoin's immutability https://www.reddit.com/Bitcoin/comments/5em6vu/what_happens_if_segwit_doesnt_activate/dae1r6c/?context=3
Interesting analysis of miners incentives and how failure is possible, don't trust the miners for long term https://www.reddit.com/Bitcoin/comments/5gtew4/why_an_increased_block_size_increases_the_cost_of/daybazj/?context=2
waxwing on the meaning of cash and settlement https://www.reddit.com/Bitcoin/comments/5ei7m3/unconfirmed_transactions_60k_total_fees_14btc/dad001v/
maaku on the cash question https://www.reddit.com/Bitcoin/comments/5i5iq5/we_are_spoiled/db5luiv/?context=1
Digital gold fundamentalists gain nothing from supporting a hard fork to larger block sizes https://www.reddit.com/Bitcoin/comments/5xzunq/core_please_compromise_before_we_end_up_with_bu/dem73xg/?context=1
Those asking for a compromise don't understand the underlying political forces https://www.reddit.com/Bitcoin/comments/6ef7wb/some_comments_on_the_bip148_uasf_from_the/dia236b/?context=3
Nobody actually wants a contentious hard fork; anti-core people got emotionally manipulated https://www.reddit.com/Bitcoin/comments/5sq5o/contentious_forks_vs_incremental_progress/ddip57o/
The hard work of the core developers has kept bitcoin scalable https://www.reddit.com/Bitcoin/comments/3hfgpo/an_initiative_to_bring_advanced_privacy_features/cu7mhw8?context=9
Recent PRs to improve bitcoin scalability, ignored in the debate https://twitter.com/jfnewbery/status/883001356168167425
gmaxwell against hard forks since 2013 https://bitcointalk.org/index.php?topic=140233.20
maaku: hard forks are really bad https://www.reddit.com/Bitcoin/comments/5zxjza/adam_greg_core_devs_and_big_blockers_now_is_the/df275yk/?context=2

Some metrics on what the market thinks of decentralization and hostile hard forks

The price history shows that the exchange rate drops every time a hard fork threatens: https://i.imgur.com/EVPYLR8.jpg
and this example from 2017 https://twitter.com/WhalePanda/status/845562763820912642
http://imgur.com/a/DuHAn btc users lose money
price supporting theymos' moderation https://i.imgur.com/0jZdF9h.png
old version https://i.imgur.com/BFTxTJl.png
older version https://pbs.twimg.com/media/CxqtUakUQAEmC0d.jpg
about 50% of nodes updated to the soft fork quite quickly https://imgur.com/O0xboVI

Bitcoin Unlimited / Emergent Consensus is badly designed, changes the game theory of bitcoin

Bitcoin Unlimited was a proposed hard fork client, made with the intention of stopping segwit from activating
A Future Led by Bitcoin Unlimited is a Centralized Future https://blog.sia.tech/a-future-led-by-bitcoin-unlimited-is-a-centralized-future-e48ab52c817a#.p1ly6hldk
Flexible transactions are bugged https://www.reddit.com/Bitcoin/comments/57tf5g/bitcoindev_bluematt_on_flexible_transactions/
Bugged BU software mines an invalid block, wasting 13 bitcoins or $12k
bitcoin.com employees are moderators of btc https://medium.com/@WhalePanda/the-curious-relation-between-bitcoin-com-anti-segwit-propaganda-26c877249976#.vl02566k4
miners don't control stuff like the block size http://hackingdistributed.com/2016/01/03/time-for-bitcoin-user-voice/
even gavin agreed that economic majority controls things https://www.reddit.com/Bitcoin/comments/5ywoi9/in_2010_gavin_predicted_that_exchanges_ie_the/
fork clients are trying to steal bitcoin's brand and network effect; they're no different from altcoins https://medium.com/@Coinosphere/why-bitcoin-unlimited-should-be-correctly-classified-as-an-attempted-robbery-of-bitcoin-not-a-9355d075763c#.qeaynlx5m
BU being active makes it easier to reverse payments, increases wasted work making the network less secure and giving an advantage to bigger miners https://www.reddit.com/Bitcoin/comments/5g1x84/bitcoin_unlimited_bu_median_value_of_miner_eb/
bitcoin unlimited takes power away from users and gives it to miners https://medium.com/@alpalpalp/bitcoin-unlimiteds-placebo-controls-6320cbc137d4#.q0dv15gd5
bitcoin unlimited's accepted depth https://twitter.com/tdryja/status/804770009272696832
BU's lying propaganda poster https://imgur.com/osrViDE

BU is bugged, poorly-reviewed and crashes

bitcoin unlimited allegedly funded by coins stolen from Kraken
Other funding stuff
A serious bug in BU https://www.reddit.com/Bitcoin/comments/5h70s3/bitcoin_unlimited_bu_the_developers_have_realized/
A summary of what's wrong with BU: https://www.reddit.com/Bitcoin/comments/5z3wg2/jihanwu_we_will_switch_the_entire_pool_to/devak98/

Bitcoin Unlimited Remote Exploit Crash 14/3/2017

https://www.reddit.com/Bitcoin/comments/5zdkv3/bitcoin_unlimited_remote_exploit_crash/ https://www.reddit.com/Bitcoin/comments/5zeb76/timbe https://www.reddit.com/btc/comments/5zdrru/peter_todd_bu_remote_crash_dos_wtf_bug_assert0_in/
BU devs calling it a disaster https://twitter.com/SooMartindale/status/841758265188966401 also btc deleted a thread about the exploit https://i.imgur.com/lVvFRqN.png
Summary of incident https://www.reddit.com/Bitcoin/comments/5zf97j/i_was_undecided_now_im_not/
More than 20 exchanges will list BTU as an altcoin
Again a few days later https://www.reddit.com/Bitcoin/comments/60qmkt/bu_is_taking_another_shit_timberrrrr

User Activated Soft Fork (UASF)

site for it, including list of businesses supporting it http://www.uasf.co/
luke's view
threat of UASF makes the miner fall into line in litecoin
UASF delivers the goods for vertcoin
UASF coin is more valuable https://www.reddit.com/Bitcoin/comments/6cgv44/a_uasf_chain_will_be_profoundly_more_valuable/
All the links together in one place https://www.reddit.com/Bitcoin/comments/6dzpew/hi_its_mkwia_again_maintainer_of_uasfbitcoin_on/
p2sh was a uasf https://github.com/bitcoin/bitcoin/blob/v0.6.0/src/main.cpp#L1281-L1283
jgarzik annoyed at the strict timeline that segwit2x has to follow because of bip148 https://twitter.com/jgarzik/status/886605836902162432
Committed intolerant minority https://www.reddit.com/Bitcoin/comments/6d7dyt/a_plea_for_rational_intolerance_extremism_and/
alp on the game theory of the intolerant minority https://medium.com/@alpalpalp/user-activated-soft-forks-and-the-intolerant-minority-a54e57869f57
The risk of UASF is less than the cost of doing nothing https://www.reddit.com/Bitcoin/comments/6bof7a/were_getting_to_the_point_where_a_the_cost_of_not/
uasf delivered the goods for bitcoin: it forced antpool and others to signal (May 2016) https://bitcoinmagazine.com/articles/antpool-will-not-run-segwit-without-block-size-increase-hard-fork-1464028753/ "When asked specifically whether Antpool would run SegWit code without a hard fork increase in the block size also included in a release of Bitcoin Core, Wu responded: “No. It is acceptable that the hard fork code is not activated, but it needs to be included in a ‘release’ of Bitcoin Core. I have made it clear about the definition of ‘release,’ which is not ‘public.’”"
Screenshot of peter rizun capitulating https://twitter.com/chris_belcher_/status/905231603991007232
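The P2SH precedent in the list above worked as a flag-day soft fork: after a hardcoded switchover time, upgraded nodes simply started enforcing the stricter rule. A rough sketch of that pattern (heavily simplified from the linked main.cpp; the rule check here is a stand-in for the real script validation):

```python
# Flag-day soft-fork activation, as used for P2SH (BIP16): after a
# hardcoded switchover time, upgraded nodes enforce an extra rule.
# The timestamp matches BIP16's April 1 2012 activation; the validity
# flag below is an illustrative stand-in for real script checks.

BIP16_SWITCH_TIME = 1333238400  # 2012-04-01 00:00 UTC

def enforce_p2sh(block_time):
    """Return True if a node should apply the stricter P2SH rules."""
    return block_time >= BIP16_SWITCH_TIME

def block_valid(block_time, passes_p2sh_rules):
    # Before the flag day the extra rule is simply not checked, so old
    # and new nodes agree; after it, upgraded nodes reject violating
    # blocks -- the essence of a (user-activated) soft fork.
    if enforce_p2sh(block_time):
        return passes_p2sh_rules
    return True

assert block_valid(1330000000, passes_p2sh_rules=False)      # pre-flag-day
assert not block_valid(1340000000, passes_p2sh_rules=False)  # post-flag-day
```

BIP148 followed the same logic: enforcement by economic nodes from a fixed date, regardless of miner signalling.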

Fighting off 2x HF

b2x is most of all about firing core https://twitter.com/WhalePanda/status/912664487135760384

Misinformation / sockpuppets

three year old account, only started posting today https://archive.is/3STjH
Why we should not hard fork after the UASF worked: https://www.reddit.com/Bitcoin/comments/6sl1qf/heres_why_we_should_not_hard_fork_in_a_few_months/

History

Good article that covers virtually all the important history https://bitcoinmagazine.com/articles/long-road-segwit-how-bitcoins-biggest-protocol-upgrade-became-reality/
Interesting post with some history pre-2015 https://btcmanager.com/the-long-history-of-the-fight-over-scaling-bitcoin/
The core scalability roadmap https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-December/011865.html + my summary from 3/2017 https://www.reddit.com/Bitcoin/comments/5xa5fa/the_core_development_scalability_roadmap/
History from summer 2015 https://www.reddit.com/Bitcoin/comments/5xg7f8/the_origins_of_the_blocksize_debate/
Brief reminders of the ETC situation https://www.reddit.com/Bitcoin/comments/6nvlgo/simple_breakdown_of_bip91_its_simply_the_miners/dkcycrz/
Longer writeup of ethereum's TheDAO bailout fraud https://www.reddit.com/ethereumfraud/comments/6bgvqv/faq_what_exactly_is_the_fraud_in_ethereum/
Point that the bigblocker side is only blocking segwit as a hostage https://www.reddit.com/BitcoinMarkets/comments/5sqhcq/daily_discussion_wednesday_february_08_2017/ddi3ctv/?context=3
jonny1000's recall of the history of bitcoin https://www.reddit.com/Bitcoin/comments/6s34gg/rbtc_spreading_misinformation_in_rbitcoinmarkets/dl9wkfx/

Misc (mostly memes)

libbitcoin's Understanding Bitcoin series (another must read, most of it) https://github.com/libbitcoin/libbitcoin/wiki/Understanding-Bitcoin
github commit where satoshi added the block size limit https://www.reddit.com/Bitcoin/comments/63859l/github_commit_where_satoshi_added_the_block_size/
hard fork proposals from some core devs https://bitcoinhardforkresearch.github.io/
blockstream hasnt taken over the entire bitcoin core project https://www.reddit.com/Bitcoin/comments/622bjp/bitcoin_core_blockstream/
blockstream is one of the good guys https://www.reddit.com/Bitcoin/comments/6cttkh/its_happening_blockstream_opens_liquid_sidechain/dhxu4e
Forkers, we're not raising a single byte! Song lyrics by belcher https://gist.github.com/chris-belcher/7264cd6750a86f8b4a9a
Some stuff here along with that cool photoshopped poster https://medium.com/@jimmysong/bitcoin-realism-or-how-i-learned-to-stop-worrying-and-love-1mb-blocks-c191c35e74cb
Nice graphic https://twitter.com/RNR_0/status/871070843698380800
gmaxwell saying how he is probably responsible for the most privacy tech in bitcoin, while mike hearn screwed up privacy https://www.reddit.com/btc/comments/6azyme/hey_bu_wheres_your_testnet/dhiq3xo/?context=6
Fairly cool propaganda poster https://twitter.com/urbanarson/status/880476631583924225
btc tankman https://i.redd.it/gxjqenzpr27z.png https://twitter.com/DanDarkPill/status/853653168151986177
asicboost discovery meme https://twitter.com/allenscottoshi/status/849888189124947971
gavin wanted to kill the bitcoin chain https://twitter.com/allenscottoshi/status/849888189124947971
stuff that btc believes https://www.reddit.com/Bitcoin/comments/6ld4a5/serious_is_the_rbtc_and_the_bu_crowd_a_joke_how/djszsqu/
after segwit2x NYA got agreed all the fee pressure disappeared, laurenmt found they were artificial spam https://twitter.com/i/moments/885827802775396352
theymos saying why victory isnt inevitable https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/djvxv2o/
with ignorant enemies like these it's no wonder we won https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-999 "So, once segwit2x activates, from that moment on it will require a coordinated fork to avoid the upcoming 'baked in' HF."
a positive effect of bcash, it made blockchain utxo spammers move away from bitcoin https://www.reddit.com/btc/comments/76lv0b/cryptograffitiinfo_now_accepts_bitcoin_cash/dof38gw/
summary of craig wright, jihan wu and roger ver's positions https://medium.com/@HjalmarPeters/the-big-blockers-bead6027deb2
Why is bitcoin so strong against attack?!?! (because we're motivated and awesome) https://www.reddit.com/btc/comments/64wo1h/bitcoin_unlimited_is_being_blocked_by_antivirus/dg5n00x/
what happened to #oldjeffgarzik https://www.reddit.com/Bitcoin/comments/6ufv5x/a_reminder_of_some_of_jeff_garziks_greatest/
big blockers fully deserve to lose every last bitcoin they ever had and more https://www.reddit.com/BitcoinMarkets/comments/756nxf/daily_discussion_monday_october_09_2017/do5ihqi/
gavinandresen brainstorming how to kill bitcoin with a 51% in a nasty way https://twitter.com/btcdrak/status/843914877542567937
Roger Ver as bitcoin Judas https://imgur.com/a/Rf1Pi
A bunch of tweets and memes celebrating UASF
https://twitter.com/shaolinfry/status/842457019286188032 | https://twitter.com/SatoshiLite/status/888335092560441345 | https://twitter.com/btcArtGallery/status/887485162925285377 | https://twitter.com/Beautyon_/status/888109901611802624 | https://twitter.com/Excellion/status/889211512966873088 | https://twitter.com/lopp/status/888200452197801984 | https://twitter.com/AlpacaSW/status/886988980524396544 | https://twitter.com/BashCo_/status/877253729531162624 | https://twitter.com/tdryja/status/865212300361379840 | https://twitter.com/Excellion/status/871179040157179904 | https://twitter.com/TraceMayer/status/849856343074902016 | https://twitter.com/TraceMayer/status/841855022640033792 | https://fs.bitcoinmagazine.com/img/images/Screen_Shot_2017-08-18_at_01.36.47.original.png
submitted by belcher_ to Bitcoin [link] [comments]

Is it possible to objectively discuss things without getting dragged to the "Dragons Den"?

The computer science debates about the best ways to scale Bitcoin are far too important for us to take “sides”.
“Beating” the other guy doesn’t serve any of us, and it doesn’t serve Bitcoin.
In these discussions we need to be as sure as possible that we are not just winning an argument, or making talking points but that we are factually correct.
One thing everyone should agree on is the need for truthful and factual statements when discussing these important issues.
Unfortunately, the Bitcoin community learned some very bad habits from the world of politics.
[Disclaimer: I do not work for Blockstream and I never have, just as I did not work for Roger Ver the other day when I defended him against a similar baseless attack about Mt. Gox https://np.reddit.com/Bitcoin/comments/6846wo/message_to_rogedgvnrqk/ ]
For example, it is difficult to have any discussion in Bitcoin without someone bringing up the “Dragon’s Den”.
"Dragon’s Den" is brought up as an argument point often by everyone from major miners to CEOs and analysts.
But what do we really know about “Dragon’s Den”?
What is the full body of evidence regarding the claims about Dragons Den?
Well, it’s pretty simple:
1) One lone LN developer claimed that there was a secret channel used for trolling
2) Someone posted a screenshot showing the existence of the slack channel and some people chatting in it
What is absent from this evidence:
So really, we have CEOs, miners and thousands of people who care passionately about Bitcoin using “Dragon’s Den” as a talking point, when the entire thesis about Dragon's Den comes down to “a guy said”.
One doesn’t even have to be critical of the developer who claimed its existence to have doubts about the story.
If you look carefully, Joseph Poon never even claimed to have first-hand knowledge of Dragon's Den being used for trolling; he never even mentioned visiting the channel.
One guy saying something doesn’t make it true.
One doesn't even have to think that Joseph is a bad actor (and I don't). Maybe he was misinformed, didn't think clearly before making his statement, had a misunderstanding, or exaggerated.
There was a Reddit post asking him to clarify but it doesn't seem he did.
So what is Dragon’s Den really?
Honestly my best educated guess is that the slack channel simply existed as a place for like-minded individuals to discuss topics and opinions they share an interest in.
After it was revealed to the public, the channel was opened to a number of interested/concerned community members, including me, to review. This was the first time I knew about or entered the channel. What I saw was pretty similar to the regular Slack. Bias against BU? Of course. Bias in favor of the Core roadmap? Sure. An organized trolling campaign? Doubtful.
My guess is that the channel was similar before it became public.
Now it is possible that, as part of an elaborate ruse, participants in the channel have gone and created a new double secret channel where the real trolling is being organized and are still participating in the existing “Dragon’s Den” channel as some sort of theater to throw people off their trail and fool someone like me into posting this.
Maybe, maybe, maybe
But I doubt it. Occam's Razor applies here: the most likely explanation is that it is simply like-minded people gathering together.
Just as the most likely explanation for a lot of behavior in Bitcoin is the simple one.
I don't think Gavin is in the CIA or Roger is secretly trying to harm Bitcoin or Blockstream is involved in some conspiracy to destroy Bitcoin with AXA and I don't think Bitmain purposely intended to shut down miners (but the risk was still real) and I don't think Greg Maxwell works for the CIA either.
People have human faults, they mess up, they are self interested etc. Occasionally they will be behind some elaborate ruse, but usually behavior comes down to just people being people.
What about the trolling?
Again, absent any evidence, I think the simplest explanation is that like-minded people hanging around in the channel are likely to react in similar ways to various news and tweets.
It’s most likely like “Hey, did you see this ridiculous argument John Doe just tweeted?" and then a number of like-minded people comment on that post. Could this be considered organized trolling? Maybe, but only in the same way that posting a tweet to one Reddit sub or the other will cause a dozen people to comment; by that standard, that is organized trolling too.
Having been attacked, criticized and trolled by many of the same people who are very active in that channel, I know how it feels: even half a dozen people can easily make you feel overwhelmed by attackers. Between misconceptions, logical fallacies, name-calling, retweets, reposts and comments from multiple people, you can feel like Boromir in the first Lord of the Rings movie, with arrow after arrow shot at you.
But really – this is just part of the way the Internet works.
I once even found myself banned from the entire Slack in question. I annoyed people with my continual calls for compromise as well as occasionally defending people from exactly the type of unfair or inaccurate attack I’m talking about here.
Fortunately, cooler heads prevailed, I put my differences aside with the people in the channel, and since then I have had a lot of productive and mutually respectful discussions about this important technology.
But core runs this!
This is the thing, they really don’t. I tried to get this point across in a recent blog post.
Down With Bitcoin Core (as a noun used to describe people)
I know it’s frustrating, and I used to feel the same way as many BU supporters in thinking that “core” was one monolithic, like-minded entity. **It’s really not. It's just not factual to look at it this way.**
Only after spending a lot of time meeting with and discussing issues with numerous Core and non-Core developers did I come to this opinion.
I now think that it is only fair for us to judge individual people on the individual actions that they take. Not actions taken by a group.
Worse yet: some Core developers even use this word “core” as a noun to describe people when it suits them. This is equally wrong. Individual people, and only individual people, should be held accountable for the actions they actually take and the words they actually say.
If we do that, it becomes much harder to make unfair blanket arguments.
Groups are easy to hate. It’s very easy to ascribe opinions or characteristics to “Republicans” or “liberals” or "core" or "btc". If you are running for office, it’s a great idea to use these types of terms: they divide and drive wedges.
The world of politics is not that simple: on major issues ranging from the drug war to war, foreign policy and taxes there are vast differences between people who carry the various political labels.
An open source project, particularly one as diverse as Bitcoin, has much more nuance than this even.
There is definitely no universal “core” opinion.
“Core developers” include a wide variety of people: Gavin Andresen, Satoshi Nakamoto, Greg Maxwell, Matt Corallo, Alex Morcos, Peter Todd, Wladimir van der Laan and many others.
It just isn’t accurate or fair to put all people in the same category.
What happens if we do start holding individual people accountable for their actions?
Well, for one thing, it makes it a lot harder to argue against broad ideas such as “core believes X” or “core failed to X”.
If someone claims that the “Dragon’s Den” is some sort of effort by “core”, then the first question should be: who specifically do you mean by “core”? As far as I can tell, the only contributor to the Core software project who was active in that particular channel is the moderator.
In fact, it does not seem likely that more than a couple of other actual Core developers ever even visited the channel, let alone used it or engaged in any sort of organized trolling from it. How many Core devs in total, on the high end, even visited? Five?
So instead of saying "Dragons Den is a core project to organize trolling" why don't we say "these five people organize trolling"? Well, because it's too damn hard. It's EASY to blame some nameless faceless group, but when you name specific people you usually have to back it up better, so those five people accused would (rightfully) respond "Huh? What evidence do you have of this?"
Bram Cohen was caught in this exact situation. Just by being in this now-infamous chat channel, it was assumed he was a participant in trolling. Why? Because a guy said that the channel exists for trolling.
If someone has visions of Wladimir or Greg Maxwell sitting around this particular Slack channel planning troll campaigns, the facts and evidence simply don’t show this to be reflective of reality.
So what should we do next?
This argument isn’t going to be solved by one post.
But what we can all do is work together to discuss things in as fair and accurate a way as possible.
This means holding actual people responsible for actual actions they do. If you don't like the core roadmap, debate it with Nullc, if you don't agree with the people who signed it, take it up with them. If you don't think the peer review process is fair and objective, contribute technically to the discussion. If you don't like the "Dragons Den" then take it up with specific people. What this really ends up looking like is that instead of being able to say "Dragons Den is a massive troll army run by core" it ends up more like "BTCDrak and MrHodl and Alp are in a channel and I don't like what they tweeted". That second statement doesn't have the bite of the first....but it's true.
Working for better standards also means we should really avoid claims about either "side" unless they are backed by evidence and relevant to the discussion.
In fact, we shouldn’t even be having “sides” at all.
We are here to change the world. No one likes the bickering. Many don't participate...but almost all of us support it at some point. Whether it's up-voting a divisive post, sharing a meme attacking the other guys or using terms designed to place people in camps, we contribute even when we don't mean to.
What would happen if we all decided to no longer participate in division? What if we down voted every comment that attacks individuals or which is divisive and we upvoted everything positive?
What if we kept scientific debate more scientific?
Every piece of data and information in this discussion should be analyzed independently: independent of the source, and independent of what our own motivations or narrative might be.
Any time we use terms designed to “beat” the other side, we breed division, and nobody wins. We win, and Bitcoin wins, when we all work together to be as accurate and fair as possible.
submitted by bruce_fenton to btc [link] [comments]

Creator of Bittorrent Thinks He Can Kill Bitcoin With Chia, a Burstcoin (BURST) Copycat

Creator of Bittorrent Thinks He Can Kill Bitcoin With Chia, a Burstcoin (BURST) Copycat

Bram Cohen, the creator of the popular peer-to-peer file-sharing software BitTorrent, is creating a cryptocurrency called Chia. This new cryptocurrency will use Proof of Capacity (PoC), an algorithm perfected by Burstcoin (BURST).
Breaker incorrectly reports that PoC, which Cohen is calling Proof of Space, is a new algorithm developed by Cohen. Further, Breaker writes that Chia could kill Bitcoin. It is unlikely that Chia will even be competitive with Burstcoin (BURST), which has a $9 million market cap, let alone Bitcoin, as we’ll now explain.
The crypto space is unfortunately filled with copycat devs who make copycat cryptocurrencies, and this Chia situation appears to be a good example. Burstcoin (BURST) has a strong community of Cypherpunks and has been using PoC since 2014. If Cohen truly cared about the adoption of PoC crypto, he should have jumped on the Burstcoin (BURST) boat instead of trying to create a different cryptocurrency that would compete with it.
Chia is advertised as being a “green cryptocurrency,” similar to how stores often have an organic foods section. Green goods and services are big money in the current economy because slapping a green brand on something means environmental and health enthusiasts automatically flock to it. The greenness of Chia is entirely derived from PoC, which Burstcoin (BURST) has already been doing since 2014. Perhaps the Burstcoin (BURST) community should one-up Cohen and do a green advertising campaign before Chia launches since the method of branding is the only advantage that Chia has.
Indeed, PoC is environmentally friendly and uses practically no electricity, which also makes Burstcoin (BURST) one of the only cryptocurrencies that is profitable to mine on personal computers. PoC cryptocurrencies read cryptographic hashes from a plot file on a hard drive, rather than constantly calculating new cryptographic hashes as in Proof of Work (PoW). The miner who finds the best answer on their hard drive the quickest gets the block reward. Since more hard drive space means more stored answers, more hard drive space leads to more block rewards. The Burstcoin (BURST) mining network has 230,000 TB (230 PB) of hard drive space at this time, driven by a strong community. It seems unlikely Chia would ever exceed that number, since people interested in PoC already mine Burstcoin (BURST) and would not abandon it for a copycat.
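The plot-then-read mechanic described above can be sketched in a few lines. This is a toy illustration of the general PoC idea only; the function names, hash construction and "deadline" formula are my own stand-ins, not Burstcoin's actual plot format:

```python
import hashlib

def plot(seed: bytes, num_nonces: int) -> dict:
    """'Plot' once: precompute one hash per nonce and store it.
    A dict stands in here for the plot file on a hard drive."""
    return {n: hashlib.sha256(seed + n.to_bytes(8, "big")).digest()
            for n in range(num_nonces)}

def best_deadline(plot_file: dict, challenge: bytes):
    """Mine by *reading* the plot, not re-hashing it: combine each stored
    hash with the current block challenge and keep the smallest result
    (the 'deadline'). More stored nonces mean more chances at a small one."""
    def deadline(n):
        h = hashlib.sha256(challenge + plot_file[n]).digest()
        return int.from_bytes(h[:8], "big")
    best = min(plot_file, key=deadline)
    return best, deadline(best)

challenge = hashlib.sha256(b"previous block").digest()
small = plot(b"miner", 100)      # small farm
large = plot(b"miner", 10_000)   # same seed, 100x the "disk space"
_, d_small = best_deadline(small, challenge)
_, d_large = best_deadline(large, challenge)
print(d_large <= d_small)  # True: a superset of plotted nonces can only do as well or better
```

The expensive hashing happens once, at plotting time; mining afterwards is essentially disk reads, which is where the "green" claim comes from.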
Cohen incorrectly argues that miners would not continuously expand their mining operations with PoC.
“The idea is that you’re leveraging this resource of storage capacity, and people already have ludicrous amounts of excess storage on their laptops, and other places, which is just not being utilized,” he said. “There is so much of that already that it should eventually reach the point where if you were buying new hard drives for the purpose of farming, it would lose you money”.
The fact is that serious Burstcoin (BURST) miners regularly buy petabytes of new hard drive space to maximize revenue, and Chia miners would act no differently if Chia catches on.
The one possible difference between Burstcoin (BURST) and Chia is that Cohen is trying to prevent a “re-mining from genesis” attack, where a miner could create an entirely new chain starting at the genesis block. If they had enough hard drive power to do this, perhaps they would fork the blockchain. To avoid this attack Cohen says Chia will also integrate “Proof of Time” (PoT).
First off, Burstcoin (BURST) has never had issues with this sort of attack in its five years of existence. If someone has a tremendous amount of hard drive space and does PoC mining, it would be absolutely senseless to do this sort of attack since their entire mining farm would become worthless.
If the point of the attack was to do a double spend, the coins gained in the double spend would become worthless too. Cohen is trying to prevent an attack that is extremely unlikely since attackers have no incentive to do it.
The details about PoT are vague, with it being a parallel process to mining that takes the same amount of time no matter how much hard drive space is used. This would make Chia less efficient than Burstcoin (BURST) and therefore less competitive. Cohen is offering $100,000 to anyone who can develop PoT, making it clear it does not exist yet. Since PoT would probably make Chia less efficient, and it has not been developed yet, and it solves a non-existent problem, the PoT acronym is appropriate.
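For intuition about what a "proof of time" would have to provide, here is a toy sketch (my own illustration; the source gives no details of Cohen's design). Iterated hashing is inherently sequential, so elapsed time depends on the step count, not on how much disk space or how many parallel machines you own:

```python
import hashlib

def sequential_delay(seed: bytes, steps: int) -> bytes:
    """Iterated hashing: each step needs the previous step's output, so the
    work cannot be spread across machines. Total wall-clock time grows with
    `steps` regardless of hardware quantity."""
    h = seed
    for _ in range(steps):
        h = hashlib.sha256(h).digest()
    return h

# A re-miner starting again from genesis would have to redo the delay for
# every block in sequence: rewriting N blocks costs N full delays of
# wall-clock time, not just cheap disk reads.
checkpoint = sequential_delay(b"block header", 100_000)
assert sequential_delay(b"block header", 100_000) == checkpoint  # deterministic
```

Note that a usable construction (a verifiable delay function) must also be fast to *verify*, which plain iterated hashing is not; presumably that gap is what the $100,000 bounty is for.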
Ultimately, Chia is branded as green money for a digital world, when the reality is that Burstcoin (BURST) already is green money for a digital world. Burstcoin (BURST) has a strong reputation, has been running continuously for five years, and has an extremely strong community. It seems unlikely Chia will ever become more popular than Burstcoin (BURST), and, contrary to the title of the Breaker article, Chia certainly has no chance of killing Bitcoin.
submitted by turtlecane to burstcoin [link] [comments]

POWA - With hybrid PoW you could have 2 algos: an ASIC-friendly one that counters botnets and a CPU-friendly one that allows decentralized mining. Neither alone is even capable of a 51% attack.

Damn... think about it: this is the key to decentralized mining:
With hybrid PoW you could have 2 algos: an ASIC-friendly one that counters botnets and a CPU-friendly one that allows decentralized mining. Neither alone is even capable of a 51% attack.
blackmarble: https://www.reddit.com/Bitcoin/comments/60j1zi/bram_cohen_bittorrents_creator_a_soft_fork_change/df70mn6/
And as far as I know, it can be deployed as a softfork (miners not needed for activation, compatible with older nodes)!
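The "neither alone can 51%" claim can be made concrete with a toy validity rule. This is an illustrative model of one possible hybrid scheme, not the actual POWA proposal: require valid chains to strictly alternate between the two algorithms, so an attacker controlling only one algorithm's hashrate can never chain two of their own blocks together:

```python
# Toy model: valid chains must alternate between two PoW algorithms,
# e.g. an ASIC-friendly one and a CPU-friendly one.

def is_valid_chain(algos: list) -> bool:
    """A chain is valid only if consecutive blocks use different algorithms."""
    return all(a != b for a, b in zip(algos, algos[1:]))

honest = ["asic", "cpu", "asic", "cpu", "asic", "cpu"]
# An attacker with 100% of ASIC hashrate but no CPU hashrate
# cannot extend a private chain past a single block of their own:
attacker_only = ["asic", "asic", "asic"]

print(is_valid_chain(honest))         # True
print(is_valid_chain(attacker_only))  # False
```

Under this rule, outrunning the honest chain requires a majority of *both* hashrates, which is the intuition behind the claim above.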
luke-jr petertodd nullc - maybe you would like to comment? :D
submitted by BitcoinReminder_com to Bitcoin [link] [comments]

Who wrote the "A Call for Consensus"

17 hours ago a mysterious redditor posted the following: "Adam Back is trying to get miners to sign a letter to never run Classic"
Up until 17 hours ago, nobody had even heard of a letter being shopped around to get miners not to run Classic. This person, whoever you are, clearly knew inside information: they even knew the signers on the list and the content of the letter. After the actual letter was published, I found this person's post to be 100% credible.
then out of nowhere, the letter starts circulating. first in chinese forums by YourBTCC (BTC China)
These letters were posted BEFORE the letter was posted to Medium. They also do NOT include some of the signers from the Medium post, indicating the Medium post came later with more signers.
It's extremely obvious that the letter came from the Blockstream/Bitcoin Core side, i mean duh right. But who wrote it? Well, we already have the redditor above telling us it was Adam Back. But who else: Greg Maxwell, or Core poster boy btcdrak? Somebody else?
in looking at the anonymous written letter, there are several unique words and sayings in it.
Being that btcdrak is the honorary poster boy and a known scammer, I started with him. Looking at his comment history, you can see that he always writes hard fork as "hard fork" and NOT "hard-fork". Examples:
There are many other examples. Sometimes he writes "hardfork" too, but he almost never writes "hard-fork" that I can find...
on a side note - I did find this in btcdrak's github which i found to be very strange.
btcdrak added laanwj to viacoin/viacoin 22 hours ago http://i.imgur.com/mLCPHNL.png
btcdrak was the first one to share the letter in the bitcoin-dev IRC https://bitcoinstats.com/irc/bitcoin-dev/logs/2016/02/11#l1455174010.0
While drak appears instrumental in this, he doesn't appear to have written it, as his style is different.
Looking at Greg Maxwell's style of writing, he does write "hard-fork" often. Examples:
Plus many more examples. But this alone doesn't seem to be enough; there has to be more proof of who wrote it. Note also that Maxwell writes "hard forks" too, not just with a hyphen, and that he rarely hyphenates word pairs other than hard or soft forks.
We know the original redditor said it was Adam Back, and his credibility is 100%, so let's look at Adam's writing style...
whoa "hard-forks" are everywhere.
Plus many more. There are a few instances where Adam writes "hard fork" without the hyphen, but there are many examples where he quotes others who write "hard fork" and explicitly fixes it to "hard-fork" in his response. Here is an example: https://www.reddit.com/Bitcoin/comments/438hx0/a_trip_to_the_moon_requires_a_rocket_with/czgmmfp
also, it seems adam LOVES hyphens!
"seg-wit" "hard-fork" "soft-fork" "alt-coin" "segregated-witness" "counter-intuitive" "opt-in" "extension-blocks" "non-upgraded" plus more!
I mean, he just loves to write with hyphens. Then check this out: even on Twitter, Adam Back writes "hard-forks" a lot! Examples:
plus more!
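The tallying the post does by hand is easy to mechanize. Here is a minimal sketch; the sample strings are hypothetical, not quotes from anyone named above, and counts this small prove nothing by themselves:

```python
import re

def hyphenation_profile(text: str) -> dict:
    """Count the three spellings the post compares. Word boundaries keep
    the patterns from matching inside other phrases."""
    return {
        "hard-fork": len(re.findall(r"\bhard-forks?\b", text, re.I)),
        "hard fork": len(re.findall(r"\bhard forks?\b", text, re.I)),
        "hardfork":  len(re.findall(r"\bhardforks?\b", text, re.I)),
    }

# Hypothetical writing samples, NOT real quotes:
sample_a = "A hard-fork is risky; hard-forks need coordination."
sample_b = "A hard fork is fine. I like a hardfork sometimes."
print(hyphenation_profile(sample_a))  # {'hard-fork': 2, 'hard fork': 0, 'hardfork': 0}
print(hyphenation_profile(sample_b))  # {'hard-fork': 0, 'hard fork': 1, 'hardfork': 1}
```

Run over a large corpus of a candidate's known writing, relative frequencies like these are the whole basis of the stylistic argument in this post.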
Here we have a mysterious redditor giving us 100% unreleased news of a letter written by Adam Back. Then, through writing-style analysis, we see that Adam Back is the most likely candidate by process of elimination from the Blockstream team. And just to solidify the claim that Adam Back did in fact shop around this letter and actually write it, let's go back to who was first to publish it.
BTC China!
Who works for BTC China? SAMSON MOW, the COO. He is also a signer on the Medium post. And what happened just 2 days ago? MOW wrote against Bitcoin Classic on the bitcoin-dev mailing list here https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-February/012412.html
and it was posted to reddit here https://www.reddit.com/btc/comments/44vosl/samson_mow_of_btcc_says_most_companies_listed_on/
and who was all up in the thread with btc china
I think we have the person who wrote the letter, but I would also conclude that Maxwell helped him write it and that others, such as Core wannabe drak, proofread it.
team effort guys!
submitted by Gobitcoin to btc [link] [comments]

Creator of Bittorrent Thinks He Can Kill Bitcoin With Chia, a Burstcoin (BURST) Copycat

submitted by turtlecane to CryptoCurrency [link] [comments]

BitTorrent inventor Bram Cohen on medium.com argues *against* a "simplistic plan" for scaling Bitcoin with "popular support" among "people who don't know any better" and want a "simple fix". He favors "people doing actual development who aren’t particularly good at talking". Here's why he's wrong.

Sorry Bram, but part of "real engineering work" often involves actually interacting with real users to solve their real problems, as quickly and as simply as possible (or as you prefer to dismissively put it in your essay: "people who don’t know any better" who are looking for a "simple fix").
This is why Bitcoin Classic is rapidly gaining consensus among major Bitcoin stakeholders, who are rejecting the needlessly slow & complicated roadmap from Core / Blockstream devs - who, as you yourself admit in your essay, "aren’t particularly good at talking" (or listening, for that matter).
Experience on successful real projects in the real world has shown us (with Satoshi's initial release of Bitcoin being a case in point) that the fastest, simplest and most popular solutions are actually often the best.
In the above essay, Bram Cohen, inventor of BitTorrent, makes the following arguments:
Mike Hearn, Jeff Garzik, and Gavin Andresen ... are doing a good job of whipping up popular support ...
They have a simplistic plan which appeals to people who don’t know any better or want to be told that technical problems can be made to magically go away with a simple fix.
On the other side are the people doing actual development, who aren’t particularly good at talking to the press or whipping up support on reddit and have a plan which requires real engineering work moving forwards.
There are several things seriously wrong with the Bram Cohen's central argument above:
(1) The first part of his statement above is obsolete and hence irrelevant.
Mike and Gavin did indeed previously support BIP 101 (smoothly scaling from 8 MB to 8 GB max blocksize by doubling every 2 years for 20 years) - but in the past week, things have changed dramatically, and the community has moved on:
  • Mike is gone, and it's become clear that support for BIP 101 / XT has dried up;
  • Gavin and Jeff support Bitcoin Classic, which is not BIP 101.
So Bram's comparison of Core's current roadmap with a deprecated roadmap (BIP 101) is now irrelevant.
All the buzz is around a recent new competing repo: Bitcoin Classic.
(2) The second part of Bram's statement above is wrong because it is precisely the simplicity and "appealingness" of Bitcoin Classic which are its strengths.
He dismisses those factors as if they were bad things - but they're actually good things.
The main reason for the past year of impasse is that all previously proposed solutions weren't simple and appealing enough to gain any actual consensus (among the actual users themselves - I don't mean among the devs at a single, out-of-touch and ultimately replaceable team: Core / Blockstream).
Bitcoin Classic's only initial change is to do an immediate bump to merely 2 MB - while also providing, long-term, a more democratic, transparent means of governance - based not on Core / Blockstream devs ACKing and NACKing pull-requests on the GitHub repo - but rather on a much more inclusive and deliberative multi-phase process.
The fact of being simple and inclusive (which Bram erroneously dismisses as being "simplistic" and "popular" by which he presumably means "populist") is precisely why Bitcoin Classic has been rapidly gaining consensus among all stakeholders in the Bitcoin community: miners, users, devs and businesses:
Bram can talk all he wants on medium.com about what might have been, and about how his favorite dev team knows better than actual users (who he insultingly dismisses as "people who don't know any better").
But figuring out how to safely and quickly and simply scale Bitcoin (which is the main issue right now) might not be the exclusive province of C/C++ devs who code in isolation all day.
In fact, as we are now seeing, it turns out that there are other stakeholders in the Bitcoin space who might actually have better ideas on how to do this kind of scaling.
So it's wrong (as well as being elitist) for Bram to dismissively insult such stakeholders as "people who don't know any better" - particularly because in many cases, what we're actually talking about here are major companies with annual revenues in the millions of dollars, with qualified dev teams of their own.
To take just one obvious example: look at Coinbase. They were banned from /Bitcoin and bitcoin.org by Theymos for daring to announce that they were testing XT, in order to better serve their users under all possible future scenarios.
Coinbase, as we know, also happens to be one of the major on-ramps for many new Bitcoin users, since they're a major US-registered financial institution.
And Coinbase also happens to have the technical and engineering expertise to have written their own open-source fully-validating Bitcoin node from scratch based on Ruby and PostgreSQL.
This is the kind of Bitcoin stakeholder that Bram is insulting and dismissing when he talks about "people who don't know any better": a company which basically produced a clone of the full-node part of Core. And note that Coinbase wrote this from scratch on a different stack (Ruby and PostgreSQL), instead of inheriting (some would say "hijacking") Satoshi's original C/C++ codebase.
So Bram is simply being rude and mean when he dismisses a major company like Coinbase as being merely "people who don't know any better". Bitcoin expertise is not confined to Core / Blockstream devs.
In fact, there is a new breed of Bitcoin experts emerging now: people who know more about the challenges Bitcoin faces today (eg, scaling and network topology) than about the challenges Bitcoin faced in the past (eg, hashing and crypto).
Two names are worth mentioning among this new wave of experts:
  • Dr Peter R. Rizun - who has also joined Bitcoin Classic now - and who has been terribly maligned and censored by Core / Blockstream:
Dr Peter R. Rizun, managing editor of the first peer-reviewed cryptocurrency journal, is an important Bitcoin researcher. He has also been attacked and censored for months by Core / Blockstream / Theymos. Now he has been suspended (from all subreddits) by some Reddit admin(s). Why?
  • Cornell researcher Emin Gün Sirer
Miners produce a generic COMMODITY: transactions included in blocks on the chain. If certain miners refuse to produce ENOUGH of this commodity, then they CAN and WILL be REPLACED. (Important reminders from Cornell researcher Emin Gün Sirer)
Look, I really like the stuff that Pieter Wuille is doing with SegWit - and I also really like the stuff that Greg Maxwell could contribute with Confidential Transactions (but please just ignore the few posters in this search-link who worry that CT is "dangerous" because quantum computing might come along someday). (Although I think that any such major upgrades should be done as a hard-fork, which is more explicit and thus safer than a soft-fork.)
So there is room for many types of devs in Bitcoin, and there is exciting work to be done long-term.
But Bram's essay is really about scaling now. And Core / Blockstream has not provided any solutions available now, nor have they researched what users really want and need now.
Thus it's understandable that users are gravitating towards a new dev team which can deliver a "simple fix" - in this case, Bitcoin Classic. And that's normal and healthy.
(3) Finally, there's plenty of owners of major multi-million-dollar mining operations who Bram also dismisses as "people who don’t know any better", people who believe in "magic" or a "simple fix".
At the same time, Bram inexplicably praises a bunch of devs who - as he himself admits - "aren't particularly good at talking" or "whipping up support" - while ignoring the fact that it is precisely this lack of communication skills which got us into this whole mess. Core / Blockstream are screwing up the short-term and long-term project management of Bitcoin, because they have shown that they are totally incapable of coming up with a realistic roadmap which the community could actually support. (They may have their own reasons for the strange way they prioritized their roadmap, but we don't really know - there's lots of theories out there.)
On the other hand, the people behind Bitcoin Classic (not mentioned by Bram here, as he focuses instead on the obsolete strawman of Mike Hearn / BIP 101), have proven themselves to be "particularly good at talking" (and, more importantly, listening) to actual users and major businesses, in order to figure out a safe, reasonable and practical "simple fix" to satisfy users' needs and requirements now.
Specifically, jtoomim (founder of Bitcoin Classic) has done extensive research, interacting with miners all over the world - on both sides of the Great Firewall of China.
As it turns out (and as stated by Gavin, another lead dev on Bitcoin Classic) the Great Firewall of China, and the concentration of so much mining on the "other" side of it, is one of the main obstacles to simple "blocksize-based" scaling solutions.
So Gavin previously experimented with 20 MB blocks, and more recently jtoomim experimented with 2-3 MB - in the field - producing empirical evidence that 2-3 MB blocks are feasible and acceptable to miners now.
This is the very definition of a "simple fix", with massive "support" from the people who matter: the miners themselves.
And this kind of research with users in the field is exactly what Bitcoin needs now - despite the fact that it might not be a sexy enough engineering-based solution to satisfy Bram Cohen and the out-of-touch devs at Core / Blockstream, who have proven themselves time and time again to be unable and/or unwilling to deliver a simple, popular scaling solution.
So by isolating themselves in their bubble of censorship to focus on elegant engineering, and avoiding the messy public forums where open debate actually occurs - and openly scorning their users (Greg Maxwell calling /btc a "cesspool" and more recently supporting Luke-Jr's attempt to sabotage Bitcoin Classic by injecting a poison-pill pull request to change the PoW and kick all miners off the network, Peter Todd releasing RBF over massive protests and recently doing a gray-hat double-spend against major US-registered Bitcoin financial processor Coinbase) - Core / Blockstream have shown themselves to be arrogant and out of touch, and have alienated the Bitcoin community by being willing to jeopardize the network as they chant their mantra that "there's no emergency yet".
This is why people are rejecting Core / Blockstream's so-called "scaling" roadmap (which unfortunately includes no "simple fix" - ie, a minimal blocksize-based solution acceptable to the community - and instead relies on complicated, untested, fancy code such as SegWit and LN - which might be good later but which aren't ready now).
It's too little and too late, too slow and too complicated (and possibly vaporware).
Instead, people want the simpler, faster and field-tested solutions researched and developed by the devs over at the new repo: Bitcoin Classic.
Bram Cohen is needlessly focusing in his essay on what used-to-be and what might-have-been and what could-be-someday.
Meanwhile the researchers and developers at Bitcoin Classic, like Gavin and JToomim, have been focusing on the here-and-now.
In this sense, the Bitcoin Classic researchers and developers are closer to Satoshi, with his preference for practical solutions which work "good enough" to be implemented now, instead of "perfect" solutions which are so complicated that they might never get implemented at all.
Also recall that several major Core / Blockstream devs didn't believe Bitcoin would work:
  • Gregory Maxwell "mathematically proved" that Bitcoin would be "impossible" (ignoring a little thing like "complexity" - which shows that he might not be that well-rounded, since many mathematicians are indeed familiar with "complexity theory", involving termination, NP, and all that fun stuff).
  • Adam Back missed out on being an early adopter of Bitcoin even when tipped off by Satoshi (Adam had invented an earlier prototype called Hashcash, but in his case he ignored how inflation might work - which shows that he also might not be that well-rounded, since many economists in the real world do indeed know how currency inflation works).
  • Peter Todd is an odd case, focusing on breaking things that aren't broken in order to petulantly prove a point (so he might be good in Testing or Threat Assessment, but he's probably not the kind of guy you want in Project Management).
These are the kinds of people Bram is arguing we should support - people whose track record of being right on Bitcoin has been spotty at best, often because they're more interested in spending ages solving complicated engineering problems rather than in providing "simple fixes" for real users in the real world.
Meanwhile, guys like Gavin, JGarzik, and JToomim - all of whom are involved with Bitcoin Classic - are operating more in the spirit of Satoshi - they've been working closely with real users in the real world, figuring out what they really need and want and getting ready to actually deliver it, soon - which is why consensus among users, miners, devs and businesses has been rapidly coalescing around the new competing repo Bitcoin Classic.
submitted by ydtm to btc [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
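To make the "triggering" idea concrete, here's a minimal toy sketch of a supermajority activation rule, loosely in the spirit of the version-bits style of miner signaling. The window size and function names are hypothetical, chosen just to illustrate the 95% vs 75% difference:

```python
# Toy sketch of a supermajority "triggering" rule: the fork only
# activates if, in a rolling window of recent blocks, at least
# `threshold` of them signal support. Window size and names are
# hypothetical, not any real deployment parameter.
def fork_activates(signals, window=1000, threshold=0.95):
    """signals: one boolean per block (True = that block signaled support)."""
    if len(signals) < window:
        return False
    recent = signals[-window:]
    return sum(recent) / window >= threshold

# With 95% required, a 94% signaling rate never triggers the fork...
assert not fork_activates([True] * 940 + [False] * 60)
# ...but 96% does:
assert fork_activates([True] * 960 + [False] * 40)
# A mere-75% rule would have activated in both cases:
assert fork_activates([True] * 940 + [False] * 60, threshold=0.75)
```

The point of the higher threshold is exactly what the paragraph above says: with 75%, up to a quarter of the network could keep mining the old chain, risking a persistent split.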
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
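The arithmetic behind that graph really is that simple - ten doublings over twenty years (a sketch, with a 1 MB starting size assumed for illustration):

```python
# The "exponential chart" arithmetic: doubling the blocksize every
# two years for twenty years is ten doublings, and 2**10 = 1024,
# i.e. roughly a 1000x increase. Starting size is illustrative.
blocksize_mb = 1.0
for year in range(0, 20, 2):   # one doubling every two years
    blocksize_mb *= 2
print(blocksize_mb)            # 1024.0 MB after 20 years
```

Of course, as noted, the math being right says nothing about whether the infrastructure or the game theory cooperates.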
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
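The back-of-the-envelope version of that check (a sketch, just restating the constraint in the paragraph above):

```python
# What sustained download rate does a 20 MB block every 10 minutes
# actually require? (Ignoring relay overhead, bursts, and upload.)
block_mb = 20
block_interval_s = 10 * 60                  # one block per ~10 minutes
rate_mb_per_s = block_mb / block_interval_s
rate_mbit_per_s = rate_mb_per_s * 8
print(f"{rate_mb_per_s:.3f} MB/s = {rate_mbit_per_s:.2f} Mbit/s")
# roughly 0.03 MB/s, i.e. well under 1 Mbit/s sustained
```

Which is why even a crappy connection clears the bar - the averaged rate is tiny, even though the real bottlenecks (latency, bursts at block relay time, upload capacity for full nodes) are nastier than this average suggests.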
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
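The chunk-sharing idea in that parenthetical can be sketched in a few lines (purely illustrative names and sizes - this is the concept, not the real wire protocol):

```python
# Minimal sketch of the BitTorrent idea described above: a file is
# split into fixed-size chunks, and any peer holding chunk #99 can
# serve it to others even before it has the whole file.
CHUNK_MB = 1
FILE_MB = 700
total_chunks = FILE_MB // CHUNK_MB   # 700 one-megabyte chunks

class Peer:
    def __init__(self):
        self.chunks = set()              # indices of chunks this peer holds

    def receive(self, index):
        self.chunks.add(index)           # downloading a chunk...

    def can_serve(self, index):
        return index in self.chunks      # ...immediately makes us a "server" for it

alice, bob = Peer(), Peer()
alice.receive(99)                        # Alice has only chunk #99 of 700...
assert alice.can_serve(99)               # ...yet she can already upload it
if alice.can_serve(99):
    bob.receive(99)                      # so Bob gets it from Alice, not the origin
assert bob.can_serve(99)
```

This is the self-correcting feedback loop: every completed chunk adds one more source for that chunk, so popularity increases supply.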
The efficiency of the BitTorrent network seemed to jibe with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
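The "square of the number of nodes" claim comes straight from counting possible connections (a quick sketch):

```python
# Metcalfe's Law, as described above: each of n nodes can reach n-1
# others, giving n*(n-1)/2 possible pairwise connections - so the
# network's potential value grows roughly quadratically in n.
def possible_connections(n):
    return n * (n - 1) // 2

# Doubling the fax machines roughly quadruples the connections:
assert possible_connections(100) == 4950
assert possible_connections(200) == 19900   # ~4x, not 2x
```

Which is the opposite of the cars-on-a-highway case: there, adding nodes consumes a fixed shared resource instead of adding new useful links.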
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheetos, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
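To make that balance-sheet vs cash-flow distinction concrete, here's a toy sketch (illustrative names only): the chain itself is the journal, but a snapshot of current balances - in Bitcoin's case, something like the UTXO set - can always be re-derived from it:

```python
# Toy illustration of the distinction above: the chain is a "cash flow"
# style journal of every transaction ever, while current balances are a
# "balance sheet" snapshot that can be re-derived from the journal.
journal = [
    ("coinbase", "alice", 50),   # block reward: new coins to alice
    ("alice", "bob", 20),        # alice pays bob
    ("bob", "cafe", 1),          # bob buys a coffee - on-chain forever
]

def snapshot(journal):
    balances = {}
    for sender, receiver, amount in journal:
        if sender != "coinbase":
            balances[sender] = balances.get(sender, 0) - amount
        balances[receiver] = balances.get(receiver, 0) + amount
    return balances

assert snapshot(journal) == {"alice": 30, "bob": 19, "cafe": 1}
```

So the answer is "both": validation really only needs the snapshot (which is what makes pruning of old spent history possible), but the full journal is what everyone has historically replicated.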
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little uTorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of first writing down a compact specification (the theorem), and then separately constructing an implementation (the proof) which can be checked against it.
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML.

I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well.

The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation).

And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) out there to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kind of tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
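To make the analogy concrete, here's a tiny Lean sketch (my own toy example, not anything from Bitcoin): the type is the theorem (ie the specification), and the term inhabiting it is the proof (ie the implementation) - and the type-checker mechanically verifies that the implementation satisfies the spec.

```lean
-- Specification / theorem: conjunction is commutative.
-- Implementation / proof: a function that swaps the two components.
theorem and_swap (A B : Prop) : A ∧ B → B ∧ A :=
  fun ⟨a, b⟩ => ⟨b, a⟩
```

If the term didn't actually implement the stated type, the checker would reject it - that's the kind of machine-checked guarantee the theoretical community takes for granted.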
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine General consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
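"Embarrassingly parallel" just means the nonce search decomposes into disjoint ranges that share no state. A toy Python sketch (illustrative difficulty and header string - nothing like real Bitcoin parameters or its actual double-SHA256 header hashing):

```python
import hashlib

TARGET_PREFIX = "0000"  # toy difficulty: hash must start with four hex zeros

def search_range(header: str, start: int, end: int):
    """Scan one nonce range. Ranges share no state, so each can run on a
    separate cheap commodity box - the 'map' step of a MapReduce-style race."""
    for nonce in range(start, end):
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        if digest.startswith(TARGET_PREFIX):
            return nonce, digest
    return None

# 'Reduce' step: the first worker to find a winner ends the race.
ranges = [(i * 100_000, (i + 1) * 100_000) for i in range(8)]
winner = next(
    (r for r in (search_range("toy-header", lo, hi) for lo, hi in ranges) if r),
    None,
)
if winner:
    nonce, digest = winner
    assert digest.startswith(TARGET_PREFIX)
```

The point is that decomposing and recomposing this race is trivial - which is exactly why mining farms scale so easily, and why it seems odd that block *propagation* hasn't received the same treatment.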
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
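For what it's worth, a rose tree is tiny to write down even outside Haskell. A Python sketch (my own illustration, not from the paper) of how a flat list is just the degenerate case:

```python
from dataclasses import dataclass, field

@dataclass
class Rose:
    """A rose tree: one value plus an arbitrary number of subtrees."""
    value: object
    children: list = field(default_factory=list)

def flatten(t: Rose) -> list:
    """Depth-first traversal back to a plain list (the blockchain-like view)."""
    out = [t.value]
    for child in t.children:
        out.extend(flatten(child))
    return out

# A list is a rose tree where every node has exactly one child...
chain = Rose("block0", [Rose("block1", [Rose("block2")])])
# ...while side-chain / tree-chain style proposals branch the same structure.
tree = Rose("block0", [Rose("side-a"), Rose("side-b", [Rose("side-b1")])])

print(flatten(chain))  # ['block0', 'block1', 'block2']
print(flatten(tree))   # ['block0', 'side-a', 'side-b', 'side-b1']
```

The generalization from chains to trees really is that small a step at the data-structure level - the hard part, of course, is the consensus rules on top.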
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin

BlockTorrent: The famous algorithm which BitTorrent uses for SHARING BIG FILES. Which you probably thought Bitcoin *also* uses for SHARING NEW BLOCKS (which are also getting kinda BIG). But Bitcoin *doesn't* torrent *new* blocks (while relaying). It only torrents *old* blocks (while sync-ing). Why?

This post is being provided to further disseminate an existing proposal:
This proposal was originally presented by jtoomim back in September of 2015 - on the bitcoin_dev mailing list (full text at the end of this OP), and on reddit:
Here's a TL;DR, in his words:
For initial block sync, [Bitcoin] sort of works [like BitTorrent] already.
You download a different block from each peer. That's fine.
However, a mechanism does not currently exist for downloading a portion of each [new] block from a different peer.
That's what I want to add.
~ jtoomim
The more detailed version of this "BlockTorrenting" proposal (as presented by jtoomim on the bitcoin_dev mailing list) is linked and copied / reformatted at the end of this OP.
Meanwhile here are some observations from me as a concerned member of the Bitcoin-using public.
Bitcoin doesn't do this kind of "blocktorrenting" already??
But.. But... I thought Bitcoin was "p2p" and "based on BitTorrent"...
... because (as we all know) Bitcoin has to download giant files.
Bitcoin only "torrents" when sharing one certain kind of really big file: the existing blockchain, when a node is being initialized.
But Bitcoin doesn't "torrent" when sharing another certain kind of moderately big file (a file whose size, by the way, has been notoriously and steadily growing over the years, to the point where the system running the legacy "Core"/Blockstream Bitcoin implementation is starting to become dangerously congested - no matter what some ~~delusional clowns~~ "Core" devs may say). That is: the world's most wildly popular, industrial-strength "p2p file sharing algorithm" is mysteriously not being used where the Bitcoin network needs it the most in order to get transactions confirmed on-chain - when a newly found block needs to be shared among nodes, ie when a node is relaying new blocks.
How many of you (honestly) just simply assumed that this algorithm was already being used in Bitcoin - since we've all been told that "Bitcoin is p2p, like BitTorrent"?
As it turns out - the only part of Bitcoin which has been p2p up until now is the "sync-ing a new full-node" part.
The "running an existing full-node" part of Bitcoin has never been implemented as truly "p2p2" yet!!!1!!!
And this is precisely the part of the system that we've been wasting all of our time (and destroying the community) fighting over for the past few months - because the so-called "experts" from the legacy "Core"/Blockstream Bitcoin implementation ignored this proposal!
Why have all the so-called "experts" at "Core"/Blockstream ignored this obvious well-known effective & popular & tested & successful algorithm for doing "blocktorrenting" to torrent each new block being relayed?
Why have the "Core"/Blockstream devs failed to p2p-ize the most central, fundamental networking aspect of Bitcoin - the part where blocks get propagated, the part we've been fighting about for the past few years?
This algorithm for "torrenting" a big file in parallel from peers is the very definition of "p2p".
It "surgically" attacks the whole problem of sharing big files in the most elegant and efficient way possible: right at the lowest level of the bottleneck itself, cleverly chunking a file and uploading it in parallel to multiple peers.
Everyone knows torrenting works. Why isn't Bitcoin using it for its new blocks?
As millions of torrenters already know (but evidently all the so-called "experts" at Core/Blockstream seem to have conveniently forgotten), "torrenting" a file (breaking a file into chunks and then offering a different chunk to each peer to "get it out to everyone fast" - before your particular node even has the entire file) is such a well-known / feasible / obvious / accepted / battle-tested / highly efficient algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers, that many people simply assumed that Bitcoin had already been doing this kind of "torrenting of new-blocks" these whole past 7 years.
But Bitcoin doesn't do this - yet!
None of the Core/Blockstream devs (and the Chinese miners who follow them) have prioritized p2p-izing the most central and most vital and most resource-consuming function of the Bitcoin network - the propagation of new blocks!
Maybe it took someone who's both a miner and a dev to "scratch" this particular "itch": Jonathan Toomim jtoomim.
  • A miner + dev who gets very little attention / respect from the Core/Blockstream devs (and from the Chinese miners who follow them) - perhaps because they feel threatened by a competing implementation?
  • A miner + dev who may have come up with the simplest and safest and most effective algorithmic (ie, software-based, not hardware-consuming) scaling proposal of anyone!
  • A dev who is not paid by Blockstream, and who is therefore free from the secret, undisclosed corporate restraints / confidentiality agreements imposed by the shadowy fiat venture-capitalists and legacy power elite who appear to be attempting to cripple our code and muzzle our devs.
  • A miner who has the dignity not to let himself be forced into signing a loyalty oath to any corporate overlords after being locked in a room until 3 AM.
Precisely because jtoomim is both an independent miner and an independent dev...
  • He knows what needs to be done.
  • He knows how to do it.
  • He is free to go ahead and do it - in a permissionless, decentralized fashion.
Possible bonus: The "blocktorrent" algorithm would help the most in the upload direction - which is precisely where Bitcoin scaling needs the most help!
Consider the "upload" direction for a relatively slow full-node - such as Luke-Jr, who reports that his internet is so slow, he has not been able to run a full-node since mid-2015.
The upload direction is the direction which everyone says has been the biggest problem with Bitcoin - because, in order for a full-node to be "useful" to the network:
  • it has to be able to upload a new block to (at least) 8 peers,
  • which places (at least) 8x more "demand" on the full-node's upload bandwidth.
The brilliant, simple proposed "blocktorrent" algorithm from jtoomim (already proven to work with Bram Cohen's BitTorrent protocol, and also already proven to work for initial sync-ing of Bitcoin full-nodes - but still un-implemented for ongoing relaying among full-nodes) looks like it would provide a significant performance improvement precisely at this tightest "bottleneck" in the system, the crucial central nexus where most of the "traffic" (and the congestion) is happening: the relaying of new blocks from "slower" full-nodes.
The detailed explanation for how this helps "slower" nodes when uploading, is as follows.
Say you are a "slower" node.
You need to send a new block out to (at least) 8 peers - but your "upload" bandwidth is really slow.
If you were to split the file into (at least) 8 "chunks", and then upload a different one of these (at least) 8 "chunks" to each of your (at least) 8 peers - then (if you were using "blocktorrenting") it only would take you 1/8 (or less) of the "normal" time to do this (compared to the naïve legacy "Core" algorithm).
Now the new block which your "slower" node was attempting to upload is already "out there" - in 1/8 (or less) of the "normal" time compared to the naïve legacy "Core" algorithm.[ 1 ]
... [ 1 ] There will of course also be a tiny amount of extra overhead involved due to the "housekeeping" performed by the "blocktorrent" algorithm itself - some additional processing and communicating to decompose the block into chunks, to organize the relaying of different chunks to different peers, and then to recompose the chunks into a block again (all of which, depending on the size of the block and the latency of your node's connections to its peers, would in most cases be negligible compared to the much greater speed-up provided by the "blocktorrent" algorithm itself).
Now that your block is "out there" at those 8 (or more) peer nodes to whom you just blocktorrented it in 1/8 (or less) of the time - it has now been liberated from the "bottleneck" of your "slower" node.
In fact, its further propagation across the net may now be able to leverage much faster upload speeds from some other node(s) which have "blocktorrent"-downloaded it in pieces from you (and other peers) - and which might be faster relaying it along, than your "slower" node.
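The 1/8 arithmetic above can be put into a toy Python model (my own sketch; it ignores the blocktorrent "housekeeping" overhead mentioned in the footnote):

```python
def relay_seconds(block_mb: float, upload_mbps: float, peers: int,
                  blocktorrent: bool = False) -> float:
    """Seconds for one node to get a new block 'out there' to all its peers.
    Naive relay sends the full block to every peer; blocktorrent sends a
    different 1/peers chunk to each peer, so only one block's worth of data
    leaves the node (the peers then swap chunks among themselves)."""
    megabits = block_mb * 8
    if blocktorrent:
        return megabits / upload_mbps
    return peers * megabits / upload_mbps

# A "slower" node with 1 Mbps upload, an 8 MB block, and 8 peers:
print(relay_seconds(8, 1, 8))                     # naive: 512.0 s
print(relay_seconds(8, 1, 8, blocktorrent=True))  # torrented: 64.0 s
```

The speed-up at the uploading node is exactly the peer count - which is why this helps most precisely where the upload-bandwidth bottleneck bites hardest.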
For some mysterious reason, the legacy Bitcoin implementation from "Core"/Blockstream has not been doing this kind of "blocktorrenting" for new blocks.
It's only been doing this torrenting for old blocks. The blocks that have already been confirmed.
Which is fine.
But we also obviously need this sort of "torrenting" to be done for each new block as it is being confirmed.
And this is where the entire friggin' "scaling bottleneck" is occurring, which we just wasted the past few years "debating" about.
Just sit down and think about this for a minute.
We've had all these so-called "experts" (Core/Blockstream devs and other small-block proponents) telling us for years that guys like Hearn and Gavin and repos like Classic and XT and BU were "wrong" or at least "unserious" because they "merely" proposed "brute-force" scaling: ie, scaling which would simply place more demands on finite resources (specifically: on the upload bandwidth from full-nodes - who need to relay to at least 8 peer full-nodes in order to be considered "useful" to the network).
These "experts" have been beating us over the head this whole time, telling us that we have to figure out some (really complicated, unproven, inefficient and centralized) clever scaling algorithms to squeeze more efficiency out of existing infrastructure.
And here is the most well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby massively accelerating) the sharing of big files among peers - the BitTorrent algorithm itself, the gold standard of p2p relaying par excellence, which has been a major success on the Internet for over a decade, at one point accounting for nearly 1/3 of all traffic on the Internet itself - and which is also already being used in one part of Bitcoin: during the phase of sync-ing a new node.
And apparently pretty much only jtoomim has been talking about using it for the actual relaying of new blocks - while Core/Blockstream devs have so far basically ignored this simple and safe and efficient proposal.
And then the small-block sycophants (reddit users or wannabe C/C++ programmers who have been beaten into submission by the FUD and "technological pessimism" of the Core/Blockstream devs, and by the censorship on their legacy forum), they all "laugh" at Classic and proclaim "Bitcoin doesn't need another dev team - all the 'experts' are at Core / Blockstream"...
...when in fact it actually looks like jtoomim (an independent miner + dev, free from the propaganda and secret details of the corporate agenda of Core/Blockstream - who works on the Classic Bitcoin implementation) may have proposed the simplest and safest and most effective scaling algorithm in this whole debate.
By the way, his proposal estimates that we could get about an order of magnitude greater throughput, based on the typical latency and blocksize for blocks of around 20 MB and bandwidth of around 8 Mbps (which seems like a pretty normal scenario).
So why the fuck isn't this being done yet?
This is such a well-known / feasible / obvious / accepted / battle-tested algorithm for "parallelizing" (and thereby significantly accelerating) the sharing of big files among peers:
  • It's already being used for the (currently) 65 gigabytes of "blocks in the existing blockchain" itself - the phase where a new node has to sync with the blockchain.
  • It's already being used in BitTorrent - although the BitTorrent protocol has been optimized more to maximize throughput, whereas it would probably be a good idea to optimize the BlockTorrent protocol to minimize latency (since avoiding orphans is the big issue here) - which I'm fairly sure should be quite doable.
This algorithm is so trivial / obvious / straightforward / feasible / well-known / proven that I (and probably many others) simply assumed that Bitcoin had been doing this all along!
But it has never been implemented.
There is however finally a post about it today on the score-hidden forum /r/Bitcoin, from eragmus:
[bitcoin-dev] BlockTorrent: Torrent-style new-block propagation on Merkle trees
And, predictably, the top-voted comment there is a comment telling us why it will never work.
And the comment after that comment is from the author of the proposal, jtoomim, explaining why it would work.
Score hidden on all those comments.
Because the immature tyrant theymos still doesn't understand the inherent advantages of people using reddit's upvoting & downvoting tools to hold decentralized, permissionless debates online.
(1) Would this "BlockTorrenting" algorithm from jtoomim really work?
(2) If so, why hasn't it been implemented yet?
(3) Specifically: With all the "dev firepower" (and $76 million in venture capital) available at Core/Blockstream, why have they not prioritized implementing this simple and safe and highly effective solution?
(4) Even more specifically: Are there undisclosed strategies / agreements / restraints imposed by Blockstream financial investors on Bitcoin "Core" devs which have been preventing further discussion and eventual implementation of this possible simple & safe & efficient scaling solution?
Here is the more-detailed version of this proposal, presented by Jonathan Toomim jtoomim back in September of 2015 on the bitcoin-dev mailing list (and pretty much ignored for months by almost all the "experts" there):
As I understand it, the current block propagation algorithm is this:
  1. A node mines a block.
  2. It notifies its peers that it has a new block with an inv. Typical nodes have 8 peers.
  3. The peers respond that they have not seen it, and request the block with getdata [hash].
  4. The node sends out the block in parallel to all 8 peers simultaneously. If the node's upstream bandwidth is limiting, then all peers will receive most of the block before any peer receives all of the block. The block is sent out as the small header followed by a list of transactions.
  5. Once a peer completes the download, it verifies the block, then enters step 2.
(If I'm missing anything, please let me know.)
The main problem with this algorithm is that it requires a peer to have the full block before it does any uploading to other peers in the p2p mesh. This slows down block propagation to:
O( p • log_p(n) ) 
  • n is the number of peers in the mesh,
  • p is the number of peers transmitted to simultaneously.
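One way to read that O( p • log_p(n) ) figure is as a toy store-and-forward model (my own illustrative sketch, not from jtoomim's post): reaching n nodes takes about log_p(n) hops, and each hop must serially upload the full block to p peers over a single uplink before anyone downstream can relay:

```python
import math

def store_and_forward_hops(n: int, p: int) -> int:
    """Hops needed for a block to reach n nodes when each node relays to p
    peers only after it has received (and verified) the complete block."""
    return math.ceil(math.log(n, p))

def naive_total_cost(n: int, p: int, t_block: float = 1.0) -> float:
    """Total propagation cost in units of single-peer transmission time:
    each hop costs p * t_block (serial uploads to p peers on one uplink),
    and there are ~log_p(n) hops - the O(p * log_p(n)) above."""
    return p * store_and_forward_hops(n, p) * t_block

print(naive_total_cost(6000, 8))  # e.g. ~6000 reachable nodes, 8 peers each
```

Chunked relay attacks both factors at once: peers start re-uploading verified chunks before holding the whole block, so hops overlap instead of serializing.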
It's like the Napster era of file-sharing. We can do much better than this.
Bittorrent can be an example for us.
Bittorrent splits the file to be shared into a bunch of chunks, and hashes each chunk.
Downloaders (leeches) grab the list of hashes, then start requesting their peers for the chunks out-of-order.
As each leech completes a chunk and verifies it against the hash, it begins to share those chunks with other leeches.
Total propagation time for large files can be approximately equal to the transmission time for an FTP upload.
Sometimes it's significantly slower, but often it's actually faster due to less bottlenecking on a single connection and better resistance to packet/connection loss.
(This could be relevant for crossing the Chinese border, since the Great Firewall tends to produce random packet loss, especially on encrypted connections.)
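The chunk-and-hash mechanics described above fit in a few lines of Python (toy chunk size - real BitTorrent pieces run from 16 KiB up to a few MiB):

```python
import hashlib

CHUNK = 4  # toy chunk size in bytes

def make_chunks(data: bytes):
    """Split a payload into fixed-size chunks and hash each one. A leech with
    the hash list can fetch chunks out of order, verify each independently,
    and start re-uploading a chunk the moment it verifies."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    hashes = [hashlib.sha256(c).hexdigest() for c in chunks]
    return chunks, hashes

def verify_chunk(chunk: bytes, expected_hash: str) -> bool:
    return hashlib.sha256(chunk).hexdigest() == expected_hash

chunks, hashes = make_chunks(b"some moderately big file")
assert all(verify_chunk(c, h) for c, h in zip(chunks, hashes))
assert not verify_chunk(b"tampered!", hashes[0])
```

Per-chunk verification is the whole trick: integrity is checked piecewise, so sharing can begin long before the full download completes.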
Bitcoin uses a data structure for transactions with hashes built-in. We can use that in lieu of Bittorrent's file chunks.
A Bittorrent-inspired algorithm might be something like this:
  1. (Optional steps to build a Merkle cache; described later)
  2. A seed node mines a block.
  3. It notifies its peers that it has a new block with an extended version of inv.
  4. The leech peers request the block header.
  5. The seed sends the block header. The leech code path splits into two.
  6. (a) The leeches verify the block header, including the PoW. If the header is valid,
  7. (a) They notify their peers that they have a header for an unverified new block with an extended version of inv, looping back to the inv notification step above. If it is invalid, they abort thread (b).
  8. (b) The leeches request the Nth row (from the root) of the transaction Merkle tree, where N might typically be between 2 and 10. That corresponds to about 1/4th to 1/1024th of the transactions. The leeches also request a bitfield indicating which of the Merkle nodes the seed has leaves for. The seed supplies this (0xFFFF...).
  9. (b) The leeches calculate all parent node hashes in the Merkle tree, and verify that the root hash is as described in the header.
  10. The leeches search their Merkle hash cache to see if they have the leaves (transaction hashes and/or transactions) for that node already.
  11. The leeches send a bitfield request to the node indicating which Merkle nodes they want the leaves for.
  12. The seed responds by sending leaves (either txn hashes or full transactions, depending on benchmark results) to the leeches in whatever order it decides is optimal for the network.
  13. The leeches verify that the leaves hash into the ancestor node hashes that they already have.
  14. The leeches begin sharing leaves with each other.
  15. If the leaves are txn hashes, they check their cache for the actual transactions. If they are missing it, they request the txns with a getdata, or all of the txns they're missing (as a list) with a few batch getdatas.
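Steps 8-9 above - fetching a row of the transaction Merkle tree and hashing it upward to check against the root in the (already PoW-verified) header - can be sketched in Python (toy double-SHA256 tree; the helper names are my own):

```python
import hashlib

def dhash(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_rows(leaves):
    """All rows of a Merkle tree, from the leaf row up to the single root."""
    rows = [list(leaves)]
    while len(rows[-1]) > 1:
        row = rows[-1]
        if len(row) % 2:            # Bitcoin duplicates the last hash on odd rows
            row = row + [row[-1]]
        rows.append([dhash(row[i] + row[i + 1]) for i in range(0, len(row), 2)])
    return rows

# A leech holding only an intermediate row can still check it: hashing that
# row upward must reproduce the root committed to in the block header.
txids = [dhash(bytes([i])) for i in range(8)]
rows = merkle_rows(txids)
root = rows[-1][0]
row_n = rows[1]                     # the row of 4 internal nodes
assert merkle_rows(row_n)[-1][0] == root
```

Once an intermediate row is verified, each of its nodes anchors an independently checkable subtree - which is what lets leaves arrive from different peers, out of order, without trusting any of them.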
Features and benefits
The main feature of this algorithm is that a leech will begin to upload chunks of data as soon as it gets them and confirms both PoW and hash/data integrity, instead of waiting for a full copy with full verification.
Inefficient cases, and mitigations
This algorithm is more complicated than the existing algorithm, and won't always be better in performance.
Because more round-trip messages are required to negotiate the Merkle tree transfers, it will perform worse in situations where the bandwidth-to-latency ratio is high relative to the block size. In particular, the minimum per-hop latency will likely be higher.
This might be mitigated by reducing the number of round-trip messages needed to set up the BlockTorrent by using larger and more complex inv-like and getdata-like messages that preemptively send some data (e.g. block headers).
This would trade off latency for bandwidth overhead from larger duplicated inv messages.
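That tradeoff is easy to quantify. A hypothetical sketch of such an extended announcement (the message layout is my assumption, not part of the proposal): piggybacking the 80-byte header on the inv saves one round trip per hop, at the cost of duplicating those bytes to every peer.

```python
from dataclasses import dataclass

@dataclass
class ExtendedInv:
    """Hypothetical inv-like announcement that preemptively carries the
    block header, so a peer can verify PoW without a separate request."""
    block_hash: bytes   # 32 bytes, as in a normal inv entry
    header: bytes       # 80 bytes: version, prev hash, Merkle root, time, bits, nonce

def announce_overhead(num_peers: int) -> int:
    """Extra bytes broadcast versus a plain 32-byte inv entry."""
    return num_peers * 80
```

For a node with 8 peers that is 640 extra bytes per block, which is negligible next to the round trip it removes.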
Depending on implementation quality, the latency for the smallest block size might be the same between algorithms, or it might be 300% higher for the torrent algorithm.
For small blocks (perhaps < 100 kB), the BlockTorrent algorithm will likely be slightly slower.
Sidebar from the OP: So maybe this would discourage certain miners (cough Dow cough) from mining blocks that aren't full enough:
Why is [BTCC] limiting their block size to under 750 all of a sudden?

For large blocks (e.g. 8 MB over 20 Mbps), I expect the BlockTorrent algo will likely be around an order of magnitude faster in the worst case (adversarial) scenarios, in which none of the block's transactions are in the caches.
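A rough back-of-envelope model shows where that order of magnitude comes from. This is a simplified sketch with assumed hop counts and chunk sizes, ignoring verification time and upload fan-out: today a block is relayed store-and-forward (each hop downloads everything before relaying), whereas chunked relay pipelines the transfer across hops.

```python
def store_and_forward_s(block_mb: float, mbps: float, hops: int) -> float:
    """Each hop fully downloads the block before relaying it onward."""
    return hops * (block_mb * 8.0 / mbps)

def pipelined_s(block_mb: float, mbps: float, hops: int, chunk_kb: float) -> float:
    """Chunks are relayed as soon as they verify: roughly one full transfer
    plus one chunk's serialization delay per additional hop."""
    chunk_s = chunk_kb * 8.0 / (mbps * 1000.0)
    return block_mb * 8.0 / mbps + (hops - 1) * chunk_s
```

With the 8 MB / 20 Mbps figures from above and, say, 6 hops with 64 kB chunks, store-and-forward takes about 19.2 s end to end while the pipelined version takes about 3.3 s, and the gap widens with more hops.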

One of the big benefits of the BlockTorrent algorithm is that it provides several obvious and straightforward points for bandwidth saving and optimization by caching transactions and reconstructing the transaction order.

Future work: possible further optimizations
A cooperating miner [could] pre-announce Merkle subtrees with some of the transactions they are planning on including in the final block.
Other miners who see those subtrees [could] compare the transactions in those subtrees to the transaction sets they are mining with, and can rearrange their block prototypes to use the same subtrees as much as possible.
In the case of public pools supporting the getblocktemplate protocol, it might be possible to build Merkle subtree caches without the pool's help by having one or more nodes just scraping their getblocktemplate results.
Even if some transactions are inserted or deleted, it [might] be possible to guess a lot of the tree based on the previous ordering.
Once a block header and the first few rows of the Merkle tree [had] been published, they [would] propagate through the whole network, at which time full nodes might even be able to guess parts of the tree by searching through their txn and Merkle node/subtree caches.
That might be fun to think about, but probably not effective due to O(n²) or worse scaling with transaction count.
Might be able to make it work if the whole network cooperates on it, but there are probably more important things to do.
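The subtree-reuse idea above can be sketched as a comparison of fixed-size group roots. This is a toy sketch of my own (real blocks would need to handle misaligned insertions and deletions): two miners whose prototypes share an aligned group of transactions produce identical subtree roots for that group, so the group never needs retransmitting.

```python
import hashlib

def dsha256(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def subtree_root(level: list[bytes]) -> bytes:
    """Merkle root of a group of txids (odd node paired with itself)."""
    level = list(level)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])
        level = [dsha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def shared_subtrees(txids_a: list[bytes], txids_b: list[bytes], size: int) -> int:
    """Count fixed-size transaction groups whose subtree roots coincide,
    i.e. subtrees two block prototypes could share without retransmission."""
    roots_a = {subtree_root(txids_a[i:i + size])
               for i in range(0, len(txids_a), size)}
    roots_b = {subtree_root(txids_b[i:i + size])
               for i in range(0, len(txids_b), size)}
    return len(roots_a & roots_b)
```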
Leveraging other features from BitTorrent
There are also a few other features of BitTorrent that would be useful here, like:
  • prioritizing uploads to different peers based on their upload capacity,
  • banning peers that submit data that doesn't hash to the right value.
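The banning rule in the second bullet is cheap to enforce, because every chunk is committed to by a Merkle node the leech has already verified. A minimal sketch, with names of my own invention:

```python
import hashlib

def dsha256(b: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

class PeerTable:
    """Drop any peer that sends a chunk whose hash doesn't match the
    Merkle node we already verified — bad data is provably bad."""
    def __init__(self) -> None:
        self.banned: set[str] = set()

    def accept_chunk(self, peer: str, expected_hash: bytes, data: bytes) -> bool:
        if peer in self.banned:
            return False
        if dsha256(data) != expected_hash:
            self.banned.add(peer)  # hash mismatch: ban immediately
            return False
        return True
```

Unlike banning based on invalid transactions (which might just reflect a differing mempool), a hash mismatch is unambiguous evidence of misbehavior.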
Sidebar from the OP: Hmm...maybe that would be one way to deal with the DDoS-ing we're experiencing right now? I know the DDoSer is using a rotating list of proxies, but still it could be a quick-and-dirty way to mitigate against his attack.
DDoS started again. Have a nice day, guys :)
(It might be good if we could get Bram Cohen to help with the implementation.)
Using the existing BitTorrent algorithm as-is - versus tailoring a new algorithm optimized for Bitcoin
Another possible option would be to just treat the block as a file and literally BitTorrent it.
But I think there should be enough benefit from integrating with the existing Bitcoin p2p connections, and from using bitcoind's transaction and Merkle tree caches, to make a native implementation worthwhile.
Also, BitTorrent itself was designed to optimize more for bandwidth than for latency, so we will have slightly different goals and tradeoffs during implementation.
Concerns, possible attacks, mitigations, related work
One of the concerns that I initially had about this idea was that it would involve nodes forwarding unverified block data to other nodes.
At first, I thought this might be useful for a rogue miner or node who wanted to quickly waste the whole network's bandwidth.
However, in order to perform this attack, the rogue needs to construct a valid header with a valid PoW, but use a set of transactions that renders the block as a whole invalid in a manner that is difficult to detect without full verification.
Even so, it will be difficult to design such an attack so that the bandwidth it wastes is worth more than the 240 exahashes (and 25.1 BTC opportunity cost) required to create a valid header.
Related work: IBLT (Invertible Bloom Lookup Tables)
As I understand it, the O(1) IBLT approach requires that blocks follow strict rules (yet to be fully defined) about the transaction ordering.
If these are not followed, then it turns into sending a list of txn hashes, and separately ensuring that all of the txns in the new block are already in the recipient's mempool.
When mempools are very dissimilar, the IBLT approach degrades heavily and becomes worse than simply sending the raw block.
This could occur if a node just joined the network, during chain reorgs, or due to malicious selfish miners.
Also, if the mempool has a lot more transactions than are included in the block, the false positive rate for detecting whether a transaction already exists in another node's mempool might get high for otherwise reasonable bucket counts/sizes.
With the BlockTorrent approach, the focus is on transmitting the list of hashes in a manner that propagates as quickly as possible while still allowing methods for reducing the total bandwidth needed.
The BlockTorrent algorithm does not really address how the actual transaction data will be obtained because, once the leech has the list of txn hashes, the standard Bitcoin p2p protocol can supply them in a parallelized and decentralized manner.
submitted by ydtm to btc [link] [comments]
