I'm hopeful that future cryptocurrencies won't be so energy-intensive to mine. Bitcoin is already a non-negligible contributor to global CO2 emissions, believe it or not.
Mining is generally considered to be inherently power hungry but it need not be. It’s a consequence of making the proof of work computationally intensive. If computation is minimized in favor of random access to gigabytes of memory (incurring long latencies), then mining will require large investments in RAM but relatively little power.
Cuckoo Cycle represents a breakthrough in three important ways:
1) it performs only one very cheap siphash computation for about 3.3 random accesses to memory,
2) its memory requirement can be set arbitrarily and doesn't allow for any time-memory trade-off.
3) verification of the proof of work is instant, requiring 1 sha256 and 42 siphash computations.
Runtime in Cuckoo Cycle is completely dominated by memory latency. It promotes the use of commodity general-purpose hardware over custom designed single-purpose hardware.
Other features:
4) proofs take the form of a length-42 cycle in the Cuckoo graph (a rough verification sketch follows this feature list).
5) it has a natural notion of (base) difficulty, namely the number of edges in the graph; above about 60% of size, a 42-cycle is almost guaranteed, but below 50% the probability starts to fall sharply.
6) running time for the current implementation on high end x86 is under 24s/GB single-threaded, and under 3s/GB for 12 threads.
7) making Cuckoo Cycle use a significant fraction of the typical memory of a botnet computer will send it into swap hell, and likely alert its owner.
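For the curious, here is a rough toy-scale sketch of what checking such a proof might look like. It is not the reference implementation: the real verifier derives its siphash key from the (sha256-hashed) header and needs only 42 siphash computations, whereas this sketch substitutes blake2b as a stand-in hash, spends two hash calls per nonce, and invents the `edge_endpoints`/`verify` helpers and the toy graph size purely for illustration.

```python
import hashlib

PROOF_SIZE = 42            # required cycle length
NODES = 1 << 20            # nodes per partition; toy value, not the real default

def edge_endpoints(header: bytes, nonce: int):
    """Map an edge nonce to one node in each partition.
    Real Cuckoo Cycle uses keyed siphash here; blake2b is only a stand-in."""
    def h(tag: int) -> int:
        d = hashlib.blake2b(header + bytes([tag]) + nonce.to_bytes(4, "little")).digest()
        return int.from_bytes(d[:4], "little") % NODES
    return h(0), h(1)

def verify(header: bytes, nonces: list[int]) -> bool:
    """Check that the 42 edges named by `nonces` form a single 42-cycle."""
    if len(nonces) != PROOF_SIZE or len(set(nonces)) != PROOF_SIZE:
        return False
    adj = {}                                    # adjacency over (side, node) pairs
    for n in nonces:
        u, v = edge_endpoints(header, n)
        adj.setdefault(("U", u), []).append(("V", v))
        adj.setdefault(("V", v), []).append(("U", u))
    if any(len(nbrs) != 2 for nbrs in adj.values()):
        return False                            # every node on the cycle has degree 2
    # Walk edge to edge; a valid proof returns to the start in exactly 42 steps.
    start = cur = next(iter(adj))
    prev, steps = None, 0
    while True:
        nxt = adj[cur][0] if adj[cur][0] != prev else adj[cur][1]
        prev, cur = cur, nxt
        steps += 1
        if cur == start or steps > PROOF_SIZE:
            break
    return cur == start and steps == PROOF_SIZE
```

The point of feature 6) is that a check like this touches only the 42 proof edges, so verification is instant even though finding the cycle requires gigabytes of memory.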
The next complaint will be that cryptocurrencies are so memory-intensive that RAM production is a non-negligible contributor to global CO2 emissions.
What's the defense against someone with a lot of money who can buy an ASIC farm, or a memory farm, or whatever kind of farm the computation needs? How much money should the chip builders charge for that? How much energy can be used in the production of the critical item?
Essentially, if 50 BTC cost ~$25,000, creating them will end up burning roughly $25,000 worth of petrol, i.e. about 250 barrels. If you burn only 200 barrels, someone will find a faster method that burns 201 and is still profitable. (You can choose to burn them during mining, during chip fabrication, during chip design, or any combination of those.)
There could be an ASIC for that. There's a new memory interface that works at 10 TB/sec with 16 GB of memory[1]. Currently there's no CPU/GPU architecture in planning that's even close to needing that much memory bandwidth, and the manufacturing is expensive.
All this means this tech won't fit well with decentralization.
Bandwidth and latency are two separate concepts. That 10 TB/sec of bandwidth probably drops to something like 100 MB/sec if you need truly random access to individual bytes of 16 GB of RAM and have to read each byte to find out which byte you need next. It's partly a speed-of-light issue: the round trip back and forth to the individual RAM chips adds an inherent delay, so RAM designers don't really focus on reducing latency. You're stuck with fairly high latency anyway, so bandwidth is generally the more useful thing to improve.
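A quick way to convince yourself of the difference: the pointer-chasing toy below (plain Python; the table size and step count are arbitrary choices of mine) makes every read depend on the previous one, which is exactly the access pattern that defeats prefetching and leaves you paying full latency per access. Interpreter overhead dominates here, so treat the printed number as illustrative only.

```python
import random
import time

SIZE = 1 << 22        # ~4M entries; big enough to blow out CPU caches

# Sattolo's algorithm: an in-place shuffle that yields a single-cycle
# permutation, so chasing table[i] visits every slot before repeating.
table = list(range(SIZE))
for i in range(SIZE - 1, 0, -1):
    j = random.randrange(i)       # j < i is what forces one big cycle
    table[i], table[j] = table[j], table[i]

STEPS = 2_000_000
i = 0
start = time.perf_counter()
for _ in range(STEPS):
    i = table[i]                  # the next address depends on the value just read
elapsed = time.perf_counter() - start
print(f"~{elapsed / STEPS * 1e9:.0f} ns per dependent access (interpreter overhead included)")
```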
How do you make something incapable of being energy-intensive but still difficult? The more computers have to work at something, the more energy they use. I'm not the world's leading expert, but I don't see a way around that.
There is being energy intensive, and then there is being CPU intensive.
You could imagine a hypothetical coin that somehow uses proof of data storage and retrieval instead of proof of work. Producing and running tons of harddrives would still require power, but it would be a less.. direct.. requirement.
I'm Ripple Labs' Head of Tech Ops and I have to say I think it's amazing. Additionally, there's a program Ripple Labs are running which grants XRP for computing power donated to World Community Grid. So if you want to spend computing power you can, while doing some good. https://www.computingforgood.org/
I wonder if the energy spent moving/securing bitcoin is more or less than the energy we spend moving/securing cash/credit. Driving armored trucks full of cash and coins around isn't free. Nor is it free to keep the regular banking system's electronic infrastructure running.
Even though it's the dead of winter and we just had a major snow storm roll through, my furnace hasn't turned on in days. My mining equipment provides sufficient heat to keep my house warm on its own.
Have you tried to calculate whether you make more money from mining than you would save by switching to a more cost-effective way of heating your home that doesn't use electricity?
I'm not sure I understand your question. My mining equipment is profitable on its own; the fact that it totally eliminates my need to burn natural gas to heat my house is just an extra fringe benefit, to the tune of $100/month in the winter months.
The assumption is that heating with gas/wood/coal or whatever is cheaper than heating with electricity. Does the mining profit you quote account for that? Your point stands, though.
The problem with a lot of cryptocurrencies is that the mining is difficult and wasteful; you're doing a lot of computation that only makes economic sense if a lot of other people are also doing it, and it's plain that many of those people are trying to do it using someone else's equipment, which is a form of theft (specifically, conversion).
I like DogeCoin (much shibe) but this is still a Bad Thing. It would be much better if the problem being solved to create a blockchain were also useful in some other context, eg protein folding. Of course, some such problems are handled better than others by computers; protein folding is something that humans seem to do well but for which we haven't managed to develop great algorithms yet.
But if part of the work can be done in the human brain rather than in a CPU/GPU/ASIC, is that bad? The current computation model basically means that mining rewards go to whoever invests the most capital, i.e. has the most money to throw at the problem. This is inherently inequitable, because it means people with more money, or a willingness to appropriate the equipment of others, will mine/dig/discover more coin. Of course, the argument is that since mining is rather random, in theory you can start mining on your small CPU tomorrow and hit some coin, or get a random amount of coin for a fixed amount of mining. But random rewards devalue expertise and experience in favor of an unnaturally even probability distribution. I really think the computational abstraction is a barrier to cryptocurrency adoption, because ultimately your economic incentive is only as good as people's adoption of a sunk cost fallacy.
Here's a back-of-the-envelope calculation. In theory (by a supply-and-demand argument), mining bitcoins should yield only a negligible profit. Each bitcoin is worth about $600. If that assumption holds, then for the roughly $2,160,000 worth of bitcoins that come into existence every day, about $2,160,000 of electricity is used as well. That's about 18,000,000 kWh per day, or about 9,000,000 kg of CO2 per day (9,000 metric tonnes). That works out to about 3,285 thousand metric tonnes per year, roughly the entire annual CO2 output of a country such as Papua New Guinea.
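Spelling the same back-of-the-envelope numbers out (the electricity price and grid carbon intensity below are round-number assumptions implied by the figures above, not measured data):

```python
btc_price   = 600                 # USD per BTC, the figure assumed above
btc_per_day = 25 * 6 * 24         # 25 BTC per block, ~6 blocks per hour
revenue_per_day = btc_price * btc_per_day                   # ~ $2,160,000

usd_per_kwh    = 0.12             # assumed average electricity price
co2_kg_per_kwh = 0.5              # assumed grid carbon intensity

kwh_per_day         = revenue_per_day / usd_per_kwh         # ~ 18,000,000 kWh
co2_tonnes_per_day  = kwh_per_day * co2_kg_per_kwh / 1000   # ~ 9,000 t
co2_tonnes_per_year = co2_tonnes_per_day * 365              # ~ 3,285,000 t

print(f"${revenue_per_day:,.0f}/day -> {kwh_per_day:,.0f} kWh/day "
      f"-> {co2_tonnes_per_year:,.0f} t CO2/year")
```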
>If that assumption holds, then for the roughly $2,160,000 worth of bitcoins that come into existence every day, about $2,160,000 of electricity is used as well.
I have some first-gen ASIC miners and electricity is 20% of revenues at the current price and difficulty. The current-gen miners that are contributing to most of the increase in hash power are even more efficient and many large scale operations have cheap electricity.
The assertion is that there's roughly enough miners that miners make only a little profit. If your electricity costs $80 and you're likely to make $100, you're going to do it, and the OP is making an assumption that enough people have made that calculation that the amount of money made per miner is close to the cost of electricity per miner.
Therefore, the cost of electricity used should be close to the value of the bitcoins produced.
A new block gets mined every 10 minutes, each with a 25BTC bounty. That's 600 * 25 * 6 * 24 = $2,160,000 per day. No rational person would spend more in electricity than the reward, hence the 1-to-1 conversion. (In fact, miners do spend more than they earn, collectively.)
That's based on the idea that if bitcoin mining is profitable, more people will get in on it until it eventually becomes break even with the cost of doing it.
This is of course a big simplification, and neglects things like the upfront cost of building the mining equipment.
How so? It would seem to me to say nothing on the issue. The added hardware may be a mix of profitable and non-profitable machines. The question is where is the equilibrium now and over time?
I can't offer a strict logical proof, but I very much doubt that the hash rate is doubling month over month due to unprofitable hardware being brought online.
Most of the numbers floating around the internet and nearly every mainstream news piece about bitcoin's energy consumption are wrong. They cite an outdated statistic maintained by blockchain.info that assumed everyone was using a GPU even when nearly all of the network was ASICs.
I would estimate the current network (25Ph/s) averages around 2-5W per Gh/s, leading to 50-125MW total power use. bbosh in this thread is off by an order of magnitude.
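For reference, that estimate is a single multiplication; the 25 Ph/s network rate and the 2-5 W per GH/s efficiency range are the assumptions stated above:

```python
network_ghs = 25e6                 # 25 Ph/s expressed in GH/s
for watts_per_ghs in (2, 5):
    megawatts = network_ghs * watts_per_ghs / 1e6
    print(f"{watts_per_ghs} W/GH -> {megawatts:.0f} MW")   # prints 50 MW and 125 MW
```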
I haven't cited any statistic from blockchain.info, and haven't considered hardware at all. All I have done is employ economic argument, which I think makes sense. It doesn't matter whether or not particular individuals are using ASICs. All that matters is that, collectively, they are making only a tiny profit (if any). This argument doesn't require consideration of hash-rate or particular hardware at all.
While a dogecoin miner is almost exactly as efficient an electrical heater as a standard resistive electrical heater, they are both less efficient than heat pumps (or in specific cases, such as heating a single person without having to first heat a very large and cold room, they are both less efficient than radiative electric heaters).
And of course electric heating is typically more expensive than natural gas, fuel oil, coal, municipal steam, etc... If you are turning down the natural gas and turning up the electric heaters, you are burning through money.
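To put rough numbers on that, here is a cost-per-kWh-of-heat comparison; the prices and efficiencies are illustrative assumptions, not quotes:

```python
usd_per_kwh_elec  = 0.12          # assumed electricity price
usd_per_therm_gas = 1.00          # assumed natural gas price; 1 therm ~ 29.3 kWh of heat

cost_per_kwh_heat = {
    "mining rig / resistive heater (COP ~1)": usd_per_kwh_elec / 1.0,
    "heat pump (COP ~3)":                     usd_per_kwh_elec / 3.0,
    "gas furnace (~90% efficient)":           usd_per_therm_gas / (29.3 * 0.9),
}
for name, cost in cost_per_kwh_heat.items():
    print(f"{name}: ${cost:.3f} per kWh of heat delivered")
```

Under those assumptions the rig only comes out ahead of gas or a heat pump if the mining revenue more than covers the roughly 3x premium per unit of heat.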
The problem isn't individual rigs in private dwellings; it's that entire data centers are being built purely to mine crypto currencies. At that scale heat dissipation itself becomes an energy expending task.
Energy consumption is precisely what gives proof of work its utility. It provides the economic cost to overturning ledger commitments which is at the core of bitcoin's security guarantees.
I would like to see folded coins.. That is, protein folding. I'm definitely not qualified to suggest this, but it seems like proteins are kind of like hash values in their one way process structure.. I wonder if protein foldings are finite in number like bitcoins?
And it's not a clone of Bitcoin, so there's no reason to presume it isn't full of bugs and security exploits. Bitcoin is king for a reason: it has survived the scrutiny. Any altcoin that chooses to do a complete rewrite is in for a long, hard road to come anywhere close to the trust Bitcoin has earned.
When xkcd's April Fools comic ran this year[1] I was doing some Real Work on a few of the engineering research servers at my university. When I noticed the jobs were going much slower than usual I popped open "top". Lo and behold, there were 20 instances of a program named 'xkcd', hashing violently, collectively using 90% of the CPU resources. I checked each of the other machines, and they too had been invaded by 'xkcd' daemons. Knowing that it would be over the next day, and that my job could wait, I figured I wouldn't try making trouble for the (presumably) undergraduate comic enthusiasts, but it sure was annoying. I found out later through reddit that there were at least three independent groups of students running the 'xkcd' hashing script on the servers. (A few had also spun up EC2 instances, so presumably they had a bit more skin in the game.)
If nerds will do this for nothing but internet points, it doesn't surprise me that semi-public resources will be exploited when there's a monetary incentive involved.
If cryptocurrencies become mainstream, this will be a bigger problem, I think. Even things like "free" electricity are attractive for mining purposes, e.g. at universities, businesses, etc. If every student set up mining, it adds up (at the university's / business's expense).
If I am the one paying for an office at $500 a month that includes all electricity usage, is it fair to plug in a ton of mining hardware and profit / subsidize myself? What if I manage to use more than $500 worth of electricity?
Is shared space even set up for monitoring individual renters' usage behavior? I don't think so.
edit: note, I do mine Doge but with my own stuff at home
There is a new extremely "green" office building in Seattle that does meter/monitor individual electricity plugs. It's uncommon, but not entirely unprecedented.
I've actually caught wind of schemes in-flight exactly like you describe, using mining to transfer money from the business to the owner, essentially off the books.
Is it fair? That's a moral question. Kinda feels like stealing to me. And, anyway, if you were using electricity at a large enough scale to make this interesting, somebody would likely notice.
For me, a good rule of thumb is that if you feel you need to hide what you're doing from your landlord/employer/etc, then you're probably doing something wrong.
but in a large building / campus, they can't possibly monitor electricity usage per individual or company occupying the space without upgrading the infrastructure significantly.
And things like a class of students sitting in a lecture theater with their laptops plugged in, are they mining or gaming or maybe just some hung process is in an infinite loop taking the CPU to 100%?
(I'll admit that I've used a "heater.exe" for warming my hands with a laptop on a cold day...)
When Google was young, they negotiated an extreme power allowance from their colo facility, and then engineered their systems to maximize their usage, including packing in custom hardware that used far more power per rack unit than standard equipment. It is unclear if the salesperson who signed the contract knew what Google was buying.
(and incoming bandwidth allowance -- yay, web crawler)
I'm currently "ignoring" a similar effort in my lab (~20 workstations). I don't think they've broken a MHash/s yet out of an estimated potential ~2-3MH.
This was a young researcher whose access to the cluster was never revoked when they left a few years ago.
Original e-mail Harvard sent last week:
------
Dear all,
I really hate having to send notes like this to our community -
especially one as smart, gifted and talented as you all are, but
anyway here goes...
Yesterday we were alerted to an unfortunate situation by one of our
community members using the cluster who spotted an anomaly
with a set of compute nodes.
Long story short, a "dogecoin" (bitcoin derivative) mining operation
had been set up on the cluster consuming significant resources
in order to participate in a mining contest.
I do want to also quickly state that we do not
inspect, examine or look at algorithms and codes that are executing on
the cluster, we respect your science and assume we are all good
citizens. However, in the course of business, or as happened
yesterday, if we are alerted to unexpected behavior we always
investigate the cause of any issue.
So, to put this simply:
Harvard resources can not be used for any
personal or private gain or any non research related activity.
Accordingly, any participation in "Klondike" style digital mining
operations or contests for profit requiring Harvard owned assets to
examine digital currency key strength and length are strictly
prohibited for fairly obvious reasons. In fact, any activities using
our shared resources for any non scientific purpose that results or
does not actually result in personal gain are also clearly and
explicitly denied.
As a result, and as guidance and as warning to you all, I do need to
say that the individual involved in this particular operation no
longer has access to any and all research computing facilities on a
fully permanent basis.
This kind of use was sorta obvious since the dawn of Bitcoin (and even before; in my high school people were running SETI@HOME screen savers on all school computers to build up ranking points for their team). I'm not sure why so many people act surprised.
Would it really surprise anyone if it was revealed that stuff like this already happened with Bitcoin? Seriously, give a bunch of hackers with access to racks full of Other People's Machines, plus an obvious way to turn cycles into money, and the only surprise would be if they didn't take advantage of the opportunity. I for one expect this to be the norm for every cryptocurrency from now until the end of time. The only question is how much of it the rest of the world is willing to tolerate.
While I think there are many interesting properties of cryptocurrency and the potential applications, I also can't help but think that this mining is just an extreme waste of computing power; compared to other uses like research, application development, hosting a website, even gaming (entertainment), mining (use for heating aside) looks like it's not much more than using computing power for the sake of using computing power, which I find extremely disturbing.
I have heard people say that scrypt is better than SHA256 as a basis for cryptocurrency because it doesn't put all the network power in specialist ASIC boxes, but then stuff like this happens.
Do you think Harvard would have paid for a custom-built scrypt-coin miner if someone wrote a real academic proposal for it?
I don't see why not. Custom-built just means grabbing consumer-grade GPUs, the "best" of which is currently the R9 270 at ~$200 USD each. You don't need a huge budget, so for academic purposes, why not? Plus, there are many scrypt coins out there (or they could even create a harvardcoin) that they could use for research purposes.
They don't necessarily need to use the mainstream, profitable ones.
From my understanding, scrypt ASICs have been held back by the cost of memory hardware as scrypt requires more memory usage. There has been talk of forking existing scrypt currencies to require even more memory to counter ASICs further but so far I think only a few new coins have done this.
Its memory requirement of 128 KB is a compromise between computation-hardness for the prover and verification efficiency for the verifier.
You don't want verification of a proof-of-work to take a lot of resources, since every client has to perform it.
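Concretely, scrypt's memory use is about 128 * N * r bytes, so the Litecoin-style parameters (N=1024, r=1, p=1) land at roughly 128 KiB. A minimal sketch using Python's hashlib.scrypt (available when the underlying OpenSSL has scrypt support; the header bytes below are a placeholder, not real consensus data):

```python
import hashlib

header = b"placeholder block header bytes"     # stand-in input for illustration

# Litecoin-style parameters: N=1024, r=1, p=1, 32-byte output,
# with the header itself used as the salt.
digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

approx_kib = 128 * 1024 * 1 / 1024             # 128 * N * r bytes
print(digest.hex())
print(f"approximate memory required: {approx_kib:.0f} KiB")
```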
As someone who's used the Odyssey cluster extensively, I can add that the most nodes I've ever run concurrently is ~500. While there are ~4000+ nodes on the cluster, there's no way any single user could run that many simultaneously.
EDIT: Reversible computing (http://en.wikipedia.org/wiki/Reversible_computing) is a possible way to have computationally difficult proof-of-work while minimizing energy consumption.