The problem with a lot of cryptocurrencies is that the mining is difficult and wasteful; you're doing a lot of computation that only makes economic sense if a lot of other people are also doing it, and it's plain that many of those people are trying to do it using someone else's equipment, which is a form of theft (specifically, conversion).
I like DogeCoin (much shibe), but this is still a Bad Thing. It would be much better if the problem being solved to create a blockchain were also useful in some other context, e.g. protein folding. Of course, some such problems are handled better than others by computers; protein folding is something that humans seem to do well but for which we haven't managed to develop great algorithms yet.
But if part of the work can be done in the human brain rather than in a CPU/GPU/ASIC, is that bad? The current computation model basically means that mining rewards go to whoever invests the most capital, i.e. whoever has money to throw at the problem. This is inherently inequitable, because it means people with more money, or a willingness to appropriate the equipment of others, will mine/dig/discover more coin. Of course, the argument is that since mining is rather random, in theory you can start mining on your small CPU tomorrow and hit some coin, or get a random amount of coin for a fixed amount of mining. But random rewards devalue expertise and experience in favor of an unnaturally even probability distribution. I really think the computational abstraction is a barrier to cryptocurrency adoption, because ultimately your economic incentive is only as good as people's adoption of a sunk cost fallacy.
Here's a back-of-the-envelope calculation. In theory (by the supply-demand argument), you would make a negligible profit mining bitcoins. Each bitcoin is worth about $600. If our supply-demand assumption is true, then the roughly $2,160,000 worth of bitcoins that come into existence every day also means about $2,160,000 of electricity used. That's about 18,000,000 kWh per day (at roughly $0.12 per kWh), or about 9,000,000 kg of CO2 per day (9,000 metric tonnes, at roughly 0.5 kg of CO2 per kWh). That's about 3,285,000 metric tonnes per year, or about the entire annual CO2 output of a country such as Papua New Guinea.
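Spelling that out (the electricity price and the emissions factor are my assumptions, chosen because they're the rates that make the figures above work out):

    # Back-of-envelope version of the above. The price and emissions factor
    # are assumptions, not measured numbers.
    daily_electricity_usd = 2_160_000   # ~value of bitcoins mined per day
    usd_per_kwh = 0.12                  # assumed average electricity price
    kg_co2_per_kwh = 0.5                # assumed grid emissions factor

    daily_kwh = daily_electricity_usd / usd_per_kwh        # ~18,000,000 kWh/day
    daily_tonnes = daily_kwh * kg_co2_per_kwh / 1000       # ~9,000 t CO2/day
    yearly_tonnes = daily_tonnes * 365                     # ~3,285,000 t CO2/year

    print(f"{daily_kwh:,.0f} kWh/day")
    print(f"{daily_tonnes:,.0f} t CO2/day, {yearly_tonnes:,.0f} t CO2/year")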
>If our supply-demand assumption is true, then the roughly $2,160,000 worth of bitcoins that come into existence every day also means about $2,160,000 of electricity used.
I have some first-gen ASIC miners, and electricity is 20% of revenues at the current price and difficulty. The current-gen miners that are contributing most of the increase in hash power are even more efficient, and many large-scale operations have cheap electricity.
The assertion is that there are enough miners that each one makes only a small profit. If your electricity costs $80 and you're likely to make $100, you're going to do it, and the OP is assuming that enough people have made that calculation that the amount of money made per miner is close to the cost of electricity per miner.
Therefore, the cost of electricity used should be close to the value of the bitcoins produced.
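Here's a toy sketch of that equilibrium with a made-up per-rig electricity cost; the only point is that miners keep joining until per-miner revenue is pushed down to the electricity bill, so total electricity spend approaches the total reward:

    # Toy model: a fixed daily reward is split evenly among identical miners,
    # and miners keep joining while their share of the reward exceeds their
    # electricity bill. The $80/day per-rig cost is hypothetical.
    daily_reward_usd = 2_160_000
    electricity_per_miner_usd = 80.0     # hypothetical daily power bill per rig

    miners = 1
    while daily_reward_usd / (miners + 1) > electricity_per_miner_usd:
        miners += 1                      # another rig comes online

    print(f"{miners} miners at equilibrium")
    print(f"revenue per miner: ${daily_reward_usd / miners:,.2f}")
    print(f"total electricity: ${miners * electricity_per_miner_usd:,.0f} "
          f"vs total reward: ${daily_reward_usd:,.0f}")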
A new block gets mined roughly every 10 minutes, each one carrying a 25 BTC bounty. That's $600 * 25 * 6 * 24 = $2,160,000 per day. No rational person would spend more on electricity than the reward, hence the 1-to-1 conversion. (In fact, miners do spend more than they earn, collectively.)
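Or, with the units spelled out:

    # The same daily figure, broken down by unit.
    usd_per_btc = 600
    btc_per_block = 25
    blocks_per_day = 6 * 24              # one block roughly every 10 minutes

    btc_per_day = btc_per_block * blocks_per_day     # 3,600 BTC/day
    usd_per_day = btc_per_day * usd_per_btc          # $2,160,000/day
    print(f"{blocks_per_day} blocks/day, {btc_per_day:,} BTC/day, ${usd_per_day:,}/day")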
That's based on the idea that if bitcoin mining is profitable, more people will get in on it until it's eventually just break-even with the cost of doing it.
This is of course a big simplification, and neglects things like the upfront cost of building the mining equipment.
How so? It would seem to me to say nothing on the issue. The added hardware may be a mix of profitable and non-profitable machines. The question is where the equilibrium is now and where it will be over time.
I can't offer a strict logical proof, but I very much doubt that the hash rate is doubling month over month due to unprofitable hardware being brought online.
Most of the numbers floating around the internet and nearly every mainstream news piece about bitcoin's energy consumption are wrong. They cite an outdated statistic maintained by blockchain.info that assumed everyone was using a GPU even when nearly all of the network was ASICs.
I would estimate the current network (25 Ph/s) averages around 2-5 W per Gh/s, leading to 50-125 MW total power use. bbosh in this thread is off by an order of magnitude.
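Checking that estimate against the economic figure further up the thread (18,000,000 kWh per day is an average draw of about 750 MW):

    # Hardware-based estimate: hash rate times assumed efficiency.
    network_ghs = 25e6                   # 25 Ph/s expressed in Gh/s
    watts_per_ghs_low, watts_per_ghs_high = 2, 5   # assumed ASIC efficiency range

    low_mw = network_ghs * watts_per_ghs_low / 1e6    # 50 MW
    high_mw = network_ghs * watts_per_ghs_high / 1e6  # 125 MW
    print(f"hardware estimate: {low_mw:.0f}-{high_mw:.0f} MW")

    # Economic estimate from earlier in the thread: 18,000,000 kWh per day.
    economic_mw = 18e6 / 24 / 1e3        # kWh/day -> average kW -> MW
    print(f"economic estimate: {economic_mw:.0f} MW")  # 750 MW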
I haven't cited any statistic from blockchain.info, and I haven't considered hardware at all. All I have done is employ an economic argument, which I think makes sense. It doesn't matter whether or not particular individuals are using ASICs. All that matters is that, collectively, they are making only a tiny profit (if any). This argument doesn't require consideration of hash rate or particular hardware at all.
Mining is supposed to be difficult.