In their paper, Catherine McGeoch and her co-author Cong Wang write:
"...As a case in point, our second project compares the V5 hardware chip used in our first study to a V6 chip that became operational after the study was completed. V6 is three to five times faster than V5, and can solve problems as large as n = 502...."
In other words, in the time it took to set up the algorithm and perform the study, Moore's law had enabled a classical approach to go five times faster. I am a big supporter of anything that advances quantum computing, but one should never lose sight of Moore's law.
So, basically, as soon as someone builds one of those optimised for factoring the products of large primes, all our encryption methods are finished, as is the Bitcoin economy, right?
Assuming the contents of the article are true (which is possible), how long before that happens? A few years? A decade?
Pretty much: Quantum Computers (QCs) can factor numbers (Shor's algorithm) and calculate discrete logarithms in polynomial time. I don't know whether there's a quantum algorithm for hash computation that would let someone dominate the Bitcoin hashing chain with a QC, though.
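For the curious, here's a toy, purely classical sketch of the number-theoretic skeleton of Shor's algorithm; the quantum hardware's only job is the period-finding step, which this sketch does by (exponentially slow) brute force:

```python
# Toy sketch of the reduction inside Shor's algorithm. A quantum computer
# finds the period r exponentially faster; here we brute-force it, so this
# is NOT fast. Works for odd composite n that isn't a prime power.
import math
import random

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                 # lucky: a already shares a factor
        r = find_period(a, n)
        if r % 2 == 1:
            continue                 # need an even period
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                 # trivial square root of 1; retry
        return math.gcd(y - 1, n)

print(shor_factor(15))               # prints 3 or 5
```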
On the other hand, no one actually knows whether it's possible to build a quantum computer with enough q-bits that will stay coherent for long enough to carry out the calculation: as you add q-bits, noise becomes more and more of a problem. You can add error correction, but that makes keeping all the q-bits coherent harder because now you've got even more of them! At the moment, no-one knows (unless the NSA has built one and isn't telling!) which effect is going to win out as the number of q-bits increases.
(You can't use a smaller QC to simulate a slower version of a larger one, unlike in the non-quantum computing world: if you need to factor a 1024 bit number and you only have a 1000 bit QC you can't do it, as I understand things.)
Note that Quantum Computers don't help as much with symmetric encryption, unless someone comes up with a much better algorithm. Grover's search effectively halves the key length, which is a fairly big deal, but you can get back to the same level of difficulty by doubling the size of your keys: a 256-bit AES key provides roughly the protection of a 128-bit key in a quantum computing world. This is different from public-key encryption, where you can double the key size, but if your opponent has a quantum computer with enough q-bits, they can still break your new key in reasonable time.
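To put rough numbers on that (a back-of-the-envelope sketch; (pi/4) * sqrt(N) is the standard Grover query count):

```python
# Grover's search needs roughly (pi/4) * sqrt(N) quantum evaluations to
# find one item among N, versus ~N/2 classical guesses on average. For an
# n-bit key, N = 2**n, so the effective key length is halved.
import math

for bits in (128, 256):
    grover = (math.pi / 4) * math.sqrt(2**bits)
    print(f"{bits}-bit key: ~2^{bits - 1} classical tries, "
          f"~2^{math.log2(grover):.0f} Grover queries")
```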
As I understand things (and I'm not anything close to being an expert, just an interested outsider with enough of a physics education to be dangerous), you can add q-bits to cope with errors introduced by decoherence, but whether these codes work depends quite finely on the error rate; that's the content of the threshold theorem. So if you can manage to keep the error rate low enough, then everything will be fine.
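As a toy illustration of how finely it depends on the error rate, take the simplest possible code: a 3-fold repetition code with majority voting (purely classical, but the threshold logic carries over to quantum codes):

```python
# Error correction only helps below a threshold. With a 3-fold repetition
# code and majority vote, the encoded value is wrong when two or three of
# the copies fail, so encoding helps exactly when p < 1/2.
def logical_error_rate(p):
    return 3 * p**2 * (1 - p) + p**3

for p in (0.01, 0.1, 0.4, 0.5):
    print(f"physical {p:.2f} -> logical {logical_error_rate(p):.4f}")

# physical 0.01 -> logical 0.0003   (encoding helps enormously)
# physical 0.10 -> logical 0.0280   (still helps)
# physical 0.40 -> logical 0.3520   (barely helps)
# physical 0.50 -> logical 0.5000   (the threshold: no gain at all)
```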
If it turns out that the error rate depends (for physical reasons) on the size of the system, then beyond some size of QC, adding error-correcting q-bits will be counter-productive. Given that (in public) no-one has made a QC with more than a handful of coherent q-bits, no-one really knows where the limits are: it might well be that error correction lets you build arbitrarily large coherent QCs, or there might be insurmountable physical limits that prevent that from happening. I look forward to people finding out! It's a fascinating new experimental field of physics.
Not exactly. D-Wave quantum computers (like the one in the article) use "adiabatic quantum computing", which is more limited than normal quantum computing (it can't run every quantum algorithm). We're still a little ways off from breaking RSA.
Indeed; you can't build a quantum CPU with D-Wave's system.
It's conceptually more similar to a GPU: a specialized co-processor intended to solve a specific subset of (non-embarrassingly-parallel) problems in polynomial rather than exponential time, with the published number of qubits essentially being the amount of working memory available.
In practice these machines would probably be found optimizing certain steps in machine learning algorithms.
Well, actually, not all our encryption methods: there are asymmetric algorithms, such as lattice-based and hash-based schemes, that are believed to provide strong encryption even in the face of quantum computers. Unfortunately, the whole world is using RSA, which is vulnerable.
I don't know that much about quantum computing, but I believe it may be possible that D-Wave has managed to figure out how to build a useful quantum computer that is nevertheless not general-purpose enough to execute Shor's factoring algorithm.
When the FBI stops asking for backdoors in Internet services, we'll know they probably don't need them anymore. Then take into account that NSA is already probably 5-10 years ahead of them for any sort of surveillance technology.
Bitcoin is relatively safe. There's Grover's algorithm [1], which would effectively halve the strength of SHA-256 (a preimage search takes roughly 2^128 steps instead of 2^256), which is a problem, but not an impossible one. The majority of the miners plus the core devs can agree to upgrade the protocol to use a different hash algorithm, like scrypt (which Litecoin uses) or something specifically designed not to be vulnerable to quantum algorithms. (I don't know if scrypt is.)
This upgrade wouldn't need to happen overnight, either. I bet the first quantum computer that can compute SHA-256 hashes at all will do so more slowly than the latest classical ASICs of the time.
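To see why the hash is a pluggable component, here's a toy sketch of Bitcoin-style proof of work (made-up header and an easy difficulty; real mining hashes an 80-byte block header against a network-set target):

```python
# Toy proof of work: find a nonce so that the double SHA-256 of the data
# falls below a target. Swapping in a different hash function is, in code
# at least, a one-line change; the hard part is network consensus.
import hashlib

def pow_hash(data: bytes) -> int:
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(data).digest()).digest(), "big")

def mine(header: bytes, target: int) -> int:
    nonce = 0
    while pow_hash(header + nonce.to_bytes(8, "big")) >= target:
        nonce += 1
    return nonce

target = 2**240                      # toy difficulty: ~16 leading zero bits
nonce = mine(b"toy block header", target)
print(nonce)
```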
Bitcoin addresses are hashes of public keys, not the public keys themselves, so they are safe against a quantum factoring algorithm. When you spend money from a Bitcoin address, the public key does get written to the block chain, but to combat this you merely have to send your change to a new address and you'll be safe again.
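For concreteness, here's roughly how an address hides its key (a sketch that leaves out the version byte and Base58Check encoding real addresses add; note that ripemd160 availability depends on how your OpenSSL was built):

```python
# A Bitcoin address payload is HASH160(pubkey) = RIPEMD160(SHA256(pubkey)).
# Until the key's owner spends, only this hash appears on the block chain.
import hashlib

def hash160(pubkey: bytes) -> bytes:
    sha = hashlib.sha256(pubkey).digest()
    return hashlib.new("ripemd160", sha).digest()

pubkey = bytes.fromhex("02" + "11" * 32)   # made-up compressed public key
print(hash160(pubkey).hex())               # 20-byte address payload
```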
There's a theorist from ETH who has also been working with the D-Wave computers, and who visited Scott at MIT last week. There should be a new post about it soon; I'm sure it will be mostly cold water.
I wrote a bunch of words about this [1] when it surfaced 4 days ago, but this NS article contains a bit more info. I'm really excited to see this presentation now.
Yes, they've talked themselves up previously, but if they're willing to front up with evidence that they have something going on, I'm willing to give them the benefit of the doubt. After all, there's something to be said for a device that the combined intellect of Google is willing to sink money into.
Quoting her paper, she used the D-Wave computer to solve instances of three NP-hard problems: Quadratic Unconstrained Binary Optimization (QUBO), Weighted Maximum 2-Satisfiability (W2SAT), and the Quadratic Assignment Problem (QAP). She then compared the runtimes with those of current software libraries run on Intel Xeons.
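For anyone who hasn't met QUBO: it asks for the binary vector x minimizing x^T Q x. A brute-force toy solver looks like this; the 2^n blow-up of this loop is exactly what the D-Wave hardware is supposed to shortcut:

```python
# Brute-force QUBO: minimize x^T Q x over binary vectors x.
from itertools import product

Q = [[-1,  2,  0],                   # a made-up 3-variable instance
     [ 0, -1,  2],
     [ 0,  0, -1]]

def energy(x):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

best = min(product((0, 1), repeat=len(Q)), key=energy)
print(best, energy(best))            # -> (1, 0, 1) with energy -2
```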
Related: when I first heard about quantum computers, I thought a bit would be a continuous value between 0 and 1 (or 0V to 5V, or something like that). I'm still very curious whether such an analog computer could exist and be useful, for example for very quick raytracing. Although inaccurate (I assume), it could still be very useful.
I'm a bit confused by this article. Is it comparing computers or comparing algorithms? CPLEX is software to run linear and mixed-integer optimization algorithms. CPLEX is not a computer. Perhaps I misunderstood the article. Could you not run CPLEX on the D-Wave computer?
I doubt that I would be able to buy a D-Wave for $2,000 (or whatever the price of the "high-end desktop computer" they used in the comparison is).
This entire comparison seems just a joke to me.
I keep scratching my head over articles about the D-Wave. Is this ever going to be something we'll see in our homes or are too many qubits required for generalized, popular use?
My thoughts exactly. And I think 'more fair' is an understatement; I'd say this news is meaningless until they compare the D-Wave to an optimized FPGA solution.
Sure, but that's not the point here. The important thing is that this quantum computer actually WORKS. In two years classical computers will be maybe five times faster, but this quantum computer will be thousands of times faster (at these specific problems), so very soon any comparison will be irrelevant.
That a custom specialized computing device performed better than a desktop computer doesn't sound particularly compelling to me.
Aren't the D-Wave machines like supercomputers? Would you expect them to be that much faster than a desktop?
Sounds weird to me.