They're not easily comparable, since quantum complexity refers to the upper bound on queries to some black-box "quantum oracle", which exists at the gate level. A classical parallel would perhaps be queries to a translation lookaside buffer, which can be regarded as a gate-level black box (I'm sure people who actually work in hardware already smell blood!).
By contrast, classical complexity, as in sorting algorithms, is reasoned about in higher-level programming languages, whose operational cost is hard to describe down to the bit level.
I'm curious whether anyone more knowledgeable could argue, one way or another, whether this is a boon to the quantum-computers-will-probably-be-faster-in-realization camp.
I am also relatively new to this, but I believe there is an apples-to-apples comparison with the number of gate operations required to solve an instance of a problem.
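To make one half of that comparison concrete, here's a minimal sketch (my own framing and numbers, purely illustrative) of the query-count side for unstructured search: classical brute force needs about N oracle queries in the worst case, while Grover's algorithm needs about (pi/4)*sqrt(N). Per-query gate cost and error-correction overhead are a separate story.

    # Query-complexity comparison for unstructured search over N items.
    # Classical brute force: ~N oracle queries worst case.
    # Grover's algorithm: ~(pi/4) * sqrt(N) oracle queries.
    # (Per-query gate cost and error correction are not counted here.)
    import math

    for n_bits in (20, 40, 60):
        N = 2 ** n_bits
        print(f"N = 2^{n_bits}: classical ~{N:.1e} queries, "
              f"Grover ~{math.pi / 4 * math.sqrt(N):.1e} queries")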
It's an approximation algorithm. It's not about the runtime; it's about the worst-case or average-case approximation quality achievable within a polynomial runtime budget.
The 'general' TSP (unless P=NP) does not admit a worst-case constant-factor approximation, at least not on classical machines (we're still working on the quantum PCP theorem).
So while I'm not immediately familiar with the approximation algorithm you're suggesting (unless it is Christofides [1]), it seems unlikely that it would produce a good approximation for the TSP. It might still perform well in the average case over random instances. I'll admit I haven't delved deeply into the average-case literature on the TSP, but I don't believe there are currently any favorable results. There are certainly no Overlap Gap Property results about the TSP. If I wanted to prove something about the TSP in the average case, I'd definitely be starting there.
[1]: If you were thinking about Christofides: Christofides is indeed a 3/2-approximation, but not for the general TSP. Instead, it approximates a restriction of the TSP known as the Metric TSP, where distances must satisfy the triangle inequality. That extra structure makes the problem much, much easier to approximate. As such, Christofides is an approximation algorithm for an approximate version of the problem, which I think is pretty neat. If I'm remembering correctly, the inapproximability results for the Metric TSP are rather favorable.
In grad school, I wrote a paper as a homework assignment for a course, reviewing a paper on the idea of using a minimum spanning tree for the traveling salesman problem. The paper did have some math addressing how close the idea was to optimality. That was a long time ago, and I no longer have the reference or the homework! From a quick search, there is now
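For reference, the textbook bound for that idea: on metric instances, the MST weight lower-bounds the optimal tour (delete one edge of the optimal tour and you get a spanning tree), and a preorder walk of the MST with shortcutting costs at most twice the MST weight, so the tour is at most 2x optimal. A minimal self-contained sketch in Python, with a made-up 4-city distance matrix that satisfies the triangle inequality:

    # MST-doubling heuristic for the Metric TSP: build an MST, then take a
    # preorder walk with shortcuts. Tour length <= 2 * MST weight <= 2 * OPT.
    import math

    # Hypothetical 4-city metric (triangle inequality holds).
    dist = [
        [0, 2, 7, 6],
        [2, 0, 6, 4],
        [7, 6, 0, 5],
        [6, 4, 5, 0],
    ]
    n = len(dist)

    # Prim's algorithm: grow an MST from city 0, recording the tree structure.
    in_tree = [False] * n
    parent = [0] * n
    best = [math.inf] * n
    best[0] = 0
    children = {v: [] for v in range(n)}
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and dist[u][v] < best[v]:
                best[v] = dist[u][v]
                parent[v] = u

    # Preorder walk of the MST; skipping already-visited cities is the
    # "shortcut", which never lengthens the walk under the triangle inequality.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    tour.append(0)  # return home

    length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
    print("tour:", tour, "length:", length)

(Christofides improves the factor from 2 to 3/2 by adding a minimum-weight matching on the odd-degree MST vertices instead of doubling.)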
I think it's more like trying to find the highest mountain in a range of mountains, except there's too much fog, so you can't see how high the other mountains are compared to the one you're on; you can't tell whether you're on the highest mountain until you've climbed them all.
A lot of the non-quantum heuristic approaches leave you isolated on one of a few mountains (local maxima), because simply climbing back down and hoping to find another peak is computationally expensive, and you know it will lead to a lot of suboptimal solutions before finding something comparable to, or hopefully even better than, the highest peak reached so far.
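To put the analogy in code: a toy random-restart hill climber on an invented one-dimensional "mountain range" (the objective function and all parameters here are made up for illustration). Each climb strands you on a local peak; only repeated restarts, i.e. climbing back down and starting over somewhere random, eventually find the global one.

    # Toy illustration of the "isolated on one peak" problem: each climb
    # stops at a local maximum; restarts are the expensive escape hatch.
    import math
    import random

    def height(x):
        # Made-up "mountain range" with several local maxima.
        return math.sin(x) + 0.6 * math.sin(3 * x) + 0.3 * math.sin(7 * x)

    def climb(x, step=0.01):
        while True:
            here = height(x)
            left, right = height(x - step), height(x + step)
            if left <= here >= right:
                return x, here  # local peak: fog hides everything else
            x = x - step if left > right else x + step

    random.seed(0)
    peaks = [climb(random.uniform(0, 2 * math.pi)) for _ in range(20)]
    best = max(peaks, key=lambda p: p[1])
    print(f"best peak found: x = {best[0]:.3f}, height = {best[1]:.3f}")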
The issue is that, apart from Shor's algorithm, it is hard to find a quantum algorithm substantially faster than a classical one.
Moreover, since the cost of qubits scales exponentially, a slight reduction in the degree of a polynomial-time algorithm is not enough to offer a practical benefit.
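A back-of-the-envelope illustration (the overhead figures are invented): if a quantum step costs C times a classical step due to error-correction overhead, a quantum n^1.5 algorithm only beats a classical n^2 algorithm once C * n^1.5 < n^2, i.e. once n > C^2.

    # Invented per-step overhead factors C, purely illustrative.
    # Quantum n^1.5 beats classical n^2 only when C * n^1.5 < n^2,
    # i.e. when the problem size n exceeds C^2.
    for C in (1e3, 1e6, 1e9):
        print(f"overhead C = {C:.0e}: quantum wins only for n > {C ** 2:.0e}")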
I.e., that for n logical qubits, decoherence time goes down polynomially rather than as e^(-a n)? AFAIK (and yes, I could be mistaken!) what we deal with is reducing a substantially.
“Substantially better” is doing the heavy lifting there. Grover's gets you a square-root speedup. Double the size of your hash function (SHA-512 instead of SHA-256) and you're protected.
Because there is a different square-root speedup concern regarding the birthday paradox / collision resistance. This combines with Grover-style search to find a collision in about O(N^(1/3)) work, leaving a 256-bit hash with only ~85 bits of security against a quantum adversary.
You'd need to switch to a 384-bit hash function to keep a 128-bit security threshold. In practice, this means SHA-512.
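The arithmetic behind those numbers, assuming only the standard generic attacks (Grover for preimages, Brassard-Hoyer-Tapp for collisions):

    # Generic attack costs for an n-bit hash, in bits of security:
    #   classical preimage: n      Grover preimage: n/2
    #   classical collision: n/2   quantum collision (BHT): n/3
    for n in (256, 384, 512):
        print(f"{n}-bit hash: Grover preimage {n // 2} bits, "
              f"quantum collision ~{n // 3} bits")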
Factoring hasn't been shown to be NP-complete. It may be the case that it's in P (and we just don't have an algorithm yet), or it may be the case that it's an NP-intermediate problem.
I think it's clear that factoring works if you can build a scalable quantum computer. But:
1. It's a very lonely example.
2. It's not clear how big the market is for "you can break encryption from the past, but not current encryption, because once QCs arrive, everyone will obviously move away from QC-breakable encryption". And I'm not aware of any other reason why factoring large numbers would be useful beyond breaking encryption.
I think the goal here is to find more useful algorithms that benefit from quantum computing. Being able to factor products of large primes efficiently is only useful for the sake of the cryptographers' job security :-)
Are you talking about the skepticism from Kalai, or the more general skepticism (from a wider group of people, which is more about its application to optimization problems, not prime factoring)?
I’m exaggerating a lot here, but one concern people have with quantum computing is: what if it breaks all cryptography but doesn’t help us solve “useful” problems?
The “what if” is pulling a lot of weight here, but I hope it makes my point.
That statement alone does not paint the whole picture. It also has to be better in terms of some metric (e.g. accuracy, computation time, size of molecules that can be simulated) when compared to classical methods.
It makes accurate simulation of relevant quantum-mechanical systems (e.g. molecules) possible at all. DFT simulations are full of approximations made just so they can run on classical supercomputers. Nobody does a full solution of Schrödinger's equation. With a sufficiently sized quantum computer, you could.
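For a sense of scale (simple arithmetic, nothing more): just storing the state vector of n interacting two-level systems takes 2^n complex amplitudes, so exact classical simulation runs out of memory long before chemically interesting sizes.

    # Memory to store a full n-qubit (or n-spin) state vector:
    # 2^n complex amplitudes at 16 bytes each (double precision).
    for n in (30, 50, 100):
        gib = (2 ** n) * 16 / 2 ** 30
        print(f"n = {n}: {gib:.3g} GiB")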
I guess that would be about asymmetric encryption, which today is indispensable for a lot of infrastructure. Still, there's symmetric encryption, which is mostly untouched by Shor's algorithm.
QKD also needs pre-shared authentication keys to prevent MITM attacks. It behaves almost exactly like a classical stream cipher, except that it cannot be packet-routed, is many orders of magnitude slower, and is prone to side-channel attacks.