Quantum Speedup Found for Class of Hard Problems (quantamagazine.org)
74 points by EA-3167 50 days ago | hide | past | favorite | 33 comments



I really wish this article said what the big-O runtimes for the classical and quantum algorithms are


They're not easily comparable, since quantum complexity refers to the upper bound on queries to some black-box "quantum oracle", which exists at the gate level. A classical parallel would perhaps be queries to a translation lookaside buffer, which can be regarded as a gate-level black box (I'm sure people who actually work in hardware already smell blood!).

By contrast, classical complexity, as in sorting algorithms, is reasoned about in higher-level programming languages, whose operational complexity is hard to describe down to the bit-level.

I'm curious if anyone more knowledgeable could argue one way or the other whether this is a boon to the quantum-computers-will-probably-be-faster-in-realization camp.


I am also relatively new to this, but I believe there is an apples-to-apples comparison with the number of gate operations required to solve an instance of a problem.


It's an approximation algorithm. It's not about the runtime; it's about the worst- or average-case approximation quality achievable within a polynomial budget.


> It's an approximation algorithm.

Clear. Nice. Thanks.

In that case, for the Traveling Salesman problem, build a minimum spanning tree and traverse that -- old result.


The 'general' TSP (unless P=NP) does not admit a worst-case constant-factor approximation, at least not on classical machines (we're still working on the quantum PCP theorem).

So while I'm not immediately familiar with the approximation algorithm you're suggesting (unless it is Christofides[1]), it seems unlikely that it would produce a good approximation for the TSP. It might still perform well in the average-case over random instances. I'll admit I haven't delved deeply into that about the TSP, but I don't believe there are currently any favorable results. There are certainly no Overlap Gap Property related results about the TSP. If I wanted to prove something in the average case about the TSP, I'd definitely be starting there.

[1]: If you were thinking about Christofides: Christofides is indeed a 3/2-approximation, but not for the general TSP. Instead, it approximates a problem like the TSP, known as the Metric TSP. The Metric TSP introduces additional symmetries that make it much, much easier to approximate. As such, Christofides is an approximation algorithm for an approximate version of the problem, which I think is pretty neat. If I'm remembering correctly, the inapproximability results for the Metric TSP are rather favorable.


In grad school, wrote a paper as a homework assignment for a course, reviewing a paper on the idea of using a minimum spanning tree for the traveling salesman problem. The paper did have some math addressing how close the idea was to optimality. That was a long time ago, and now don't have the reference or the homework! From a fast search, now there is

https://www.geeksforgeeks.org/approximate-solution-for-trave...

https://en.wikipedia.org/wiki/Minimum_spanning_tree

Am reminded that the nodes to be visited must have distances that obey the triangle inequality, e.g., points in a plane.

Then when traversing the tree, when there is no arc in the tree to the next node, just leave the tree and go direct.
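For concreteness, here is a minimal sketch of that MST walk (my own illustration, not taken from the reviewed paper): build the minimum spanning tree, walk it in preorder, and whenever there is no tree edge to the next node, "go direct". Under the triangle inequality this shortcutting never lengthens the tour, which is what makes it a 2-approximation for the metric TSP.

```python
# MST-based 2-approximation for metric TSP (illustrative sketch).
# Assumes distances obey the triangle inequality, e.g. points in a plane.
import math
from collections import defaultdict

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mst_tsp_tour(points):
    n = len(points)
    # Prim's algorithm for the minimum spanning tree, rooted at node 0
    in_tree = {0}
    parent = {}
    best = {i: (dist(points[0], points[i]), 0) for i in range(1, n)}
    while len(in_tree) < n:
        i = min(best, key=lambda k: best[k][0])
        _, p = best.pop(i)
        in_tree.add(i)
        parent[i] = p
        for j in list(best):
            dj = dist(points[i], points[j])
            if dj < best[j][0]:
                best[j] = (dj, i)
    # Preorder DFS walk of the tree; skipping already-visited nodes is
    # the "go direct" shortcut, valid by the triangle inequality.
    children = defaultdict(list)
    for i, p in parent.items():
        children[p].append(i)
    tour, stack = [], [0]
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    return tour  # visit in this order, then return to tour[0]

pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = mst_tsp_tour(pts)
length = sum(dist(pts[tour[k]], pts[tour[(k + 1) % 4]]) for k in range(4))
print(tour, length)
```

On this unit-square example the walk happens to find the optimal tour of length 4; in general the guarantee is only a factor of 2.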


I think it's more like trying to find the highest mountain in a range of mountains, except there's too much fog, so you can't see how high other mountains are compared to the one you're on, so you can't tell if you're on the highest mountain until you've climbed them all.

A lot of the non-quantum heuristic approaches leave you isolated on one of a few mountains (local maxima), because simply climbing back down and hoping to find another peak is computationally expensive, and you know it will lead to a lot of suboptimal solutions before finding something hopefully comparable to or even better than your highest peak reached so far.
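The fog analogy maps directly onto plain greedy hill climbing. A toy sketch (my own, with a made-up two-peak landscape) shows how the climber only sees neighbouring heights and so can get stranded on a local maximum:

```python
# Toy illustration of the fog analogy: greedy hill climbing on a 1-D
# "mountain range" with two peaks. The climber only compares adjacent
# heights (the fog), so it stops at whichever peak it reaches first.
def height(x):
    # two peaks: a local one at x=2 (height 3) and the global one
    # at x=8 (height 5)
    return max(3 - abs(x - 2), 5 - abs(x - 8), 0)

def hill_climb(x, step=1):
    while True:
        best = max([x - step, x, x + step], key=height)
        if best == x:
            return x  # no neighbour is higher: a (possibly local) peak
        x = best

print(hill_climb(0))  # starts near the low peak, gets stuck at x=2
print(hill_climb(6))  # starts near the high peak, reaches x=8
```

Random restarts (re-running from many starting points) are the classic workaround, which is exactly the "climb back down and hope" expense described above.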


If I’m reading it right:

    dqi = O(m^2)
    classical = O(2^n)

    m = variables
    n = constraints
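Taking that reading at face value (the m/n mapping is the parent comment's interpretation of the paper, not verified here), the gap between a polynomial and an exponential bound is easy to tabulate:

```python
# Gap between a polynomial bound like O(m^2) and an exponential one
# like O(2^n), for equal problem sizes. Purely illustrative; the
# m/n mapping above is the parent comment's reading of the paper.
def cost_gap(size):
    return size ** 2, 2 ** size  # (polynomial cost, exponential cost)

for size in (10, 20, 40, 80):
    poly, expo = cost_gap(size)
    print(f"size={size:3d}  m^2={poly:7d}  2^n={expo}")
```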



Is prime factoring (e.g. Shor's algorithm) not a clear example of where quantum computers would clearly outperform classical ones? Why the skepticism?


The issue is that, apart from Shor's algorithm, it is hard to find a quantum algorithm substantially faster than a classical one.

Moreover, since the cost of qubits scales exponentially, a slight reduction in the degree of a polynomial-time algorithm is not enough to offer a practical benefit.


Ah, walking across glued trees [1]! Not sure of its practical merits however, aha...

[1] - https://arxiv.org/abs/quant-ph/0209131


Cost of qubits does not scale exponentially if we have quantum error correction.


Do you know a proof for that?

I.e., that for n logical qubits, decoherence time goes down polynomially rather than as e^(-a n)? AFAIK (and yes, I could be mistaken!) what we deal with is reducing a substantially.


Grover's?


“Substantially better” is doing the heavy lifting there. Grover’s gets you a sqrt speedup. Double the size of your hash function (SHA512 instead of SHA256) and you’re protected.


I believe 256-bit symmetric keys and hashes are currently recommended because that's double the classically-secure 128 bits.


Because there is a different sqrt speedup concern regarding the birthday paradox / collision resistance. Combined with Grover-style techniques, quantum collision finding costs O(N^(1/3)), making a 256-bit hash have ~85 bits of security against a quantum adversary.

You'd need to switch to a 384 bit hash function to keep a 128-bit security threshold. In practice, this means SHA-512.
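The arithmetic behind those numbers can be sketched as follows (a back-of-the-envelope model: Grover-style preimage search at cost ~2^(n/2), quantum collision finding at cost ~2^(n/3)):

```python
# Rough quantum security levels for an n-bit hash, assuming
# Grover-style preimage search (~2^(n/2)) and quantum collision
# finding (~2^(n/3)). Back-of-the-envelope only.
def quantum_security_bits(n):
    return {"preimage": n / 2, "collision": n / 3}

print(quantum_security_bits(256))  # collision: ~85.3 bits
print(quantum_security_bits(384))  # collision: 128.0 bits
```

By this accounting, 256/3 ≈ 85 bits falls short of a 128-bit target, while 384/3 = 128 just meets it.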


Factoring hasn't been shown to be NP-complete. It may be the case that it's in P (and we just don't have an algorithm yet), or it may be the case that it is an NP-intermediate problem.


I think it's clear that factoring works if you can build a scalable quantum computer. But:

1. It's a very lonely example.

2. It's not clear how big the market is for "you can break encryption from the past, but not current encryption, because once QCs arrive, everyone will obviously move away from QC-breakable encryption". And I'm not aware of any other reason why factoring large numbers would be useful beyond breaking encryption.


I think the goal here is to find more useful algorithms that benefit from quantum computing. Being able to factor large numbers efficiently is only useful for the sake of the cryptographers' job security :-)


Are you talking about the skepticism from Kalai, or more general skepticism (from a wider group of people, which is more about its application to optimization problems, not prime factoring)?


I’m exaggerating a lot here, but one concern people have with quantum computing is: what if it breaks all cryptography but doesn’t help us solve “useful” problems?

The “what if” is pulling a lot of weight here, but I hope it makes my point.


Quantum computing is extremely useful for accurate computational chemistry, which impacts everything from materials science to medicine.


That statement alone does not paint the whole picture. It also has to be better in terms of some metric (e.g. accuracy, computation time, size of molecules that can be simulated) when compared to classical methods.


It makes accurate simulation of quantum mechanical systems of relevance (e.g. molecules) even possible. DFT simulations are full of approximations just to get them to be able to run on classical supercomputers. Nobody does a full solution of Schrödinger's equation. With a sufficiently sized quantum computer, you could.
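A rough sense of why a full solution is out of reach classically: just storing the state vector of n qubits (or spin orbitals) takes 2^n complex amplitudes. This is a back-of-the-envelope sketch, not a statement about any particular DFT code:

```python
# Memory needed to hold a full n-qubit (or n spin-orbital) state
# vector: 2^n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n):
    return (2 ** n) * 16

for n in (30, 50, 100):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"n={n}: {gib:.3e} GiB")
```

Already at n=50 this is tens of petabytes, which is why classical methods resort to the approximations mentioned above.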


I guess that would be about asymmetric encryption, which is today indispensable for a lot of infrastructure. Still, there's symmetric encryption, which is mostly untouched by Shor's algorithm.


Symmetric encryption requires key distribution, which presents new areas of attack.

Though there is also quantum key distribution, which will probably be solved as soon as quantum algorithms can break classical encryption.


QKD also needs key distribution to prevent MITM attacks. It behaves almost exactly like a classical stream cipher, except that it cannot be packet-routed, is many orders of magnitude slower, and is prone to side-channel attacks.


QKD is essentially a layer 1 protocol. It's not in the same realm as symmetric key establishment.

That said, there are quantum-resistant key establishment algorithms out there, and most likely one of them will be selected.


It depends on the use case; symmetric crypto varies. For example, if your key source only has around 100 bits of classical entropy, as with fuzzy extractors, it becomes manageable to break it with a QC.


OT: Is there an accessible ground-up presentation on quantum computing so that I can upskill?

I’m thinking of something like the equivalent of Zero to Hero by Andrej Karpathy for AI.



