
When vacuum tubes were reliable only until they burned out (which they did quite often [1], and sometimes in ways that were difficult to detect directly), they probably seemed quite useless for the amount being invested in them. Calculations had to be repeated often enough to rule out circuit errors, but repeated use is also what caused them to fail. The Selectron tube hardly became mature enough to be integrated (it had to be scaled down, and its intended use case ended up going to the competing Williams-Kilburn tube). Researchers could have abandoned tube designs for delay-line memory, or even called it quits and gone back to the abacus and slide rule.

Decades before that, we had a solution for holding a few hundred bits (up to a few thousand) in drum memory [2]. It depended a lot on the quality and evenness of the ferromagnetic material, and on reads being timed correctly. Researchers could have called it impractical and gone back to pencil and paper; there weren't even really computing devices to use it yet.

Sometimes it's good to keep persisting at a problem space despite the impracticality of the current tech. I'm sure you can point to tech that continued to be impractical, but that doesn't mean it will always be so.

[1] https://historyofinformation.com/detail.php?id=692

[2] https://web.archive.org/web/20220414065240/http://cs-exhibit...




What you say is true, but apparently, in the case of quantum computers, there seem to be some fundamental issues. I remember reading, two or three years ago, an argument by a mathematics professor against the feasibility of quantum computers able to break classic public-key cryptography (so, admittedly, not against quantum computers in general, only against a specific, but very important, use case). Unfortunately, I've tried to find it again on the Internet but couldn't.

One of the points I vaguely remember him making was about the amount of shielding you need for the required number of qubits to remain coherent long enough for the factorization to complete when breaking current RSA. He somehow (I don't remember how) showed that this translated to needing a result with precision up to a certain decimal place. Meanwhile, the most accurate laws of physics we know of have a precision that stops a couple of decimal digits before that. Therefore, in his argument, having a quantum computer able to break RSA (I don't remember if it was for 2048- or 4096-bit keys) would entail improving our understanding of physics by two orders of magnitude.

Now, this doesn't mean it's impossible, and new error-correction techniques can change the situation. But it is to say that we're at a very different level here than simply needing a better version of vacuum tubes or drums.

Personally, I won't say we'll never have quantum computers. But I think some skepticism is warranted, especially when we hear claims that they are around the corner.


These kinds of arguments, that physics will somehow magically forbid us from scaling up quantum computers, are exactly why Google does these experiments on larger and larger chips. If the skeptics are right, we expect their systems to hit some kind of wall where the machine does not behave as anticipated by quantum-computing optimists. So far we have not hit any such wall: the machine behaves exactly the way it is supposed to (with predictable error rates, etc.), and the Google results are on the far side of various thresholds that the more skeptical end of the community believed challenging or impossible.


> physics will somehow magically forbid us from scaling up quantum computers

Not magically.

>with predictable error rates etc

This isn't my field of expertise, but my understanding is that this is precisely the issue: these predictable error rates are predicted to be a deal-breaker at the number of qubits needed to break RSA with today's commonly used key lengths. In my view, the fact that the error rates appear to match predictions so far only sustains this point. A reason for optimism, though, is that we sometimes hear about promising new error-correction techniques.
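
To make that tension concrete, here's a rough back-of-the-envelope sketch (not from this thread; the constants and the target budget are my own assumptions) using the standard heuristic that a distance-d surface code suppresses the logical error rate like (p/p_th)^((d+1)/2) when the physical error rate p is below the threshold p_th:

    # Heuristic surface-code scaling: p_L ~ A * (p / p_th) ** ((d + 1) / 2).
    # The prefactor A = 0.1, threshold p_th = 1e-2, physical rate 1e-3, and
    # the ~1e-12 per-operation budget for RSA-scale factoring are all
    # illustrative assumptions, not figures from the thread.

    def logical_error_rate(p: float, d: int, p_th: float = 1e-2, a: float = 0.1) -> float:
        """Rough logical error rate for one distance-d surface-code patch."""
        return a * (p / p_th) ** ((d + 1) / 2)

    physical_p = 1e-3   # assumed physical gate error rate
    target = 1e-12      # assumed error budget per logical operation

    d = 3
    while logical_error_rate(physical_p, d) > target:
        d += 2          # surface-code distances are odd

    qubits_per_logical = 2 * d * d   # ~d^2 data qubits plus ~d^2 ancillas
    print(f"distance {d}: ~{qubits_per_logical} physical qubits per logical qubit")

With these assumed numbers it lands at a four-digit physical-qubit overhead per logical qubit, which is how the optimists (errors are suppressible by scaling distance) and the skeptics (the overhead is enormous) can read the same predictable error rates as supporting them.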

We'll see how future developments go; in the end it is hard to predict what will happen. But this is also why I think we should be more careful about claiming that quantum computers will break classical cryptography in the short to medium term.


>So far we have not hit any such wall

On the contrary. The wall in question is "can the computer compute the result we need correctly and efficiently?" So far no QC has climbed over that wall.


All of those systems, despite being based on unreliable hardware, were still able to compute useful results that could be checked against reality. Range tables, mostly.


When they worked, and only for very small-scale calculations.

I bring up this specific era because there was as much skepticism about electronic computers then as I think there is now about quantum computing. They were big and slow even at the small applicable scale, and incredibly unreliable. We had radio; it was comparatively gangbusters. We had the telegraph, which also failed a lot but at least was obviously useful. For all the hubbub about computers computing, we knew humans were extraordinarily more capable and had tools like the slide rule to handle calculations at varying scales. I think you'd have a hard time being in computer sales then, even if you knew what today looked like.

Not that I'm all that confident about quantum panning out; I'm still in a superposition about it, let's say. But it was what I felt to be an appropriate response to the root comment's sentiment that "quantum computers don't exist."

Good point about being able to check the answers against reality. It is a big sticking point for anything QED-related.

EDIT: oh, never mind, it appears that comment has since been removed.


In general it's probably wise to be skeptical of new technology. However, I would imagine people back then would have been skeptical because computers were annoying to deal with and, all things considered, of questionable utility, not because they weren't sure whether the computers were even doing anything at all. Imagine having a 20-ton machine that takes up half a room and gobbles up power and people's time, and all it's able to do is print out numbers that relate to nothing other than its internal state.



