"I'm completely unqualified to hold an opinion, but it seems so intuitive to me that quantum computing is theoretically sound but impossible in practice because error correction will scale exponentially in difficulty."
There are schemes that bring the difficulty down below exponential, which seems reasonable, but it's still a pretty stiff polynomial factor by engineering standards. (It's actually pretty impressive mathematically, but right now even having to add a small constant factor of qubits would be a kick in the teeth from an engineering perspective, and of course it's worse than that.) I'm fairly skeptical myself. However, it's the sort I consider "true skepticism": it isn't a certainty that it will fail, rhetorically disguised as uncertainty; I'm really not certain. If someone produces a working, useful QC device, then it is working and useful regardless of my skepticism, and I will celebrate them all the more for the fact that I thought it was really hard.
"Quantum computing, even if it's not all powerful, seems like a "cheat code" for reality, and an inability to practically exploit it seems to me like balancing it out"
IMHO, that's more the hype than the reality. The reality is that we don't have that many QC algorithms in general, and we have even fewer that are clearly advantageous over classical computing. If we had a perfect QC device with billions of qubits in hand right now and could mass-produce it cheaply, I think the evidence at the moment is that you'd be looking at something more like the GPU situation, where they get added on to conventional machines for certain problems, than a complete revolution. And to be honest, a lot of people wouldn't even want one. Graphics cards give you awesome gaming; a QC attachment would solve specific problems that most of us just don't have, and using QC to do something a conventional computer can already do is going to be as silly as treating your GPU as a CPU.
The primitives that QC offers us are really weird. It's a completely different setting, but to understand the difficulty, imagine trying to write computation directly in terms of Conway's Game of Life, because for some reason we have a massive accelerator that can run it quadrillions of times faster than we can otherwise simulate it, and you have to get it to compute something useful in "native" Life. If you do what we would really do in this circumstance and lay a Turing machine over it, you lose the "advantages" of Life in the process. QC is actually not quite that bad in my opinion, but I think this at least gives the flavor of the difficulty. It also captures the fact that you can't do what we usually do in the discrete ___domain and just lay a conventional interpreter over the problem and work in terms of that interpreter, because the algorithm has to "stay quantum" for the QC to be of any use.
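To make the analogy concrete, here is the entire "instruction set" of native Life, a minimal one-generation step in plain Python. Everything useful would have to be compiled down into this one neighbor-counting rule:

```python
from collections import Counter

def life_step(live):
    """One Game of Life generation. `live` is a set of (x, y) cells."""
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live
    # neighbors, or 2 live neighbors and is already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" (three cells in a row) oscillates with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
print(life_step(life_step(blinker)) == blinker)  # True
```

That rule is all you get; building an adder, let alone a compiler, on top of it is a research project in itself, which is the flavor of working with QC primitives directly rather than through a conventional interpreter.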