I saw a lecture from the author of the first DNA computing paper (https://en.wikipedia.org/wiki/Leonard_Adleman) in '94. We pulled him into a room after the talk (because he was talking about scaling up the computation significantly) and walked him through the calculations. Because the system he designed required a great deal of DNA to randomly bind to other DNA in a highly parallel fashion, you'd need enormous vats of liquid being rapidly stirred.
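A rough back-of-envelope sketch of that scaling problem (my own numbers, not the ones from that conversation): if the brute-force generate-and-filter approach needs at least one strand per candidate path, and you assume 20-base vertex codes as in the original experiment and roughly 330 g/mol per nucleotide, the DNA mass blows up factorially with problem size.

```python
import math

AVOGADRO = 6.022e23            # molecules per mole
BASES_PER_CITY = 20            # assumed 20-mer vertex encoding, as in Adleman '94
GRAMS_PER_MOLE_PER_BASE = 330  # approximate molar mass of one nucleotide

def dna_mass_kg(n_cities):
    """Rough lower bound on DNA mass to represent every candidate path once."""
    n_paths = math.factorial(n_cities)          # candidate city orderings
    bases_per_path = n_cities * BASES_PER_CITY  # strand length in nucleotides
    moles = n_paths / AVOGADRO
    grams = moles * bases_per_path * GRAMS_PER_MOLE_PER_BASE
    return grams / 1000.0

for n in (7, 20, 40, 70):
    print(f"{n:3d} cities -> ~{dna_mass_kg(n):.2e} kg of DNA")
```

At 7 cities (the size actually demonstrated) the mass is negligible; by around 70 cities the estimate exceeds the mass of the Earth, which is why "just scale it up" doesn't work.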
As with other alternative forms of computing, the systems we build on CPUs today are truly hard to beat, partly because people are trained on those systems, partly because the high-performance libraries are there, and partly because the vendors got good at making stupid codes run stupid fast.
At this point I cannot see any specific QC that could be used repeatedly for productive work (material simulations, protein design) that would be more useful than entirely conventional computing (i.e., a cluster with 20K CPUs, 10K GPUs, 100PB of storage, and a fast interconnect). Every time I see one of these "BMW is using a quantum computer to optimize logistics" stories and look more closely, it's obvious it's a PR toy problem, not something giving them a business edge.