Quantum Computing Gets a 'Hard, Cold Reality Check' (ieee.org)
A Canadian cybersecurity firm has warned that as soon as 2025, quantum computers could make current encryption methods useless.
But now Slashdot reader christoban shares a "reality check" — an IEEE Spectrum takedown with the tagline "Hype is everywhere, skeptics say, and practical applications are still far away." The quantum computer revolution may be further off and more limited than many have been led to believe. That's the message coming from a small but vocal set of prominent skeptics in and around the emerging quantum computing industry... [T]here's growing pushback against what many see as unrealistic expectations for the technology. Meta's head of AI research Yann LeCun recently made headlines after pouring cold water on the prospect of quantum computers making a meaningful contribution in the near future.
Speaking at a media event celebrating the 10-year anniversary of Meta's Fundamental AI Research team, he said the technology is "a fascinating scientific topic," but that he was less convinced of "the possibility of actually fabricating quantum computers that are actually useful." While LeCun is not an expert in quantum computing, leading figures in the field are also sounding a note of caution. Oskar Painter, head of quantum hardware for Amazon Web Services, says there is a "tremendous amount of hype" in the industry at the minute and "it can be difficult to filter the optimistic from the completely unrealistic."
A fundamental challenge for today's quantum computers is that they are very prone to errors. Some have suggested that these so-called "noisy intermediate-scale quantum" (NISQ) processors could still be put to useful work. But Painter says there's growing recognition that this is unlikely and quantum error-correction schemes will be key to achieving practical quantum computers. The leading proposal involves spreading information over many physical qubits to create "logical qubits" that are more robust, but this could require as many as 1,000 physical qubits for each logical one. Some have suggested that quantum error correction could even be fundamentally impossible, though that is not a mainstream view. Either way, realizing these schemes at the scale and speeds required remains a distant goal, Painter says... "I would estimate at least a decade out," he says.
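To put that 1,000:1 overhead in perspective, here is a back-of-the-envelope sketch in Python; the 4,000-logical-qubit figure for breaking RSA-2048 is a commonly cited ballpark supplied here as an assumption, not a number from the article.

```python
# Back-of-envelope resource count for error-corrected factoring.
# overhead is the ~1,000 physical qubits per logical qubit quoted above;
# the logical-qubit count for RSA-2048 is an assumed ballpark figure.
logical_qubits = 4_000                  # rough estimate for breaking RSA-2048
overhead = 1_000                        # physical qubits per logical qubit
physical_qubits = logical_qubits * overhead
print(f"{physical_qubits:,} physical qubits")  # 4,000,000 -- vs ~1,000 on today's largest chips
```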
A Microsoft technical fellow believes there are fewer applications where quantum computers can really provide a meaningful advantage, since operating a qubit is orders of magnitude slower than simply flipping a transistor, which also makes the data throughput rate thousands or even millions of times slower.
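For a rough sense of the speed gap being described, a small sketch with ballpark gate times; the specific numbers are assumptions and vary widely by platform.

```python
# Rough throughput comparison behind the "orders of magnitude slower" point.
# All times are ballpark assumptions, not measured figures.
transistor_switch_s = 1e-10       # ~0.1 ns for a modern logic transition
physical_gate_s = 1e-8            # ~10 ns superconducting two-qubit gate
logical_op_s = 1e-5               # ~10 us per error-corrected logical op (assumed)

print(f"physical gate: {physical_gate_s / transistor_switch_s:,.0f}x slower")
print(f"logical op:    {logical_op_s / transistor_switch_s:,.0f}x slower")
```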
"We found out over the last 10 years that many things that people have proposed don't work," he says. "And then we found some very simple reasons for that."
Unbiased Opinion (Score:5, Interesting)
While I think most will agree one day quantum computing will be able to quickly crack the encryption of today, I think the likelihood of it occurring in 2025 is extremely low. By the time it actually happens somewhere in the 2030s or later, the internet will have transitioned to different (quantum-resistant) encryption algorithms, just like we retired RC5, DES, and SSL (in favor of TLS v1.3+). We do not need to pay this CEO's company in the meantime.
Re: Unbiased Opinion (Score:4, Insightful)
My understanding is quantum computers aren't even going to change anything other than enabling some algorithms that Turing computers can't perform directly. In this case, the big deal is Shor's algorithm for factoring large numbers.
Re: (Score:3)
Your understanding is correct. For most algorithms, QCs are completely inefficient and would be dog-slow compared to anything classical.
As to Shor's algorithm, that is one of the few examples of practically applicable algorithms that may or may not be within reach. But keep in mind that Shor's requires a long and complex calculation, which will drive error-correction effort up dramatically. It also requires a lot of effective (error-corrected) QBits. For example, breaking RSA-4096 requires something like 16k effective QBits...
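For the curious, the classical skeleton of Shor's algorithm is simple; only the period-finding step needs a quantum computer. A toy sketch with the quantum subroutine replaced by brute force (which is exponentially slow, and that is the whole point):

```python
# Toy version of Shor's reduction from factoring to period finding.
# The quantum computer's only job is finding the period r of a^x mod N;
# here we brute-force it, which is what's classically intractable.
from math import gcd

def factor_via_period(N: int, a: int):
    g = gcd(a, N)
    if g != 1:
        return g                  # lucky guess: a shares a factor with N
    r, x = 1, a % N
    while x != 1:                 # find smallest r with a^r = 1 (mod N)
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, N)          # a nontrivial factor of N

print(factor_via_period(21, 2))   # period is 6; prints the factor 7
```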
Re: (Score:3)
You have too short a time frame to extrapolate the rate of change. Nothing meaningful was being done with QC in 1970 or even 1975. Digital computers were not even common then. Also, you aren't considering potential technological improvements. One of the big issues today is cryogenic requirements. Change that and things will alter rapidly.
The only thing to agree about is that 2025 is ludicrous. The CEO who said that should have to wear a sign around his neck with that statement on it for the rest of his life.
Re: (Score:2)
"One of the big issues today is cryogenic requirements. Change that and things will alter rapidly."
I don't have a pink unicorn. Change that and I'll be rich.
Re: (Score:2)
One of the big issues today is cryogenic requirements. Change that and things will alter rapidly.
Sure. Change basic Physics and QCs will run just fine! Obviously, everything becomes easy to do if you ignore reality ...
Re: Unbiased Opinion (Score:3)
It sounds like we need more qubits for it to even compete with classical computers, and as you add qubits, the error rate increases exponentially. So we're just waiting for a breakthrough to either reduce the error rate or make it less impactful, which could come a year from now, a hundred years from now, or it may never come and we need to use increasing amounts of energy to get closer and closer to absolute zero to minimize entropy. And if that's the case, you almost may as well not even bother, because...
Re: Unbiased Opinion (Score:2)
There was tons of stuff being done in the 70s with QC, there just wasn't any practical application since there was no quantum computer. There was tons of stuff being done on classical computing in the 1850s (Charles Babbage and Ada Lovelace being a prime example) well over 100 years before the invention of the microchip that made it practical to build an affordable system.
We are barking up the same tree with QC right now, we're probably going down the wrong path like Babbage, but it has to...
Re: (Score:2)
That's a very good question. The answer is long and complicated though, and you might already know some of the answer, so I'm going to summarize it here rather than going into all the details. Feel free to search each term if you want more depth.
Quantum computers only work inasmuch as their Qbits follow the Schrodinger equation. The Schrodinger equation is a linear, unitary, partial differential equation. Any equation that's unitary can be run forward or backward in time without losing information...
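A quick numerical illustration of that reversibility, as a minimal sketch with a single-qubit gate:

```python
# Unitary evolution is reversible: applying U and then its conjugate
# transpose (U dagger) recovers the original state with no information lost.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate, a unitary
psi = np.array([1, 0], dtype=complex)         # the |0> state

evolved = H @ psi                   # run "forward in time"
recovered = H.conj().T @ evolved    # run "backward in time"
print(np.allclose(recovered, psi))  # True
```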
Re: (Score:2)
As I said, one day quantum will break non-quantum-resistant encryption algorithms, it's just not going to happen in 2025.
Re: (Score:2)
The problem isn't theoretical. You say yourself, "it's many years away." That makes it an engineering problem, and the questions are how many years, and how expensive.
If you really want to keep something secret for the next twenty years, ten years, or possibly just two (because who knows what the NSA is doing), you should probably use SHA256. If it's anything less than that and you haven't pissed off the US, elliptic curves are almost certainly fine.
Re: (Score:2)
The SHA-2 family, of which SHA-256 is one member, is not superseded by the SHA-3 family. Both are recognized -- by US NIST and other government and standards bodies -- as secure against currently known and reasonably foreseen threats. (NIST even still permits use of SHA-1 in some situations, but will remove that from the list of permitted cryptographic hash algorithms in the relatively near future.) The US government recommends that anyone using any of these standards make plans to support SHA-3 in addition.
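For what it's worth, both families already ship in Python's standard library, so supporting SHA-3 "in addition" is a one-line change rather than a migration; a minimal sketch:

```python
# SHA-2 (sha256) and SHA-3 (sha3_256) side by side, both from the stdlib.
import hashlib

msg = b"store now, decrypt later"
print(hashlib.sha256(msg).hexdigest())    # SHA-2 family member
print(hashlib.sha3_256(msg).hexdigest())  # SHA-3 family member
```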
Re: (Score:2)
It's not just RSA; just about every algorithm currently widely used is not quantum safe (and that's widely known). However, for that to matter requires actual working and useful quantum computers, which so far appear to be like fusion power, perpetually 20 years away.
But a CEO of a company warning they're a year away instead, so you better buy their product NOW, is pure marketing.
Re: (Score:2)
No, it's basically RSA and elliptic curve. Asymmetric algorithms only. AES is fine, although you might want to stop using short keys that aren't recommended anyway.
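The usual back-of-envelope accounting behind that claim, sketched in Python; the bit-security figures are standard textbook estimates supplied here as assumptions, not numbers from this thread. Grover's search roughly halves effective symmetric key length, while Shor's breaks the common asymmetric schemes outright.

```python
# Rough post-quantum bookkeeping using standard textbook estimates:
# Grover halves effective symmetric security; Shor zeroes out RSA/EC.
classical_security_bits = {
    "AES-128": 128, "AES-256": 256,    # symmetric
    "RSA-2048": 112, "ECC-P256": 128,  # asymmetric
}
broken_by_shor = {"RSA-2048", "ECC-P256"}

for name, bits in classical_security_bits.items():
    post_quantum = 0 if name in broken_by_shor else bits // 2
    print(f"{name}: {bits} -> ~{post_quantum} bits against a quantum adversary")
```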
Re: (Score:2)
No, they're saying it won't happen tomorrow and probably not in 5 or 10 years. Possibly never.
The cryptographers aren't wrong to warn that it COULD happen at some point. The two statements aren't even logically incompatible.
Conclusion: It's not a bad idea to move calmly over to less vulnerable public key systems. But don't lift the Molly guard on the panic button.
Relative Encryption (Score:1)
Encryption used when I first got on the internet in 1994/5 can be casually broken by your average smart phone in seconds. Encryption used on the internet in 2005 can be broken by your average desktop in minutes. Encryption used on the internet in 2015 can be broken by a home crypto mining rig in less than an hour. The idea that the encryption we're using at the end of 2023 could be broken trivially in 2035 doesn't require quantum computing. It's just normal. This is more a product of a rather human inability to properly conceptualize exponential growth curves rather than an existential problem.
Re: (Score:2)
"This is more a product of a rather human inability to properly conceptualize exponential growth curves rather than an existential problem."
We can do the math, we just did it in favor of a cheaper onboard encryption engine. Why spend 2x as much to protect against something that may happen 2x further in the future than the useful lifespan of the computing device?
Re: (Score:2)
Your numbers are flawed. There are not enough data points to infer the trend you are predicting, and looking at the actual details makes your argument entirely ludicrous.
Re: Relative Encryption (Score:2)
Well, relevant to this discussion is the asymmetric key exchange. You are technically right about the mid '90s, because key sizes were deliberately limited for "export controls".
However, everything else is a bit overly pessimistic. 1024-bit keys have only recently become reasonably susceptible to obtainable horsepower.
Symmetric algorithms, similar deal, but a moot point for hypothetical quantum. AES was late 90s and as a cipher is still strong.
Hashing has had some headaches, but SHA-2 was in 2001 and it's still considered fit for purpose.
Wow - never heard of that (Score:1)
https://it.slashdot.org/story/... [slashdot.org]
Re: (Score:1)
I watched it... "We can't do anything right now, but boy there are sure some real opportunities!"
so what is the largest number ..... (Score:1)
that has been factored?
Isn't it 21? And they had to tell the computer that 3 was one of the factors? Yeah, any day now.
I just keep posting that from my history.....
Re: (Score:2, Troll)
Indeed. Much like some other research communities (AI, I am looking at you), the QC community is largely operating with fakes, lies and baseless claims when communicating with the public. They all know their stuff is theoretical fundamental research and no practical application is anywhere in sight. They all know that if they admit that, their funding will dry up. So they lie.
Re: so what is the largest number ..... (Score:3)
AI at least has very accessible and compelling demonstrations. There are struggles around where it will and will not be relevant, and lots of grift in the industry around that uncertainty. However there are a lot of "AI" methodologies with very interesting applications, even if I'm not crazy about the name and expectations are being managed poorly.
Re: (Score:2)
Sure. AI has yet another broken and very limited mechanism that gets hyped all out of proportion. QCs still do not exist and all "demonstrations" are essentially fakes that miss most elements of a working QC.
Does not change the fact that most publicly visible actors on the proponent side in both spaces are blatantly lying about the actual state of affairs.
John Preskill is an actual expert (Score:2)
Re: (Score:1)
Maybe this will come through - https://www.youtube.com/watch?... [youtube.com]
Error correction will not do it (Score:3)
It starts with quantum error correction not being perfect. You still get decoherence in longer or more complex calculations. Then, complexity of error correction adds massively to the complexity of the machine, because suddenly you need to have a lot more QBits and they still need to _all_ get entangled. That overhead for error correction gets worse both in the size of the machine and in the length of the computation.
Sorry, scalability is just not there and that is a _fundamental_ problem that cannot be solved. Classical computing only ever scaled up because classical computations can be subdivided into individual bit operations. These days even classical computing is running into scalability problems for a lot of problems, but that is a different discussion. Anyways, it looks very much like QCs scale much worse than linearly, probably inverse-exponentially. That very likely means they will never even get to where current conventional computers are.
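For concreteness, the overhead the proponents themselves budget for can be sketched with the usual surface-code heuristic; the formula and constants below are assumptions supplied here, not anything from the comment above. It shows how quickly the physical-qubit cost grows as you push the logical error rate down:

```python
# Common surface-code scaling heuristic (an assumed model):
#   p_logical ~ 0.1 * (p / p_th) ** ((d + 1) / 2)
# with physical error rate p, threshold p_th ~ 1e-2, code distance d,
# and roughly 2 * d**2 physical qubits per logical qubit.
def surface_code(p: float, d: int, p_th: float = 1e-2):
    p_logical = 0.1 * (p / p_th) ** ((d + 1) / 2)
    physical_qubits = 2 * d * d
    return p_logical, physical_qubits

for d in (3, 11, 25):
    p_log, n_phys = surface_code(p=1e-3, d=d)
    print(f"d={d}: ~{n_phys} physical qubits per logical, error ~{p_log:.0e}/cycle")
```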
No idea why people keep ignoring reality here, but the current AI hype, for example, is a nice indicator that many people are just not very smart and have no clue what they are talking about.
Re: (Score:2)
"No idea why people keep ignoring reality here, but the current AI hype, for example, is a nice indicator that many people are just not very smart and have no clue what they are talking about."
Same reason for every other religion. It's easier to Believe (TM) than to know complex concepts. And religion has inertia of mass appeal, where knowledge has friction of provability.
Re: (Score:2)
Same reason for every other religion. It's easier to Believe (TM) than to know complex concepts. And religion has inertia of mass appeal, where knowledge has friction of provability.
Yes, probably. But the sheer ignorance and bright-eyed willingness to believe does still astonish me in this day and age where knowledge is actually available to everybody. It seems a lot of people just do not want it and prefer their made-up surrogate reality.
Re: (Score:2)
Your objections are correct, but not sufficient. Yes, there are problems that really need to be solved before scaling can be done, but if they are solved, then the scaling can be done extremely quickly.
OTOH, I do have some questions about whether the problems CAN be solved. Perhaps. The ideas I like best involve nitrogen spin states, and I don't believe they require extreme cooling. But AFAICT, they aren't being explored. Perhaps there's something really wrong with them, but I suspect that it's that...
Re: (Score:2)
Bullshit. Scalability will never be there because the little problems you so carelessly gloss over are not engineering problems. They are problems with basic Physics. Sure, if you ignore basic Physics, QCs will work just dandy. But so do zero-energy machines and practical FTL flight.
Re: (Score:2)
There are lots of things that you can't do one way because of basic physics that can be done in a different way. We don't KNOW that quantum computers are one of them, but we sure don't know that it isn't.
For that matter, there are approaches to FTL flight that should work. Unfortunately they seem to require things like a compact mass comparable to the mass of Jupiter. Stabilizing wormholes looks like it requires something that's not only negative energy, but also strong. Those are pretty good obstacles, but there might...
If your data is important, it's already decoded (Score:2)
It may not be decoded now, but it will be in the future, and it's very likely that the nasty US TLAs and their nasty Big Data friends are recording all data transmitted on the internet today for decryption later.
In short: today's encryption may be safe today, but it's not future proof.
Or said another way: if you have anything important to send to someone, even if it's encrypted, make sure the data can't come back to haunt you later anyway.
Re: (Score:2)
That is very unlikely. The problem is they cannot decrypt ("decode" is the wrong word and applies to a different problem) everything. So they have to target and be selective. They cannot do that based on importance of data, because they can only know that after the fact. What they can do is try to decrypt everything for a small number of targets where they could not directly break in. And even there they will miss a lot because they will not be able to identify all or even most communications by these people.
Re: (Score:2)
On the other hand, few people care if it gets decoded decades after their death. There's a large pile of documents from WWI and WWII that are at most of interest to historians (if even that) for example.
Heisenberg's joke (Score:2)
So they're saying that we should hold to the principle of uncertainty when it comes to quantum computers?
Who stands to benefit (Score:2)
Dice (Score:5, Interesting)
If Einstein was right and God doesn't play dice with the universe then Quantum Computers won't work the way we expect. That's not as far fetched an idea as you might think: Quantum theory suffers many of the symptoms that the astronomical theory of Epicycles did: as our experimental data gets more precise, Quantum theory keeps predicting results just outside the error bars, requiring little tweaks and additions to bring the predictions back in line.
The implication is that quantum theory has some major flaw buried in its assumptions. Epicycles, for example, assumed that the Earth was the center of reality around which everything else moved. Despite having substantial predictive value, Epicycles was wrong. Now that we know why, it's hard to understand how anyone could have thought it correct.
Quantum theory sits atop the expectation that base reality is comprised of probability equations that only collapse to a fixed state when measured. It's the scientific equivalent of saying that not only doesn't a tree which falls unnoticed in the forest make a sound, it has neither fallen nor stood until someone bothers to check. It might even have deflected itself while falling and landed somewhere else entirely. Evidence for that, such as the electron double-slit experiment, is remarkably strong. But if it's wrong, if there's another explanation, then quantum computers can't work at a scale that allows meaningful computation. They can only fake it.
I suspect it's wrong. I could buy probability as the basis for reality, but the observer effect just doesn't make any sense. Something else has to be going on there.
I think the attempts to build quantum computers will turn out like the Michelson-Morley experiments -- disproving the thing they were constructed to measure. The interferometers they built measured the speed of light, but that's not what they were built for. They were constructed to measure our motion through the luminiferous ether. Instead, they essentially proved that the theory was wrong: the luminiferous ether does not exist.
It was this data which allowed Einstein to imagine Relativity, a theory that proved itself correct.
Just my semi-crazy opinion.
Re: (Score:2)
Well, we do know that Quantum Theory, General Relativity, or both are wrong. We just don't know which. And we know that any correct theory will need to predict precisely a tremendous amount of observational evidence.
So perhaps Quantum Theory needs to be replaced, but the replacement will predict pretty much exactly the same things current Quantum Theory predicts at every place we can look. And that includes the things that quantum computational devices have done so far.
Re:Dice (Score:5, Insightful)
Relativity hasn't changed since it was first produced a century ago. It keeps predicting results solidly inside the error bars even as the experimental measurement error has narrowed by orders of magnitude.
Quantum theory is headed the other direction. Every time the data precision substantially improves, something gets added to the theory to explain the data.
If you're a betting man, bet that Relativity is right, at least to the same extent that Newton's laws of motion were right.
Re: (Score:2)
Sorry, but relativity has changed a couple of times, having to do with the expansion of the universe. See "Einstein's greatest mistake". (I'm referring to the "cosmological constant", I think it's often called lambda.) First he put it in, and then he took it out, and I think now it's back in again.
Re: (Score:2)
Quantum mechanics is one of the most successful theories in the history of physics. It's accurate to like, 15 orders of magnitude.
Re: (Score:1)
> Quantum theory sits atop the expectation that base reality is comprised of probability equations that only collapse to a fixed state when measured.
Most don't claim it *is* equations, but rather something that acts like them.
Re: (Score:2)
If Einstein was right and God doesn't play dice with the universe then Quantum Computers won't work the way we expect.
Those of us who follow many-worlds as envisioned by Hugh Everett think that the world doesn't play dice either: the single wavefunction describing the configuration of the entire universe always evolves just according to the Schrodinger equation, which is not probabilistic. It takes some work, but from this postulate you can derive that the apparent classical universe can split in the same mathematical way that a compass needle can point northeast, and not just north or east. Through evolution under the Schrodinger equation...
Re: (Score:2)
To have multiple divergent futures without matter and energy springing from nowhere to fuel them, you need to have just as many convergent pasts. Conservation of mass-energy is a pretty strong theory, so where's the evidence of convergent pasts?
Re: (Score:2)
Good questions. Sean Carroll has written a fantastic blog post about exactly your point: https://www.preposterousuniver... [preposterousuniverse.com].
Hope that helps!
Re: (Score:1)
Maybe different or older timelines pop OUT of existence to conserve matter and energy. It's kind of like cache memory, except without a disk. People who pop out of existence wouldn't know the difference, they are not around to ponder their non-existence. We have survivor bias.
Influence campaign? (Score:1)
For you (Score:2)
For all you contrarians!
Everyone has a point of view (Score:2)
My own is that quantum computers are evolving at around a quarter of the rate of classical computers at the same level of sophistication.
On this basis, I would not expect serious business uses for quantum computers in much under 40 years, assuming that solutions can be found for the problems and civilization survives global warming.
If we do not see quantum mainframes comparable in sophistication to mid-50s mainframes in 40 years' time, we're not going to. Either the technology is not achievable with the approaches...
quantum computer detectors needed (Score:2)
Would Simon Newcomb like a word? (Score:2)
Didn't Kelvin and Newcomb prove using thermodynamics that heavier-than-air flight was impossible given current materials science?
Re: (Score:1)
Big difference: we've had quantum computers for 35 years, and they're still useless. This would be like Orville and Wilbur making a five foot hop, and in the present day we could only manage a plane with a 50 foot hop.
So wait another 25 years, maybe something will come of this stuff, but don't hold your breath.
The reversibility of quantum systems (Score:4, Interesting)
I stick with my original judgement. The lack of progress in quantum computing does not surprise me at all. The laws of physics will not change any time soon.
Nothing New (Score:1)