Programming

Quantum Computing Gets a 'Hard, Cold Reality Check' (ieee.org) 67

A Canadian cybersecurity firm has warned that as soon as 2025, quantum computers could make current encryption methods useless.

But now Slashdot reader christoban shares a "reality check" — an IEEE Spectrum takedown with the tagline "Hype is everywhere, skeptics say, and practical applications are still far away." The quantum computer revolution may be further off and more limited than many have been led to believe. That's the message coming from a small but vocal set of prominent skeptics in and around the emerging quantum computing industry... [T]here's growing pushback against what many see as unrealistic expectations for the technology. Meta's head of AI research Yann LeCun recently made headlines after pouring cold water on the prospect of quantum computers making a meaningful contribution in the near future.

Speaking at a media event celebrating the 10-year anniversary of Meta's Fundamental AI Research team he said the technology is "a fascinating scientific topic," but that he was less convinced of "the possibility of actually fabricating quantum computers that are actually useful." While LeCun is not an expert in quantum computing, leading figures in the field are also sounding a note of caution. Oskar Painter, head of quantum hardware for Amazon Web Services, says there is a "tremendous amount of hype" in the industry at the minute and "it can be difficult to filter the optimistic from the completely unrealistic."

A fundamental challenge for today's quantum computers is that they are very prone to errors. Some have suggested that these so-called "noisy intermediate-scale quantum" (NISQ) processors could still be put to useful work. But Painter says there's growing recognition that this is unlikely and quantum error-correction schemes will be key to achieving practical quantum computers. The leading proposal involves spreading information over many physical qubits to create "logical qubits" that are more robust, but this could require as many as 1,000 physical qubits for each logical one. Some have suggested that quantum error correction could even be fundamentally impossible, though that is not a mainstream view. Either way, realizing these schemes at the scale and speeds required remains a distant goal, Painter says... "I would estimate at least a decade out," he says.
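
To put that 1,000-to-1 overhead in perspective, here is a minimal back-of-the-envelope sketch in Python; the logical-qubit counts below are illustrative assumptions, not figures from the article:

    # Rough resource estimate implied by a ~1,000:1 error-correction overhead.
    # The logical-qubit counts are assumed ballpark figures, for illustration only.
    PHYSICAL_PER_LOGICAL = 1_000  # overhead cited in the article

    workloads = {
        "modest chemistry simulation": 100,             # assumed
        "running Shor's algorithm on RSA-2048": 4_000,  # commonly cited ballpark, assumed here
    }

    for name, logical_qubits in workloads.items():
        physical = logical_qubits * PHYSICAL_PER_LOGICAL
        print(f"{name}: ~{logical_qubits:,} logical -> ~{physical:,} physical qubits")

Today's largest processors offer on the order of a thousand physical qubits in total, which is the gap the skeptics are pointing at.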

A Microsoft technical fellow believes there are fewer applications where quantum computers can really provide a meaningful advantage, since operating a qubit is orders of magnitude slower than simply flipping a transistor, which also makes the data throughput thousands or even millions of times slower.

"We found out over the last 10 years that many things that people have proposed don't work," he says. "And then we found some very simple reasons for that."
  • Unbiased Opinion (Score:5, Interesting)

    by Midnight_Falcon ( 2432802 ) on Sunday December 24, 2023 @12:54PM (#64103323)
    CEO of a "Canadian cybersecurity company" which only sells security services related to quantum-proofing your encryption, makes outlandish claims that quantum computing can break RSA etc. within the next two years. That probably means he has less than a year or two of runway left and needs to use scare tactics to get more clients to sign up for expensive services.

    While I think most will agree one day quantum computing will be able to quickly crack the encryption of today, I think the likelihood of it occurring in 2025 is extremely low. By the time it actually happens somewhere in the 2030s or later, the internet will have transitioned to different (quantum-resistant) encryption algorithms, just like we retired RC5, DES, and SSL (in favor of TLS v1.3+). We do not need to pay this CEO's company in the meantime.

    • by ArmoredDragon ( 3450605 ) on Sunday December 24, 2023 @01:10PM (#64103357)

      My understanding is quantum computers aren't even going to change anything other than enabling some algorithms that Turing computers can't perform directly. In this case, the big deal is Shor's algorithm for factoring large integers.

      • by gweihir ( 88907 )

        Your understanding is correct. For most algorithms, QCs are completely inefficient and would be dog-slow compared to anything classical.

        As to Shor's algorithm, that is one of the few examples of practically applicable algorithms that may or may not be within reach. But keep in mind that Shor's requires a long and complex calculation, which will drive error-correction effort up dramatically. It also requires a lot of effective (error-corrected) QBits. For example, breaking RSA-4096 requires something like 16k e
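
        For the curious, the classical skeleton of Shor's algorithm fits in a few lines and runs fine on a toy modulus; the only step a quantum computer actually speeds up is finding the period r. A minimal sketch in Python, illustrative only:

            from math import gcd

            def multiplicative_order(a, n):
                # Brute-force the period r of a mod n. This is the step a quantum
                # computer would do efficiently with the quantum Fourier transform.
                r, x = 1, a % n
                while x != 1:
                    x = (x * a) % n
                    r += 1
                return r

            def shor_classical_toy(n, a=2):
                g = gcd(a, n)
                if g != 1:
                    return g, n // g      # the guess already shares a factor
                r = multiplicative_order(a, n)
                if r % 2:
                    return None           # odd period: retry with another a
                y = pow(a, r // 2, n)
                if y == n - 1:
                    return None           # trivial square root: retry with another a
                return gcd(y - 1, n), gcd(y + 1, n)

            print(shor_classical_toy(15))  # (3, 5)
            print(shor_classical_toy(21))  # (7, 3)

        The classical version is instant for tiny n but exponential in general; estimates like "16k error-corrected QBits" are about doing the period-finding step coherently for a 4096-bit modulus, which is a very different scale of machine.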

        • by HBI ( 10338492 )

          You have too short a time frame to extrapolate the rate of change. Nothing meaningful was being done with QC in 1970 or even 1975. Digital computers were not even common then. Also, you aren't considering potential technological improvements. One of the big issues today is cryogenic requirements. Change that and things will alter rapidly.

          The only thing to agree about is that 2025 is ludicrous. The CEO who said that should have to wear a sign around his neck with that statement on it for the rest of hi

          • by gtall ( 79522 )

            "One of the big issues today is cryogenic requirements. Change that and things will alter rapidly."

            I don't have a pink unicorn. Change that and I'll be rich.

          • by gweihir ( 88907 )

            One of the big issues today is cryogenic requirements. Change that and things will alter rapidly.

            Sure. Change basic Physics and QCs will run just fine! Obviously, everything becomes easy to do if you ignore reality ...

          • It sounds like we need more qubits for it to even compete with classical computers, and as you add more qubits, the error rate increases exponentially. So we're just waiting for a breakthrough to either reduce the error rate or make it less impactful, which could come a year from now, a hundred years from now, or it may never come, and we need to use increasing amounts of energy to get closer and closer to absolute zero to minimize entropy. And if that's the case, you almost may as well not even bother, bec

          • There was tons of stuff being done in the 70s with QC, there just wasn't any practical application since there was no quantum computer. There was tons of stuff being done on classical computing in the 1850s (Charles Babbage and Ada Lovelace being a prime example) well over 100 years before the invention of the microchip that made it practical to build an affordable system.

            We are barking up the same tree with QC right now, we're probably going down the wrong path like Babbage, but it has to

        • Where does your "16k" come from? Seems like the product of every integer below the 1024-bit square root of a 2048-bit number, multiplied by every other 2048-bit integer, doesn't even require anywhere near 16k qbits to store, let alone thousands more qbits to operate on.
          • That's a very good question. The answer is long and complicated though, and you might already know some of the answer, so I'm going to summarize it here rather than going into all the details. Feel free to search each term if you want more depth.

            Quantum computers only work inasmuch as their Qbits follow the Schrodinger equation. The Schrodinger equation is a linear, unitary, partial differential equation. Any equation that's unitary can be run forward or backward in time without losing information - you
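
            For reference, the equation in question, in its standard textbook form for a time-independent Hamiltonian; because the propagator U(t) is unitary, the evolution can in principle be run backwards without losing information:

                i\hbar \, \frac{\partial}{\partial t} \lvert\psi(t)\rangle = \hat{H} \, \lvert\psi(t)\rangle
                \;\Longrightarrow\;
                \lvert\psi(t)\rangle = U(t)\,\lvert\psi(0)\rangle,
                \qquad U(t) = e^{-i\hat{H}t/\hbar},
                \qquad U^{\dagger}U = I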

    • This isn't some groundbreaking revelation by some obscure CEO of a Canadian security company. RSA is known to not be quantum safe (there are other encryption algorithms that claim to be quantum safe). This prediction has been made by security researchers, encryption experts, and experts in the field for years. Maybe decades. So Facebook, Microsoft, et al. are now telling us to ignore decades of warnings... Yeah, no. I'll listen to experts who don't have an incentive to read encrypted communications. Thanks!
      • The problem is theoretical, and these warnings have been repeated for over twenty years now. If you ask "experts in the field" (appeal to authority fallacy incoming), practically being able to crack RSA with a quantum computer is many years away... that's what the entire article we're commenting on is about.

        As I said, one day quantum will break non-quantum-resistant encryption algorithms, it's just not going to happen in 2025.

        • by ceoyoyo ( 59147 )

          The problem isn't theoretical. You say yourself, "it's many years away." That makes it an engineering problem, and the questions are how many years, and how expensive.

          If you really want to keep something secret for the next twenty years, ten years, or, just possibly two years, because who knows what the NSA is doing, you should probably use SHA256. If it's anything less than that and you haven't pissed off the US, elliptic curves are almost certainly fine.

            • I hope you know that SHA256 is superseded by the SHA3 family already, and is not an encryption algorithm. It's a hashing algorithm meant for one-way usage for things like passwords. It is susceptible to rainbow table attacks and requires use of salts, peppers, nonces, etc. for uniqueness. So it's clear you probably haven't used cryptography in the field very much, or gotten an update in the last several years.
            • by Entrope ( 68843 )

              The SHA-2 family, of which SHA-256 is one member, is not superseded by the SHA-3 family. Both are recognized -- by US NIST and other government and standards bodies -- as secure against currently known and reasonably foreseen threats. (NIST even still permits use of SHA-1 in some situations, but will remove that from the list of permitted cryptographic hash algorithms in the relatively near future.) The US government recommends that anyone using any of these standards make plans to support SHA-3 in addit
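
              Both families ship side by side in ordinary tooling today; for example, Python's standard library exposes each one (shown here only to illustrate that SHA-256 and SHA3-256 coexist as distinct, both-approved hashes):

                  import hashlib

                  msg = b"slashdot"
                  print(hashlib.sha256(msg).hexdigest())    # SHA-2 family (FIPS 180-4)
                  print(hashlib.sha3_256(msg).hexdigest())  # SHA-3 family (FIPS 202)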

        • As a useful reference for people who find the math and physics behind this a bit too complex to grasp, here's a simplified explanation that anyone should be able to readily understand [auckland.ac.nz].
      • by Burdell ( 228580 )

        It's not just RSA; just about every algorithm currently widely used is not quantum safe (and that's widely known). However, for that to matter requires actual working and useful quantum computers, which so far appear to be like fusion power, perpetually 20 years away.

        But a CEO of a company warning they're a year away instead, so you better buy their product NOW, is pure marketing.

        • by ceoyoyo ( 59147 )

          No, it's basically RSA and elliptic curve. Asymmetric algorithms only. AES is fine, although you might want to stop using short keys that aren't recommended anyway.
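
          The rule of thumb behind "AES is fine, avoid short keys": assuming Grover's algorithm is the best quantum attack on a symmetric cipher, brute-force search gets only a quadratic speedup, so the effective key length is roughly halved. A quick illustration:

              # Grover's search gives at best a quadratic speedup over brute force,
              # so an n-bit symmetric key retains roughly n/2 bits of security against it.
              for key_bits in (128, 192, 256):
                  print(f"AES-{key_bits}: ~{key_bits // 2}-bit effective strength against Grover")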

      • by sjames ( 1099 )

        No, they're saying it won't happen tomorrow and probably not in 5 or 10 years. Possibly never.

        The cryptographers aren't wrong to warn that it COULD happen at some point. The two statements aren't even logically incompatible.

        Conclusion: It's not a bad idea to move calmly over to less vulnerable public key systems. But don't lift the Molly guard on the panic button.

    • Encryption used when I first got on the internet in 1994/5 can be casually broken by your average smart phone in seconds. Encryption used on the internet in 2005 can be broken by your average desktop in minutes. Encryption used on the internet in 2015 can be broken by a home crypto mining rig in less than an hour. The idea that the encryption we're using at the end of 2023 could be broken trivially in 2035 doesn't require quantum computing. It's just normal. This is more a product of a rather human inabilit

      • "This is more a product of a rather human inability to properly conceptualize exponential growth curves rather than an existential problem."

        We can do the math, we just did it in favor of a cheaper onboard encryption engine. Why spend 2x as much to protect against something that may happen 2x further in the future than the useful lifespan of the computing device?

      • by gweihir ( 88907 )

        Your numbers are flawed. There are not enough data points to infer the trend you are predicting, and looking at the actual details makes your argument entirely ludicrous.

      • Well, relevant to this discussion is the asymmetric key exchange. You are technically right about the mid-90s, because key sizes were deliberately limited for "export controls".

        However, everything else is a bit overly pessimistic. 1024-bit is only recently reasonably susceptible to obtainable horsepower.

        Symmetric algorithms, similar deal, but a moot point for hypothetical quantum. AES was late 90s and as a cipher is still strong.

        Hashing has had some headaches, but SHA-2 was in 2001 and it's still considered fit for pu

      • RSA-1024 was already widely used over twenty years ago. It has yet to be reported that it has been broken in a reasonable time span. The closest factorization of that kind of integer currently reported is of one 829 bits long. That is 195 bits fewer than a typical RSA-1024. It took months of work on modern hardware. And each extra bit in the integer to be factorized doubles the amount of work to be done.
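
        Taking the doubling-per-bit model above at face value, purely to illustrate the size of the gap (real number-field-sieve cost grows more slowly per bit, so this overstates the ratio):

            # "Each extra bit doubles the work": gap between the 829-bit record and RSA-1024.
            gap_bits = 1024 - 829
            print(f"about 2**{gap_bits}, i.e. {2**gap_bits:.2e} times the work of the 829-bit factorization")
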
  • that has been factored?

    Isn't it 21? And they had to tell the computer that 3 was one of the factors? Yeah, any day now.

    I just keep posting that from my history.....

    • Re: (Score:2, Troll)

      by gweihir ( 88907 )

      Indeed. Much like some other research communities (AI, I am looking at you), the QC community is largely operating with fakes, lies and baseless claims when communicating with the public. They all know their stuff is theoretical fundamental research and no practical application is anywhere in sight. They all know that if they admit that, their funding will dry up. So they lie.

      • AI at least has very accessible and compelling demonstrations. There are struggles around where it will and will not be relevant, and lots of grift in the industry around that uncertainty. However there are a lot of "AI" methodologies with very interesting applications, even if I'm not crazy about the name and expectations are being managed poorly.

        • by gweihir ( 88907 )

          Sure. AI has yet another broken and very limited mechanism that gets hyped all out of proportion. QCs still do not exist and all "demonstrations" are essentially fakes that miss most elements of a working QC.

          Does not change the fact that most publicly visible actors on the proponent side in both spaces are blatantly lying about the actual state of affairs.

  • You should not form any opinions until you have listened to his talk on quantum reality vs hype: https://yewtu.be/watch?v=QUGna...
  • by gweihir ( 88907 ) on Sunday December 24, 2023 @01:16PM (#64103369)

    It starts with quantum error correction not being perfect. You still get decoherence in longer or more complex calculations. Then, the complexity of error correction adds massively to the complexity of the machine, because suddenly you need to have a lot more QBits and they still need to _all_ get entangled. That overhead for error correction gets worse both with the size of the machine and with the length of the computation.

    Sorry, scalability is just not there and that is a _fundamental_ problem that cannot be solved. Classical computing only ever scaled up because classical computations can be subdivided into individual bit operations. These days even classical computing is running into scalability problems for a lot of problems, but that is a different discussion. Anyways, it looks very much like QCs scale much worse than linearly, probably inverse-exponentially. That very likely means they will never even get to where current conventional computers are.
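
    One way to make that overhead concrete, as a hedged sketch: under the standard surface-code heuristic, the logical error rate falls off as a power of (p / p_th), so pushing it down to what a long computation needs drives the code distance, and with it the physical-qubit count per logical qubit, way up. All parameter values below are assumptions for illustration, not measurements of any real machine:

        # Illustrative surface-code scaling, using the common heuristic
        # p_logical ~ 0.1 * (p / p_th) ** ((d + 1) / 2), with ~2 * d**2 physical
        # qubits per logical qubit. All numbers are assumed, for illustration only.
        p, p_th = 1e-3, 1e-2       # assumed physical error rate and threshold
        target_logical = 2e-12     # assumed error budget for a long computation

        d = 3
        while 0.1 * (p / p_th) ** ((d + 1) // 2) > target_logical:
            d += 2                 # surface-code distances are odd
        print(f"code distance d = {d}: ~{2 * d * d:,} physical qubits per logical qubit")

    With these assumed numbers the sketch lands in the same ballpark as the roughly 1,000-to-1 figure quoted in the summary.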

    No idea why people keep ignoring reality here, but the current AI hype, for example, is a nice indicator that many people are just not very smart and have no clue what they are talking about.

    • "No idea why people keep ignoring reality here, but the current AI hype, for example, is a nice indicator that many people are just not very smart and have no clue what they are talking about."

      Same reason for every other religion. Its easier to Believe (TM) than to know complex concepts. And religion has inertia of mass appeal, where knowledge has friction of provability.

      • by gweihir ( 88907 )

        Same reason for every other religion. It's easier to Believe (TM) than to know complex concepts. And religion has inertia of mass appeal, where knowledge has friction of provability.

        Yes, probably. But the sheer ignorance and bright-eyed willingness to believe does still astonish me in this day and age where knowledge is actually available to everybody. It seems a lot of people just do not want it and prefer their made-up surrogate reality.

    • by HiThere ( 15173 )

      Your objections are correct, but not sufficient. Yes, there are problems that really need to be solved before scaling can be done, but if they are solved, then the scaling can be done extremely quickly.

      OTOH, I do have some questions about whether the problems CAN be solved. Perhaps. The ideas I like best involve nitrogen spin states, and I don't believe they require extreme cooling. But AFAICT, they aren't being explored. Perhaps there's something really wrong with them, but I suspect that it's that th

      • by gweihir ( 88907 )

        Bullshit. Scalability will never be there because the little problems you so carelessly gloss over are not engineering problems. They are problems with basic Physics. Sure, if you ignore basic Physics, QCs will work just dandy. But so do zero-energy machines and practical FTL flight.

        • by HiThere ( 15173 )

          There are lots of things that you can't do one way because of basic physics that can be done in a different way. We don't KNOW that quantum computers are one of them, but we sure don't know that it isn't.

          For that matter, there are approaches to FTL flight that should work. Unfortunately they seem to require things like a compact mass the mass of Jupiter. Stabilizing wormholes looks like it requires something that's not only negative energy, but also strong. Those are pretty good obstacles, but there mig

  • It may not be decoded now, but it will be in the future, and it's very likely that the nasty US TLAs and their nasty Big Data friends are recording all data transmitted on the internet today for decryption later.

    In short: today's encryption may be safe today, but it's not future proof.

    Or said another way: if you have anything important to send to someone, even if it's encrypted, make sure the data can't come back to haunt you later anyway.

    • by gweihir ( 88907 )

      That is very unlikely. The problem is they cannot decrypt ("decode" is the wrong word and applies to a different problem) everything. So they have to target and be selective. They cannot do that based on importance of data, because they can only know that after the fact. What they can do is try to decrypt everything for a small number of targets where they could not directly break in. And even there they will miss a lot because they will not be able to identify all or even most communications by these peopl

    • by sjames ( 1099 )

      On the other hand, few people care if it gets decoded decades after their death. There's a large pile of documents from WWI and WWII that are at most of interest to historians (if even that) for example.

    • The encrypted data of the vast majority of people is not important enough to warrant the attention, time and resources of those government agencies with the potential capability to break that encryption. Plus it is vastly cheaper, faster and simpler for them to use other approaches: for example, if the NSA wants to see the contents of my encrypted hard drive they just have to break unnoticed into my place, install a small and conveniently located camera to find out how I decrypt my hard drive, and then stea
  • So they're saying that we should hold to the principle of uncertainty when it comes to quantum computers?

  • The quantum computer discussion is like most of our public discourse in the internet age. The question that should be asked of every piece of "information" on the internet is 'who stands to benefit"? There isn't any information out there that didn't pass through self-interest filters. Usually the benefit is monetary, since it costs money to get your story seen on the internet and if there is no money to do that it doesn't happen. That said, the ability to break encryption is clearly a devastating military
  • Dice (Score:5, Interesting)

    by Spazmania ( 174582 ) on Sunday December 24, 2023 @01:57PM (#64103425) Homepage

    If Einstein was right and God doesn't play dice with the universe then Quantum Computers won't work the way we expect. That's not as far fetched an idea as you might think: Quantum theory suffers many of the symptoms that the astronomical theory of Epicycles did: as our experimental data gets more precise, Quantum theory keeps predicting results just outside the error bars, requiring little tweaks and additions to bring the predictions back in line.

    The implication is that quantum theory has some major flaw buried in its assumptions. Epicycles, for example, assumed that the Earth was the center of reality around which everything else moved. Despite having substantial predictive value, Epicycles was wrong. Now that we know why, it's hard to understand how anyone could have thought it correct.

    Quantum theory sits atop the expectation that base reality is comprised of probability equations that only collapse to a fixed state when measured. It's the scientific equivalent of saying that not only doesn't a tree which falls unnoticed in the forest make a sound, it has neither fallen nor stood until someone bothers to check. It might even have deflected itself while falling and landed somewhere else entirely. Evidence for that, such as the electron double-slit experiment, is remarkably strong. But if it's wrong, if there's another explanation, then quantum computers can't work at a scale that allows meaningful computation. They can only fake it.

    I suspect it's wrong. I could buy probability as the basis for reality, but the observer effect just doesn't make any sense. Something else has to be going on there.

    I think the attempts to build quantum computers will turn out like the Michelson-Morley experiments -- disproving the thing they were constructed to measure. The interferometers they built measured the speed of light, but that's not what they were built for. They were constructed to measure our motion through the luminiferous ether. Instead, they essentially proved that the theory was wrong: the luminiferous ether does not exist.

    It was this data which allowed Einstein to imagine Relativity, a theory that proved itself correct.

    Just my semi-crazy opinion.

    • by HiThere ( 15173 )

      Well, we do know that Quantum Theory, General Relativity, or both are wrong. We just don't know which. And we know that any correct theory will need to predict precisely a tremendous amount of observational evidence.

      So perhaps Quantum Theory needs to be replaced, but the replacement will predict pretty much exactly the same things current Quantum Theory predicts at every place we can look. And that includes the things that quantum computational devices have done so far.

      • Re:Dice (Score:5, Insightful)

        by Spazmania ( 174582 ) on Sunday December 24, 2023 @02:50PM (#64103549) Homepage

        Relativity hasn't changed since it was first produced a century ago. It keeps predicting results solidly inside the error bars even as the experimental measurement error has narrowed by orders of magnitude.

        Quantum theory is headed the other direction. Every time the data precision substantially improves, something gets added to the theory to explain the data.

        If you're a betting man, bet that Relativity is right, at least to the same extent that Newton's laws of motion were right.

        • by HiThere ( 15173 )

          Sorry, but relativity has changed a couple of times, having to do with the expansion of the universe. See "Einstein's greatest mistake". (I'm referring to the "cosmological constant", I think it's often called lambda.) First he put it in, and then he took it out, and I think now it's back in again.

    • by PJ6 ( 1151747 )
      Any explanation with the word "measurement" in it is just a simplification: apparent wave function collapse comes from the observer getting entangled with the system they're observing, effectively making them part of the equation. If you don't like the many worlds interpretation, you can also model it as irreversibility as a result of how all information processors (i.e., your brain) work.

      Quantum mechanics is one of the most successful theories in the history of physics. It's accurate to like, 15 orders o
    • by Tablizer ( 95088 )

      > Quantum theory sits atop the expectation that base reality is comprised of probability equations that only collapse to a fixed state when measured.

      Most don't claim it *is* equations, but rather something that acts like them.

    • If Einstein was right and God doesn't play dice with the universe then Quantum Computers won't work the way we expect.

      Those of us who follow many-worlds as envisioned by Hugh Everett think that the world doesn't play dice either: the single wavefunction describing the configuration of the entire universe always evolves just according to the Schrodinger equation, which is not probabilistic. It takes some work, but from this postulate you can derive that the apparent classical universe can split in the same mathematical way that a compass needle can point northeast, and not just north or east. Through evolution under the S

      • To have multiple divergent futures without matter and energy springing from nowhere to fuel them, you need to have just as many convergent pasts. Conservation of mass-energy is a pretty strong theory, so where's the evidence of convergent pasts?

        • Good questions. Sean Carroll has written a fantastic blog post about exactly your point: https://www.preposterousuniver... [preposterousuniverse.com].

          Hope that helps!

        • by Tablizer ( 95088 )

          Maybe different or older timelines pop OUT of existence to conserve matter and energy. It's kind of like cache memory, except without a disk. People who pop out of existence wouldn't know the difference, they are not around to ponder their non-existence. We have survivor bias.

  • What are the odds that one or two governments already have much more advanced QCs running at scale? I'm not talking mass surveillance but I think it's quite plausible that the NSA of the US and China's agencies as well would have access to one or several large QCs which could be used in a focused way to break encryption. Honestly I don't think it's far fetched looking at history, to imagine that the CIA is pushing for certain messaging through an influence campa
  • For all you contrarians!

  • My own estimate is that quantum computers are evolving at around a quarter of the rate of classical computers at the same level of sophistication.

    On this basis, I would not expect serious business uses for quantum computers in much under 40 years, assuming that solutions can be found for the problems and civilization survives global warming.

    If we do not see quantum mainframes comparable in sophistication to mid-50s mainframes in 40 years' time, we're not going to. Either the technology is not achievable with the app

  • then just block access when detected, firewall em off, block em with the hosts file, since they are such a threat then they deserve to be denied access
  • Didn't Kelvin and Newcomb prove using thermodynamics that heavier-than-air flight was impossible given current materials science?

    • Big difference: we've had quantum computers for 35 years, and they're still useless. This would be like Orville and Wilbur making a five-foot hop and, in the present day, the best we could manage being a plane that makes a 50-foot hop.

      So wait another 25 years, maybe something will come of this stuff, but don't hold your breath.

  • by Swordfish ( 86310 ) on Sunday December 24, 2023 @08:20PM (#64104057) Homepage
    In 1995 I told a defence research guy who was studying the "threat" of quantum computing to defence encryption algorithms that there was nothing to worry about because quantum systems are reversible whereas computer systems are not. In other words, quantum systems don't have increasing entropy over time, whereas all computers do. You can't do computing without making irreversible changes to the state of the computer. They are quintessentially macroscopic in nature. He said this was an obstacle, but he thought the obstacle could be overcome.

    I stick with my original judgement. The lack of progress in quantum computing does not surprise me at all. The laws of physics will not change any time soon.

  • The hype goes back at least to 1997 when I had a tour of the QC lab at Los Alamos National Laboratory. They had mathematicians working on the error-correction problem at the time but I'm guessing that they didn't solve it.
