First Browser-Based Quantum Computer Simulator Released

greg65535 (1209048) writes "Following the trend of on-line coding playgrounds like JSFiddle or CodePen, Google researchers unveiled the first browser-based, GPU-powered Quantum Computing Playground. With a typical GPU card you can simulate up to 22 qubits, write, debug, and share your programs, visualize the quantum state in 2D and 3D, see quantum factorization and quantum search in action, and even... execute your code backwards."
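For a sense of where the 22-qubit figure comes from: a state-vector simulator has to store 2^n complex amplitudes for n qubits, so memory (and the work per gate) grows exponentially. The sketch below uses plain numpy rather than the Playground's own scripting language, and the gate application is only a minimal illustration of the idea.

```python
import numpy as np

# A state-vector simulator stores 2**n complex amplitudes for n qubits.
# At single precision (8 bytes per complex number), 22 qubits needs about:
n = 22
print(2**n * 8 / 1e6, "MB")     # ~33.6 MB -- comfortable for a typical GPU

# Minimal example: put a 3-qubit register into superposition on the first qubit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
state = np.zeros(2**3, dtype=complex)
state[0] = 1.0                                    # |000>
# Apply H to the first (most significant) qubit via a tensor contraction.
state = np.tensordot(H, state.reshape(2, 2, 2), axes=([1], [0])).reshape(-1)
print(np.abs(state)**2)                           # 0.5 on |000> and |100>
```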
  • by Anonymous Coward on Wednesday May 21, 2014 @09:16PM (#47062605)
    They both released it and didn't release it simultaneously.
    • by Travis Mansbridge ( 830557 ) on Wednesday May 21, 2014 @09:21PM (#47062621)
      Until we observed this article.
      • Re: (Score:3, Funny)

        At which point it collapsed into a state where it has been released, but fails to be able to do anything useful.

      • Actually, quantum theory says nearly that.

        In Chaos theory, things appear random because they are deterministic but you don't have perfect information to calculate the result. Your lack of information introduces randomness. Dice, for example, fall based on their mass, their momentum, air density, the shape and material properties of the surface, etc. These things are themselves imparted by how they're thrown, by the temperature and humidity and make-up of the air, and so on. If you could know all of the

    • This joke is both funny and not funny now.

      Oh, wait... its waveform just collapsed. Guess what it collapsed to?

  • It prints "I buried Paul".
  • by quax ( 19371 ) on Wednesday May 21, 2014 @10:56PM (#47062967)

    This is actually a requirement for such a simulator, as all unitary QM transformations are reversible (a small sketch of this follows at the end of this comment).

    It's kind of ironic that Google released this project given that they are at the same time betting heavily on D-Wave, whose approach to quantum computing is radically different from the gate-based model. [wavewatching.net]

    The D-Wave founder Geordie Rose is known for disparaging the quantum gate model as completely impractical, and in turn other QC researchers have been very critical of his approach, spawning a contentious controversy almost as old as the Canadian start-up itself. [wavewatching.net]
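    A minimal numpy sketch of the reversibility point above: every gate is a unitary matrix, so any gate sequence can be undone by applying the conjugate transposes in reverse order. The particular gates here are only illustrative; this is not the Playground's own script syntax.

    ```python
    import numpy as np

    # U^dagger U = I for every unitary gate, so a quantum "program" can be
    # undone gate by gate -- i.e. executed backwards.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard
    T = np.diag([1, np.exp(1j * np.pi / 4)])              # T (pi/8) gate

    psi = np.array([1, 0], dtype=complex)                 # |0>
    forward = T @ H @ psi                                  # run the program
    backward = H.conj().T @ T.conj().T @ forward           # undo it in reverse
    print(np.allclose(backward, psi))                      # True
    ```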

    • It might be interesting if they introduced a user-selectable amount of simulated decoherence, though -- perhaps to allow for simulation of quantum error correction, etc. Looking at it locally, the evolution could be non-unitary (though I'm not sure how much of the environment one would model in such a computer simulator). Fun stuff, in any event.
      • by quax ( 19371 )

        A simple way to simulate this to some extent would be to just add some random noise in the form of qubit flips (see the sketch below).

        But with just 20 qubits you unfortunately can't push this very far.
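        A rough sketch of the "random qubit flips" idea, assuming a plain numpy state vector; the helper name apply_bit_flip_noise and the noise model are illustrative assumptions, not anything from the Playground.

        ```python
        import numpy as np

        # Crude bit-flip noise: with probability p, apply a Pauli-X
        # (swap the |0> and |1> slices) to one randomly chosen qubit.
        def apply_bit_flip_noise(state, n, p, rng):
            if rng.random() < p:
                q = rng.integers(n)              # which qubit to flip
                t = state.reshape([2] * n)
                t = np.flip(t, axis=q)           # Pauli-X on that qubit
                state = t.reshape(-1)
            return state

        # Example: 3-qubit |000>, 50% chance of a single random flip.
        rng = np.random.default_rng(0)
        state = np.zeros(2**3, dtype=complex)
        state[0] = 1.0
        state = apply_bit_flip_noise(state, n=3, p=0.5, rng=rng)
        print(np.argmax(np.abs(state)))          # 0 if no flip occurred
        ```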

    • no, it's not ironic. the simulator is just a fucking project they're hosting because it's 'cool'. they are not investing anything in it beyond a smidgen of bandwidth and disk space, and they are not endorsing it.

      they're also not "betting heavily" on D-Wave. it was a stab-in-the-dark just-in-case thing which they could afford with the coins under Sergey's couch cushions, and despite that I wouldn't be surprised if they're still regretting how hopeless their investment turned out to be. D-Wave is bullshit.


    • by Beck_Neard ( 3612467 ) on Thursday May 22, 2014 @12:53AM (#47063375)

      Well, let's compare. Geordie Rose spent years and millions of dollars trying (and succeeding) to build a computational device that works on radically different principles than existing computer tech, is actually useful for a lot of real-world tasks, and consumes virtually zero power - a huge feat in itself, even if it's not really a "quantum computer" in the traditional sense of the word. The people disagreeing with him, on the other hand, are all ivory-tower academics who have not built and do not plan to build any hardware, the most egregious of whom is Scott Aaronson, who is known for his delusional rants on everything from neuroscience to fundamental physics. I wonder which one has their head grounded more firmly in reality.

      But seriously though, the fundamental principles of gate-based and adiabatic quantum computing aren't that different; it's more a continuum where on one end you have highly decoherent classical behavior, on the other you have pure quantum behavior, and in the middle you have quantum-plus-noise behavior where tiny entanglements are generated and decohered on a timescale that is too short for gate-based quantum computing but long enough for adiabatic quantum computing. It's possible that, as AQC technology matures, it will give better and better entanglement and eventually approach a pure quantum computer in capability.

    • All of physics is reversible (except perhaps black holes and such), so your regular computer should be reversible too.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        ...so your regular computer should be reversible too.

        For a regular computer to be reversible it needs reversible logic gates. For example, a standard XOR gate loses one bit of information, so given the output you cannot construct the input perfectly (as there are two possible inputs for each output).
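        A small Python check of the point above: bare XOR maps two input bits to one output bit, so each output has two preimages and the input can't be recovered; the reversible counterpart, CNOT, keeps one input alongside the XOR result and is a bijection on two bits.

        ```python
        from itertools import product

        # Bare XOR: each output bit has two possible input pairs.
        preimages = {}
        for a, b in product((0, 1), repeat=2):
            preimages.setdefault(a ^ b, []).append((a, b))
        print(preimages)     # {0: [(0,0),(1,1)], 1: [(0,1),(1,0)]}

        # CNOT: (a, b) -> (a, a ^ b) is invertible (apply it again to undo).
        cnot = {(a, b): (a, a ^ b) for a, b in product((0, 1), repeat=2)}
        print(len(set(cnot.values())) == 4)      # True: a bijection
        ```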

        • ...so your regular computer should be reversible too.

          For a regular computer to be reversible it needs reversible logic gates. For example, a standard XOR gate loses one bit of information, so given the output you cannot construct the input perfectly (as there are two possible inputs for each output).

          But the output from the opcode isn't stored back to both input memory locations at once; ergo, XOR itself is reversible at the chip level: even if it writes back to one of the inputs, just XOR the output with the other input to get it back. You're conflating the theory of computation with the actual computation. In THEORY you can delete bits, but in practice you actually can't -- Well, using the arrow of time created by sub-atomic entropy (quantum foam) you might be able to... but that will remain beyond your grasp for so
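          A tiny sketch of the "just XOR the output with the other input" argument: an in-place XOR write-back is its own inverse as long as the other operand survives, so no information is lost at that level.

          ```python
          # XOR write-back is self-inverse while the other operand is kept.
          a, b = 0b1011, 0b0110
          a ^= b           # the "output" overwrites one input
          a ^= b           # XOR with the surviving input restores it
          print(bin(a))    # 0b1011
          ```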

          • Seriously, this was one of the most interesting comments I've ever read here on Slashdot. Thanks for taking the time to write it.
            • by quax ( 19371 )

              While he makes some good points, he is unfortunately completely missing the point with this statement:

              " In THEORY you can delete bits, but in practice you actually can't ."

          • by quax ( 19371 )

            " In THEORY you can delete bits, but in practice you actually can't ."

            If I give you a bunch of RAM SIMMs, there's no way you can tell me what was written on them.

            At any rate, fully reversible computing means the ability to completely reverse arbitrarily complex algos; being able to reconstruct a couple of previous bit states isn't cutting it.

            And yes, you actually can delete bits; the entropy heat signature this produces is theoretically well understood, and Landauer's principle has recently been experimentally confirmed. [wavewatching.net]
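            For scale, the Landauer bound k_B * T * ln(2) per erased bit is easy to evaluate; the numbers below are just the standard constants at room temperature, not figures from the linked article.

            ```python
            import math

            # Minimum heat dissipated when one bit is erased (Landauer's principle).
            k_B = 1.380649e-23            # Boltzmann constant, J/K
            T = 300.0                     # room temperature, K
            print(k_B * T * math.log(2))  # ~2.87e-21 J per bit
            ```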

      • When you reverse a black hole it's called a white hole, AKA the big bang.

        • The big bang was a hell of a lot more complex than that. If you ignited a miniature big bang in our universe, it would likely destroy the universe. Yes, even a really tiny one.

    • by quax ( 19371 )

      Should have included this in the previous comment, but couldn't find the link at first.

      What I did use occasionally when taking the course was this little browser-based gem [davyw.com]. While certainly not nearly as powerful as this Google simulator, it was still quite useful.

  • Ignoring the typical Slashdot cynicism (and often a lack of understanding disguised as such), this is actually pretty damn neat! Quantum mechanics and quantum computing using the gate model aren't intuitive, especially not for people without a physics background, so this could really help with learning the fundamentals of quantum computing. Being able to visualize the state of the qubits at each step of the process as something other than a big formula is a pretty big deal (a toy version of this is sketched below).

    As it is right now, QC is pretty much
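    A toy version of "visualize the state at each step", assuming plain numpy rather than the Playground's 2D/3D views: print the full amplitude vector of a two-qubit register after each gate of a Bell-pair circuit (the circuit choice is just illustrative).

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

    state = np.array([1, 0, 0, 0], dtype=complex)         # |00>
    for name, gate in [("H on qubit 0", np.kron(H, I)), ("CNOT", CNOT)]:
        state = gate @ state
        print(name, np.round(state, 3))                    # amplitudes after each step
    # Ends in (|00> + |11>)/sqrt(2): [0.707, 0, 0, 0.707]
    ```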
    • by hweimer ( 709734 )

      Isn't it ironic that a consumer graphics card can simulate more qubits than most actual quantum computers have right now?

      No. If it were the other way around then quantum computing wouldn't be an open research problem but a multi-billion dollar industry.
