First Browser-Based Quantum Computer Simulator Released
greg65535 (1209048) writes "Following the trend of on-line coding playgrounds like JSFiddle or CodePen, Google researchers unveiled the first browser-based, GPU-powered Quantum Computing Playground. With a typical GPU card you can simulate up to 22 qubits, write, debug, and share your programs, visualize the quantum state in 2D and 3D, see quantum factorization and quantum search in action, and even... execute your code backwards."
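For a rough sense of why the qubit count tops out in the low twenties, here is a back-of-envelope sketch; the 8-bytes-per-amplitude figure is an assumption, since the summary doesn't say how the playground actually stores its state. A full state-vector simulation keeps 2^n complex amplitudes, so the footprint doubles with every added qubit.

# Back-of-envelope memory for an n-qubit state-vector simulation.
# Assumes one single-precision complex amplitude (8 bytes) per basis state.
BYTES_PER_AMPLITUDE = 8

for n in (22, 28, 34, 40):
    amplitudes = 2 ** n
    mib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 20
    print(f"{n:2d} qubits -> {amplitudes:>16,d} amplitudes, ~{mib:,.0f} MiB")

At 22 qubits that is a perfectly manageable 32 MiB, but every gate also has to touch all of those amplitudes, so both memory and per-gate work keep doubling from there.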
Well, not exactly. (Score:5, Funny)
Re:Well, not exactly. (Score:5, Funny)
Re: (Score:3, Funny)
At which point it collapsed into a state where it has been released, but fails to do anything useful.
Re: (Score:1)
Re: (Score:2)
It fails to do anything useful backwards as well :)
Re: (Score:2)
Actually, quantum theory says nearly that.
In Chaos theory, things appear random because they are deterministic but you don't have perfect information to calculate the result. Your lack of information introduces randomness. Dice, for example, fall based on their mass, their momentum, air density, the shape and material properties of the surface, etc. These things are themselves imparted by how they're thrown, by the temperature and humidity and make-up of the air, and so on. If you could know all of the
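Here's a quick sketch of that point, swapping the dice for the textbook logistic map purely as a stand-in: the system is completely deterministic, yet a one-part-in-a-billion error in the starting condition grows until the two trajectories have nothing to do with each other.

# Logistic map: deterministic, but a tiny uncertainty in the input
# eventually swamps any prediction of the output.
def logistic(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic(0.123456789)
b = logistic(0.123456789 + 1e-9)   # "imperfect information" about the throw
print(a, b)                         # after 50 steps the values bear no resemblance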
Re: (Score:2)
This joke is both funny and not funny now.
Oh, wait... its waveform just collapsed. Guess what to?
When you execute it backwards (Score:2)
Re: (Score:1)
Re: (Score:1)
simulation does not mean it is in real-time, you ignorant slut.
Re: (Score:2)
Simulation as in "simulation", not simulation as in "emulation".
Re: (Score:2)
You mean a simulation like this?

# cat /dev/random | grep "the answer to life, the universe and everything" | sed -e 's/the answer to life, the universe and everything/42/'
Re: (Score:1)
It didn't output 42 for me:
# cat /dev/random | grep "the answer to life, the universe and everything" | sed -e 's/the answer to life, the universe and everything/42/'
Binary file (standard input) matches
Re:If each of those is 22 qubits... (Score:5, Informative)
Would a simple botnet be able to easily crack all encryption crackable by quantum computing, or are there better ways to go at it given a botnet?
Yes, it is crackable using a botnet simulating a quantum computer, in the same sense that you would be able to simulate a quantum computer solving the traveling salesman problem by using a botnet. Or by using a massively parallel supercomputer.
That is to say, the quantum computer simulation is Turing computable. This really doesn't help for anything more than trivial problems, much like pointing out the Halting Problem is decidable if you "simply" observe the Turing machine for the appropriate Busy Beaver [wikipedia.org] function's number of execution steps.
More succinctly, the simulation would gain you nothing over a direct parallel processing attack on the key space, and in fact the quantum computer simulation would add execution overhead that would reduce efficiency compared to straightforward brute force attacks.
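To make that overhead concrete, here's a minimal state-vector sketch (illustrative only, not the playground's actual code): applying even a single one-qubit gate means touching all 2^n amplitudes, so the simulated machine pays an exponential factor before it even starts attacking the key space.

import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    # Apply a 2x2 unitary to one qubit of an n-qubit state vector.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 20                                        # already 2**20 = ~1M amplitudes
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                # |00...0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, target=0, n_qubits=n)

Sharding that vector across a botnet doesn't change the asymptotics; the 2^n blow-up is still there, which is exactly why brute-forcing the key directly is the cheaper attack.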
Re: (Score:2)
Yes, I saw that right after I posted. Autocorrect stymied me as I spelled "botnet". Fixed it once, but then it "helped me out" again when I edited the sentence later.
even... execute your code backwards. (Score:5, Insightful)
This is actually a requirement for such a simulator as all unitary QM transformations are reversible.
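A tiny numpy sketch of what that means in practice (illustrative only): running a circuit backwards is just applying the conjugate transpose of each gate in reverse order, and you land back exactly where you started.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
S = np.array([[1, 0], [0, 1j]])                  # phase gate

psi0 = np.array([1, 0], dtype=complex)           # start in |0>
forward = S @ H @ psi0                           # run the "program"
backward = H.conj().T @ S.conj().T @ forward     # undo it, gate by gate
print(np.allclose(backward, psi0))               # True: unitaries are reversible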
It's kind of ironic that Google released this project given that they are at the same time heavily betting on D-Wave with a radically different approach to QM than the Gate based model. [wavewatching.net]
The D-Wave founder Geordie Rose is known for disparaging the Quantum Gate based model as completely impractical, and in turn other QC researchers have been very critical of his approach to the matter, spawning a contentious controversy almost as old as the Canadian start-up itself. [wavewatching.net]
Re: (Score:3)
Re: (Score:2)
A simple way to simulate this to some extent would be to just add some random noise in the form of qubit flips.
But with just 20 qubits you unfortunately can't push this very far.
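Something like this crude sketch is what I have in mind (hypothetical; real decoherence also involves phase errors and correlated noise): with some small probability p, hit each qubit with a Pauli-X flip between gates.

import numpy as np

X = np.array([[0, 1], [1, 0]])   # Pauli-X, i.e. a bit flip

def add_flip_noise(state, n_qubits, p=0.01, rng=None):
    # Crudest possible noise model: flip each qubit independently with probability p.
    rng = rng if rng is not None else np.random.default_rng()
    psi = state.reshape([2] * n_qubits)
    for q in range(n_qubits):
        if rng.random() < p:
            psi = np.moveaxis(np.tensordot(X, psi, axes=([1], [q])), 0, q)
    return psi.reshape(-1)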
Re: (Score:2)
no, it's not ironic. the simulator is just a fucking project they're hosting because it's 'cool'. they are not investing anything in it beyond a smidgen of bandwidth and disk space, and they are not endorsing it.
they're also not "betting heavily" on D-wave. it was a stab-in-the-dark just-in-case thing which they could afford with the coins under Sergei's couch cushions, and despite that i wouldn't be surprised if they're still regretting how hopeless their investment turned out to be. D-wave is bullshit.
Re: (Score:2)
"D-wave is bullshit."
Tell us how you really feel.
Re:even... execute your code backwards. (Score:4, Interesting)
Well, let's compare. Geordie Rose spent years and millions of dollars trying to build (and succeeding in building) a computational device that works on radically different principles than existing computer tech, is actually useful for a lot of real-world tasks, and consumes virtually zero power - a huge feat in itself, even if it's not really a "quantum computer" in the traditional sense of the word. Whereas those people disagreeing with him are all ivory tower academics who have not built and do not plan to build any hardware, the most egregious of whom is Scott Aaronson, who is known for his delusional rants on everything from neuroscience to fundamental physics. I wonder which one has their head grounded more firmly in reality.
But seriously though, the fundamental principles of gate-based and adiabatic quantum computing aren't that different; it's more a continuum where on one end you have highly decoherent classical behavior, on the other you have pure quantum behavior, and in the middle you have quantum+noise behavior where tiny entanglements are generated and decohered on timescales too short for gate-based quantum computing but long enough for adiabatic quantum computing. It's possible that, as AQC technology matures with continued investment, it will give better and better entanglement and eventually approach a pure quantum computer in capability.
Re: (Score:2)
I'm sorry if my post sounded like a commercial; it's just that I've done a lot of research on D-Wave's hardware and it's really impressive what such a small team managed to pull off. At least they're doing something.
Re: (Score:2)
Mighty big roar for an AC.
Re: (Score:2)
What's empty is your straw-man argument. Of course most academics do excellent work.
What the original poster claimed was that academics in the QC hardware business dismiss D-Wave. The most outspoken critic is a theorist. Is it too much to ask to get a link to a more hardware-oriented academic going on the record with regards to D-Wave?
MIT included them in the list of the fifty smartest companies [technologyreview.com], so we know there are plenty of academics who think highly of D-Wave.
Re: (Score:3)
All of physics is reversible (except perhaps black holes and such), so your regular computer should be reversible too.
Re: (Score:2, Informative)
...so your regular computer should be reversible too.
For a regular computer to be reversible it needs reversible logic gates. For example, a standard XOR gate loses one bit of information, so given the output you cannot construct the input perfectly (as there are two possible inputs for each output).
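A concrete sketch of what "loses one bit" means: a bare XOR sends two different input pairs to the same output, while the reversible version (a CNOT, which keeps one input around next to the XOR result) can always be undone.

# Irreversible: two distinct input pairs collapse to the same output value.
print({(a, b): a ^ b for a in (0, 1) for b in (0, 1)})
# {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0} -- given a 1 you can't tell (0,1) from (1,0)

# Reversible (CNOT): keep the control bit, put the XOR in the target bit.
def cnot(a, b):
    return a, a ^ b

# Applying CNOT twice is the identity, so no information is ever lost.
print(all(cnot(*cnot(a, b)) == (a, b) for a in (0, 1) for b in (0, 1)))  # True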
Re: (Score:1)
...so your regular computer should be reversible too.
For a regular computer to be reversible it needs reversible logic gates. For example, a standard XOR gate loses one bit of information, so given the output you cannot construct the input perfectly (as there are two possible inputs for each output).
But the output from the opcode isn't stored back to both input memory locations at once; ergo, XOR itself is reversible at the chip level: even if it writes back to one of the inputs, just XOR the output with the other input to recover it. You're conflating the theory of computation with the actual computation. In THEORY you can delete bits, but in practice you actually can't -- well, using the arrow of time created by sub-atomic entropy (quantum foam) you might be able to... but that will remain beyond your grasp for so
Re: (Score:2)
Re: (Score:2)
While he makes some good points he is unfortunately completely missing the point with this statement:
" In THEORY you can delete bits, but in practice you actually can't ."
Re: (Score:3)
" In THEORY you can delete bits, but in practice you actually can't ."
If I give you a bunch of RAM SIMMs there's no way you can tell me what was written on them.
At any rate, fully reversible computing means the ability to completely reverse arbitrarily complex algos, being able to reconstruct a couple of previous bit states isn't cutting it.
And yes, you actually can delete bits; the entropy heat signature this produces is theoretically well understood, and Landauer's principle has recently been experimentally verified [wavewatching.net]
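For scale, the Landauer bound for erasing one bit is k_B * T * ln 2, and a quick back-of-envelope at room temperature shows just how tiny that heat signature is:

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # roughly room temperature, K

print(k_B * T * math.log(2))       # ~2.9e-21 joules per erased bit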
Re: (Score:2)
When you reverse a black hole it's called a white hole, AKA, big bang.
Re: (Score:2)
The big bang was a hell of a lot more complex than that. If you ignited a miniature big bang in our universe, it would likely destroy the universe. Yes, even a really tiny one.
Re: (Score:2)
Have been blogging about them for a while and visited them on site.
Full Disclosure: One of their board members bought me a beer.
It's because of dudes like you that I am cross with Scott A. He has every right to be critical, but his rhetoric is so over the top that he created a kind of parallel universe that doesn't even allow for this kind of adiabatic quantum computation to be tried and tested.
Would have come in handy ... (Score:1)
... when I took EdX's CS191x Quantum Mechanics and Quantum Computation course. [wavewatching.net]
Re: (Score:1)
Show me yours, I showed you mine ...
Re: (Score:2)
Should have included this in the previous comment, but couldn't find the link at first.
What I did use occasionally when taking the course was this little browser-based gem [davyw.com]. While certainly not nearly as powerful as this Google simulator, it was still quite useful.
Neat! (Score:2)
As it is right now, QC is pretty much
Re: (Score:2)
Isn't it ironic that a consumer graphics card can simulate more qubits than most actual quantum computers have right now?
No. If it were the other way around then quantum computing wouldn't be an open research problem but a multi-billion dollar industry.
Re: (Score:2)
If it's possible to simulate qubits using, at the bottom, bits, and, if qubits and quantum computing allow for performing NP calcs in parametric time
Being able to simulate qubits doesn't mean you can do so in parametric time.
One can simulate a few molecules chemically reacting, but you can't reasonably do so at a molecular level for a macroscopic sample - yet in reality both would take a similar amount of time.
* the above is an uninformed guess
parent is full of disinformation (Score:1)
Parent post is so full of (intentional?) disinformation that it hurts.
Why haven't we been doing this for decades? We have. The only novel part here is "in a web browser." Simulation is not a new concept. Any nondeterministic computing problem can be simulated by a deterministic machine, and vice versa.
Second, instruction runtime on the simulated machine does not correlate with the runtime on the physical machine -- at all. A deterministic machine can simulate a nondeterministic one in O(2^n) by trying every
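The standard construction, sketched below with a toy acceptance predicate (purely illustrative), is to enumerate all 2^n nondeterministic choices and accept if any branch accepts:

from itertools import product

def nondeterministic_accepts(check, n):
    # Deterministic simulation: try every one of the 2**n branches.
    return any(check(bits) for bits in product((0, 1), repeat=n))

# Toy predicate: is there a 4-bit assignment with bit 0 set and bit 3 clear?
print(nondeterministic_accepts(lambda bits: bits[0] == 1 and bits[3] == 0, 4))  # True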
Re: (Score:2)
I agree with you, but I have a nit to pick:
You wrote "512kiB". This is incorrect. It should be "512KiB". Although "k" is the prefix for "kilo-", there is no such thing as an "iB", so the use of "k" is inappropriate here. Note that the prefix "Ki" is for "kibi-" and it applies here to "B" for "bytes."
Re: (Score:1)
Hold on a minute. If it's possible to simulate qubits using, at the bottom, bits, and, if qubits and quantum computing allow for performing NP calcs in parametric time (and hence breaking crypto), then haven't we already been able to do all of these things for decades?
Oblig xkcd - http://www.xkcd.com/505/ [xkcd.com]