Google & NASA claim their quantum computer is 100,000,000 times faster than PC

Started by Syt, December 09, 2015, 07:21:28 AM

Syt

http://arstechnica.co.uk/information-technology/2015/12/google-nasa-our-quantum-computer-is-100-million-times-faster-than-normal-pc/

Quote
Google, NASA: Our quantum computer is 100 million times faster than a normal PC

Two years ago Google and NASA went halfsies on a D-Wave quantum computer, mostly to find out whether there are actually any performance gains to be had when using quantum annealing instead of a conventional computer. Recently, Google and NASA received the latest D-Wave 2X quantum computer, which the company says has "over 1000 qubits."

At an event yesterday at the NASA Ames Research Center, where the D-Wave computer is kept, Google and NASA announced their latest findings—and for highly specialised workloads, quantum annealing does appear to offer a truly sensational performance boost. For an optimisation problem involving 945 binary variables, the D-Wave 2X is up to 100 million times (10^8) faster than the same problem running on a single-core classical (conventional) computer.

Google and NASA also compared the D-Wave 2X's quantum annealing against Quantum Monte Carlo, an algorithm that emulates quantum tunnelling on a conventional computer. Again, a speed-up of up to 10^8 was seen in some cases.

Hartmut Neven, the head of Google's Quantum Artificial Intelligence lab, said these results are "intriguing and very encouraging" but that there's still "more work ahead to turn quantum enhanced optimization into a practical technology."

As always, it's important to note that D-Wave's computers are not capable of universal computing: they are only useful for a small number of very specific tasks—and Google, NASA, and others are currently trying to work out what those tasks might be. D-Wave's claim of "over 1,000 qubits" is also unclear. In the past, several physical qubits were clustered to create a single computational qubit, and D-Wave doesn't make that distinction clear.

We will publish a further, in-depth report about Google and NASA's latest findings in the next few weeks.
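To make the benchmark a bit more concrete: an "optimisation problem involving 945 binary variables" of this kind is an Ising-style energy minimisation, and the classical baseline it was compared against is simulated annealing. Below is a minimal toy sketch of that baseline. This is illustrative code only, not Google's benchmark: the random instance, cooling schedule, and parameters are invented, and the real test used variables wired according to D-Wave's Chimera topology rather than the dense random couplings here.

Code:
# Minimal sketch: simulated annealing on a toy Ising instance over binary spins.
# Illustrative only; not the Google/NASA benchmark code or instance.
import math
import random

def random_ising(n, seed=0):
    """Random +/-1 couplings J[(i, j)] and fields h[i] for an n-spin toy instance."""
    rng = random.Random(seed)
    J = {(i, j): rng.choice([-1.0, 1.0]) for i in range(n) for j in range(i + 1, n)}
    h = [rng.choice([-1.0, 1.0]) for _ in range(n)]
    return J, h

def energy(s, J, h):
    """Ising energy E(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i, with s_i in {-1, +1}."""
    e = sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return e + sum(hi * si for hi, si in zip(h, s))

def local_field(s, J, h, i):
    """Effective field on spin i: h_i plus the sum of J_ij s_j over its neighbours."""
    f = h[i]
    for j in range(len(s)):
        if j != i:
            f += J[(i, j) if i < j else (j, i)] * s[j]
    return f

def simulated_annealing(J, h, n, sweeps=2000, t_hot=3.0, t_cold=0.05, seed=1):
    """Single-spin-flip Metropolis updates with a geometric cooling schedule."""
    rng = random.Random(seed)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    best, best_e = s[:], energy(s, J, h)
    for k in range(sweeps):
        temp = t_hot * (t_cold / t_hot) ** (k / max(sweeps - 1, 1))
        for i in range(n):
            d_e = -2.0 * s[i] * local_field(s, J, h, i)  # energy change if spin i flips
            if d_e <= 0 or rng.random() < math.exp(-d_e / temp):
                s[i] = -s[i]
        e = energy(s, J, h)
        if e < best_e:
            best, best_e = s[:], e
    return best, best_e

if __name__ == "__main__":
    n = 32  # tiny stand-in for the 945-variable benchmark instance
    J, h = random_ising(n)
    _, e = simulated_annealing(J, h, n)
    print("best energy found:", e)

The quantum annealer attacks the same kind of objective, but relies on quantum tunnelling rather than thermal fluctuations to escape local minima, which is where the claimed speed-up comes from.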
I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.


Darth Wagtaros

Hopefully it'll tell them the difference between Metric and Imperial for their space probes.
PDH!

frunk

The blog that I used to read for quantum computing information hasn't posted anything yet.

Shtetl-Optimized

He's been a long-term sceptic that D-Wave's products are practical or useful as they currently work.

viper37

Quote from: frunk on December 09, 2015, 12:16:14 PM
The blog that I used to read for quantum computing information hasn't posted anything yet.

Shtetl-Optimized

He's been a long-term sceptic that D-Wave's products are practical or useful as they currently work.
The caveat is here:
Quote
there's still "more work ahead to turn quantum enhanced optimization into a practical technology."

As always, it's important to note that D-Wave's computers are not capable of universal computing: they are only useful for a small number of very specific tasks—and Google, NASA, and others are currently trying to work out what those tasks might be.

It's not meant to replace PCs any time soon.
Maybe, eventually, it could replace part of a PC, like some specific chips.
I don't do meditation.  I drink alcohol to relax, like normal people.

If Microsoft Excel decided to stop working overnight, the world would practically end.

frunk

Quote from: viper37 on December 09, 2015, 02:10:52 PM
It's not meant to replace PCs any time soon.
Maybe, eventually, it could replace part of a PC, like some specific chips.

Oh I know, and the blog I linked is fully aware of that as well.  Aaronson's skeptical that the D-Wave machines are even capable of doing the limited set of tasks they should be able to do.

Duque de Bragança


frunk

Google, D-Wave, and the case of the factor-10^8 speedup for WHAT?

Long post, but here are some relevant bits:

Quote
Yes, there's a factor-10^8 speedup that looks clearly asymptotic in nature, and there's also a factor-10^8 speedup over Quantum Monte Carlo. But the asymptotic speedup is only if you compare against simulated annealing, while the speedup over Quantum Monte Carlo is only constant-factor, not asymptotic. And in any case, both speedups disappear if you compare against other classical algorithms, like that of Alex Selby. Also, the constant-factor speedup probably has less to do with quantum mechanics than with the fact that D-Wave built extremely specialized hardware, which was then compared against a classical chip on the problem of simulating the specialized hardware itself (i.e., on Ising spin minimization instances with the topology of D-Wave's Chimera graph). Thus, while there's been genuine, interesting progress, it remains uncertain whether D-Wave's approach will lead to speedups over the best known classical algorithms, let alone to speedups over the best known classical algorithms that are also asymptotic or also of practical importance. Indeed, all of these points also remain uncertain for quantum annealing as a whole.
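For reference, the "Ising spin minimization instances" mentioned above are problems of the form: choose spins s_i in {-1, +1} to minimise an energy E(s) = sum_ij J_ij s_i s_j + sum_i h_i s_i (sign conventions vary), where the Chimera graph only determines which pairs (i, j) have non-zero couplings J_ij. It's the same kind of objective as in the simulated-annealing sketch earlier in the thread, just with a sparse, hardware-defined connectivity instead of dense random couplings.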

Quote
So then, what do I say to Steve Jurvetson?  I say—happily, not grudgingly!—that the new Google paper provides the clearest demonstration so far of a D-Wave device's capabilities.  But then I remind him of all the worries the QC researchers had from the beginning about D-Wave's whole approach: the absence of error-correction; the restriction to finite-temperature quantum annealing (moreover, using "stoquastic Hamiltonians"), for which we lack clear evidence for a quantum speedup; the rush for more qubits rather than better qubits.  And I say: not only do all these worries remain in force, they've been thrown into sharper relief than ever, now that many of the side issues have been dealt with.  The D-Wave 2X is a remarkable piece of engineering.  If it's still not showing an asymptotic speedup over the best known classical algorithms—as the new Google paper clearly explains that it isn't—then the reasons are not boring or trivial ones.  Rather, they seem related to fundamental design choices that D-Wave made over a decade ago.

Quote
In the meantime, while it's sometimes easy to forget during blog-discussions, the field of experimental quantum computing is a proper superset of D-Wave, and things have gotten tremendously more exciting on many fronts within the last year or two.  In particular, the group of John Martinis at Google (Martinis is one of the coauthors of the Google paper) now has superconducting qubits with orders of magnitude better coherence times than D-Wave's qubits, and has demonstrated rudimentary quantum error-correction on 9 of them.  They're now talking about scaling up to ~40 super-high-quality qubits with controllable couplings—not in the remote future, but in, like, the next few years.  If and when they achieve that, I'm extremely optimistic that they'll be able to show a clear quantum advantage for something (e.g., some BosonSampling-like sampling task), if not necessarily something of practical importance.  IBM Yorktown Heights, which I visited last week, is also working (with IARPA funding) on integrating superconducting qubits with many-microsecond coherence times.  Meanwhile, some of the top ion-trap groups, like Chris Monroe's at the University of Maryland, are talking similarly big about what they expect to be able to do soon. The "academic approach" to QC—which one could summarize as "understand the qubits, control them, keep them alive, and only then try to scale them up"—is finally bearing some juicy fruit.