IBM Q to 53, Google Quantum Advantage??

The race to “quantum advantage” heated up recently with what felt like a series of one-ups from IBM and Google.  IBM was first to break the news, announcing an enhanced version of the IBM Q system from earlier this year – now with 53 qubits, compared to the 20 in the first system launch.  At more than double the qubit count of the first IBM Q launch, and with the exponential gain in state space that each additional qubit brings, this is an enormous step in potential performance.  IBM noted the 53-qubit system would initially be available online.
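To put “exponential gain” in perspective, here is my own back-of-the-envelope math (not a figure from IBM): describing an n-qubit state classically takes 2^n complex amplitudes, so going from 20 to 53 qubits multiplies the state space by roughly 8.6 billion, not by a factor of 2.65.

```python
# Back-of-the-envelope: an n-qubit state is described by 2**n complex amplitudes.
print(f"{2 ** 20:,}")             # 1,048,576 amplitudes for 20 qubits
print(f"{2 ** 53:,}")             # 9,007,199,254,740,992 amplitudes for 53 qubits
print(f"{2 ** 53 // 2 ** 20:,}")  # 8,589,934,592x larger state space
```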

Shortly after IBM’s announcement, news broke that Google had achieved “quantum advantage” in lab experiments.  (Quantum advantage is the concept that a quantum computer can solve a problem that a classical computer cannot, i.e. cannot solve in a viable amount of time.)  The paper, titled ‘Quantum Supremacy Using a Programmable Superconducting Processor’ and published in Nature, can be found here.

As stated by Google Researchers, “To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor.”

My first reaction was “Already?!”.   In the coursework and reading I’ve done thus far, most have projected quantum supremacy to arrive with systems on the order of 100 qubits, so how was this possible?  What problem had they solved?  According to the article, the use case involved “comparing [the] quantum processor against state-of-the-art classical computers in the task of sampling the output of a pseudo-random quantum circuit.”  It appears the goal was to create and sample randomness across the set of output bits, which is cool but not immediately clear how it would solve a real-world problem.   The quantum processor used was a 54-qubit system called “Sycamore.”
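To make that task a bit more concrete, here’s a minimal toy sketch of random circuit sampling – my own illustration in plain NumPy, not Google’s code or their actual circuits: apply layers of random single-qubit rotations plus entangling gates to a small register, then sample output bitstrings from the resulting state.

```python
# Toy random-circuit sampling (my own sketch, not Google's experiment):
# random single-qubit rotations + CZ entangling layers, then sample bitstrings.
import numpy as np

rng = np.random.default_rng(0)
n_qubits, depth, shots = 5, 8, 20   # toy sizes; Sycamore ran at 53-54 qubits

def apply_single_qubit(state, gate, target, n):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.moveaxis(state, target, 0)
    state = np.tensordot(gate, state, axes=(1, 0))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

def apply_cz(state, q0, q1, n):
    """Apply a controlled-Z between qubits q0 and q1 (phase flip on |11>)."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q0], idx[q1] = 1, 1
    state[tuple(idx)] *= -1
    return state.reshape(-1)

def random_unitary(rng):
    """Random 2x2 unitary via QR decomposition of a random complex matrix."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

state = np.zeros(2 ** n_qubits, dtype=complex)
state[0] = 1.0                      # start in |00...0>

for layer in range(depth):
    for q in range(n_qubits):       # layer of random single-qubit rotations
        state = apply_single_qubit(state, random_unitary(rng), q, n_qubits)
    for q in range(layer % 2, n_qubits - 1, 2):   # alternating entangling layer
        state = apply_cz(state, q, q + 1, n_qubits)

probs = np.abs(state) ** 2          # Born-rule output distribution
probs /= probs.sum()
samples = rng.choice(2 ** n_qubits, size=shots, p=probs)
print([format(int(s), f"0{n_qubits}b") for s in samples])
```

I kept it to plain NumPy so it runs without any quantum SDK; a framework like Qiskit or Cirq would express the same idea more compactly. The key point is that the statevector holds 2^n complex amplitudes, so pushing n toward 50+ makes this brute-force classical approach blow up – which is exactly what makes the sampling task hard for classical machines and native for the quantum processor.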

Interestingly, it didn’t take long for IBM to dispute Google’s results, and publicly at that.   Per the IBM Research blog:

Google’s experiment is an excellent demonstration of the progress in superconducting-based quantum computing, showing state-of-the-art gate fidelities on a 53-qubit device, but it should not be viewed as proof that quantum computers are “supreme” over classical computers.

Ouch.  However, IBM’s analysis seems rather credible.  IBM points out that the benchmark algorithm Google chose was synthetic in nature, and argues that, given the extensive classical computing tools and resources available today, Google’s quantum runtime can be matched using enhanced memory and disk-storage techniques on classical computers – by IBM’s estimate, roughly 2.5 days for the same task.  (IBM would also likely argue that “quantum volume,” not raw qubit count, is the better measure of progress.)

The important point here, which IBM did acknowledge, is that Google’s work does demonstrate a step forward in quantum computing.   I can’t help but wonder if it is a coincidence that the quantum computer Google used had 54 qubits, exactly one qubit more than the system IBM had announced just before.  IBM was forced to address this in its post, acknowledging as much and ultimately crediting Google with at least demonstrating progress in quantum computing technology.  As competitive as the quantum race has become, one wonders if “no press is bad press” applies here: one extra qubit, just enough for Google to smirk a little at IBM in the appropriate forums.

Figure 1. Analysis of expected classical computing runtime vs. circuit depth for the “Google Sycamore Circuits.” The bottom (blue) line estimates the classical runtime for a 53-qubit processor (2.5 days at circuit depth 20), and the upper (orange) line does so for a 54-qubit processor.

As a recap, here’s a quick scatter plot of the system announcements by IBM & Google – scatter plot for now until a better trend-line visual makes sense. I’ll work on keeping this up to date as new announcements occur.


Figure 2. Qubits by Company (scatter plot).

Thanks again for reading!
