Quantum computing is expected to give a revolutionary impulse to high-performance computing. Like a few other notable advances of modern physics, the idea of using quantum computers to simulate quantum systems too hard to calculate on conventional digital computers was proposed in 1981 by Richard Feynman (again…). A number of quantum algorithms have since been devised, also in domains far from quantum physics, to solve problems believed to be classically hard to compute, such as the famous integer factoring at the basis of modern cryptography. While for many years quantum computing advanced at a fast pace essentially as a theoretical science, recent years have witnessed a growing number of real machines capable of carrying out a quantum computation on a physical device.

In an attempt to measure technological progress, the notion of quantum supremacy was thus introduced, namely the proof that a quantum computer can perform a given calculation much faster than a classical supercomputer; or, according to other definitions, a calculation that a classical computer cannot do in any “reasonable” amount of time. For the record, the choice of words was accompanied by a short-lived debate, since a handful of academics stated in a letter to Nature that the term supremacy might be perceived as having overtones of violence, neo-colonialism and racism through its association with “white supremacy”. The writers noticed that similar “violent” language has crept into other branches of science as well, e.g. spaceflight, where such evocative terms as ‘conquest’, ‘colonization’ or ‘settlement’ should be contextualized against ongoing issues of neocolonialism. (While starting to get concerned about the violent use of daggers in matrix algebra, I leave to you readers any personal judgement about such a polemic. However, since we are talking about quantum mechanics and its quirks, which according to some eminent colleagues could be explained by a theory of many universes, I sometimes wonder: how did we get stuck in this one…?)

By all means, any quantum computation is an instance of a physics experiment. However, at present such experiments are rather a proof that there are no fundamental physical limitations prohibiting a quantum speed-up in computing, whereas the actual computational problem solved is of little or no practical use. Much further work will be needed, and many challenges overcome, for quantum computers to show true quantum advantage, i.e. to become integral components of workflows for efficiently solving real-world problems. Notably, a key technological step will be their integration with the existing hardware and software infrastructure, as history has repeatedly shown that even the most disruptive technologies go through a transitional period in which old and new coexist.

As we all know by now, the qubit works as the quantum analog of the classical computational bit: it can exist in a superposition of 0 and 1, whereas the classical bit can only be either 0 or 1 (a measurement of the qubit, however, still returns either 0 or 1). Mathematically, the pure state of a qubit can be represented as a two-component (Weyl) spinor; a single-qubit logic gate is a quantum operator that can be written as an SU(2) matrix (up to a global phase), and any multi-qubit gate can be decomposed into single-qubit gates plus a two-qubit entangling gate; the change of the state of a qubit upon applying a gate is then a unitary transformation, i.e. a matrix-vector product. Both qubits and gates can be practically realized by a variety of different physical analogs, assembled in a general-purpose quantum computer that directly exploits superposition, entanglement and wave-function interference to perform a calculation. As first suggested by David Deutsch in 1985, just a few years after Feynman’s speech, such a universal quantum computer could in principle perform any computation a classical machine can, for some classes of problems with an exponential speed-up over classical computers and algorithms. Complete universality, however, would require arranging a sufficient number of high-quality qubits in a functional architecture, programmable for any given problem. Some estimates quote one million qubits as both a practically reachable and practically useful target.
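To make the linear-algebra picture concrete, here is a minimal NumPy sketch of my own (not tied to any particular quantum hardware or library): a qubit state as a two-component vector, a gate as a unitary 2×2 matrix, and the Born-rule probabilities after applying it.

```python
import numpy as np

# A pure qubit state |psi> = a|0> + b|1>, stored as a two-component complex vector
psi = np.array([1.0, 0.0], dtype=complex)            # start in |0>

# The Hadamard gate: a 2x2 unitary matrix (an SU(2) rotation up to a global phase)
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

assert np.allclose(H.conj().T @ H, np.eye(2))        # unitarity check

psi = H @ psi                     # applying a gate = matrix-vector product
probs = np.abs(psi) ** 2          # Born rule: measurement probabilities

print(psi)    # [0.707+0j, 0.707+0j]  -> equal superposition of |0> and |1>
print(probs)  # [0.5, 0.5]            -> a measurement still returns either 0 or 1
```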

To date, general-purpose quantum computers are most often based on ion traps or superconducting circuits. Apparently, the first demonstration of quantum supremacy was performed in 2019 on the general-purpose Sycamore processor with 54 superconducting programmable qubits (of which 53 were actually usable), built by a team of Google engineers in Santa Barbara, CA [Nature 574, 505-510 (2019)]. The whole chip including connections occupies just a few cm² and is hosted in a tall cylindrical container cooled to 20 mK, so that the Cooper pairs in each qubit flow without resistance for the roughly 100 microseconds of coherence time. The qubits are arranged in a rectangular 9×6 Ising layout, with gates operating either on single qubits or on pairs of neighboring qubits. In superconducting quantum computers, quantum gates are implemented with microwave pulses of different durations, which drive the circuit between its ground and first excited level (equivalent to a 0/1, or spin-up/down). In Sycamore, a single-qubit gate is realized by driving 25-ns pulses resonant with the qubit transition while the qubit-qubit coupling is turned off; two-qubit entangling gates are obtained by bringing neighboring qubits on resonance and turning on a 20-MHz coupling for 12 ns, which allows the qubits to swap excitations. In the practical implementation, gates are randomly assigned to the N qubits in M successive cycles with a repetitive pattern, generating a sequence of random circuits whose output is then sampled to produce a probability distribution over bit strings. The output bit strings are pseudo-random, because quantum interference makes certain strings more likely than others. Note that this is a transposition of the same principle behind the classical double-slit experiment at the basis of particle-wave duality.
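As a rough illustration of why random circuit sampling gives a non-uniform output, here is a toy state-vector simulation of my own: Haar-random single-qubit gates plus CZ gates on a handful of qubits, not Google’s actual gate set, but it shows the same interference effect, with some bit strings coming out far more likely than the uniform 1/2^N.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 4, 10                     # toy size: 4 qubits, 10 cycles
dim = 2 ** N

def haar_u2():
    """A Haar-random 2x2 unitary (random single-qubit gate)."""
    z = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

CZ = np.diag([1, 1, 1, -1]).astype(complex)

def on_pair(gate4, i, n):
    """Embed a two-qubit gate acting on neighbours (i, i+1) into an n-qubit register."""
    return np.kron(np.kron(np.eye(2 ** i), gate4), np.eye(2 ** (n - i - 2)))

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                                   # all qubits start in |0>

for cycle in range(M):
    # one layer of independent random single-qubit gates
    layer = haar_u2()
    for _ in range(N - 1):
        layer = np.kron(layer, haar_u2())
    state = layer @ state
    # entangling CZ gates on alternating neighbour pairs
    for i in range(cycle % 2, N - 1, 2):
        state = on_pair(CZ, i, N) @ state

probs = np.abs(state) ** 2
print("uniform probability            :", 1 / dim)
print("max / min bit-string probability:", probs.max(), probs.min())
# interference concentrates weight on some bit strings (Porter-Thomas-like statistics)
```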

To establish the claim of supremacy, they attempted a corresponding classical calculation on the JUWELS supercomputer in Jülich, with 100,000 cores and 250 TBytes of memory. Memory is here the practical limit, since it allows storing the full state vector only up to N=43 qubits, beyond which the required memory exceeds anything available (it doubles with every added qubit). By using a different algorithm, more memory-efficient but with worse scaling performance, they could also simulate on the Summit supercomputer at Oak Ridge (250 PBytes of storage) up to M=20 layers. By extrapolation, the Google engineers estimated that a sampling which required about 200 seconds on Sycamore would take of the order of 10,000 years on a million-core supercomputer. However, their suprematist claim was questioned a few months later by a team of IBM engineers (whose best quantum computer, Q System One, is also based on superconducting qubits) who devised a much faster classical algorithm, notably requiring a constant amount of memory for any problem size, based on which they predicted (yet without performing it) that the calculation would come down to about 2-3 days.
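The memory wall is easy to quantify: a full state vector of N qubits holds 2^N complex amplitudes, i.e. about 16·2^N bytes in double precision (my back-of-the-envelope assumption). A few lines of Python show why roughly 43 qubits saturate a 250-TB machine.

```python
# Memory needed to store the full double-precision state vector of N qubits
for N in (40, 43, 44, 53, 56):
    terabytes = (2 ** N) * 16 / 1e12      # 2^N complex amplitudes, 16 bytes each
    print(f"N = {N:2d} qubits -> {terabytes:12.1f} TB")
# N = 43 -> ~141 TB (fits in 250 TB); N = 44 -> ~281 TB (already too much);
# N = 53 -> ~144,000 TB, far beyond the RAM of any existing machine
```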

Last week’s PRL published two striking, back-to-back papers by the Chinese group at the University of Science and Technology of China in Hefei, led by J.-W. Pan (PRL 127, 180501 and 180502; notably, they use the wording “quantum advantage” in the titles and “quantum supremacy” in the text, whereas the related APS Viewpoint by Barry Sanders – https://physics.aps.org/articles/v14/147 – uses “quantum primacy”). In the first paper they present a new quantum computer called Zuchongzhi, with 66 superconducting qubits arranged in an 11×6 Ising lattice, which is a close copy of Google’s Sycamore. Just like Sycamore and IBM’s Q System, it uses transmon qubits (an improved Cooper-pair box, whose original concept was developed at CEA Saclay in Gif-sur-Yvette, see Phys. Scripta T76, 165 (1998)). They repeated Google’s random circuit sampling experiment with N=56 qubits and M=20 cycles. Hefei’s result could look like just a small increase over Google’s 53-qubit claim of quantum supremacy. However, if you are familiar with Hilbert spaces you realize that going from 53 to 56 qubits multiplies the dimensionality by a factor of eight, greatly increasing the demand of classical computational resources: their comparative estimate, even using the better algorithm proposed by IBM, is of more than 8 years on a conventional supercomputer.
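For the record, that factor of eight is just exponent counting (my own one-line arithmetic):

$$
\frac{\dim\mathcal{H}_{56}}{\dim\mathcal{H}_{53}} = \frac{2^{56}}{2^{53}} = 2^{3} = 8,
\qquad
2^{56} \approx 7.2\times 10^{16}\ \text{complex amplitudes}.
$$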

The second, even more striking paper by Pan & co. is a confirmation of their impressive feat of realizing two quantum computers with two different technologies less than one year apart. The group had published in December 2020 a photonic implementation of the boson sampling model [Science 370, 1460–1463 (2020)]. This new PRL paper improves on that already excellent result by several orders of magnitude. Setting aside error correction and other open issues, photon-based quantum computing could have an edge over other existing technologies, two main advantages being that it works at room temperature and that the qubits have very long coherence times. Using only linear optical elements (mirrors, beam splitters and phase shifters), any arbitrary single-qubit unitary operation, i.e. any single-qubit gate, can be reproduced. This concept was proposed in 2000 by E. Knill, R. Laflamme and G. J. Milburn, hence the acronym “KLM protocol” for the earliest version of the scheme, which has subsequently gone through a number of improvements and extensions. The flip side of the coin is that a photon-based implementation is not a very compact architecture, and an assembly of just a few photonic qubits requires a big room. However, there are already attempts at designing photonic quantum microcircuits, such as the ones proposed by the Canadian startup Xanadu.
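As a concrete (and hedged) example of the linear-optics claim: in the dual-rail encoding, where the qubit is a single photon shared between two modes, a balanced beam splitter sandwiched between two phase shifters already reproduces the Hadamard gate. The sketch below assumes the symmetric beam-splitter convention; other conventions just shuffle the phases around.

```python
import numpy as np

# Dual-rail photonic qubit: |0> = photon in mode a, |1> = photon in mode b.
# Symmetric 50:50 beam splitter, and a phase shifter acting on mode b:
BS = np.array([[1, 1j],
               [1j, 1]], dtype=complex) / np.sqrt(2)

def PS(phi):
    """Phase shifter on the second mode."""
    return np.diag([1, np.exp(1j * phi)])

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# phase shifter -> beam splitter -> phase shifter = Hadamard (exactly, in this convention)
U = PS(-np.pi / 2) @ BS @ PS(-np.pi / 2)
print(np.allclose(U, H))   # True

# Tuning the beam-splitter reflectivity and the two phases reaches any single-qubit
# unitary (up to a global phase); the entangling gates are the hard, probabilistic
# part of the KLM scheme.
```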

The boson sampling experiment was devised in 2011 by two MIT computer scientists, Scott Aaronson and Alex Arkhipov. It entails sampling the probability distribution of many bosons (notably, photons) whose quantum waves interfere with one another in a way that randomizes their positions. The probability of detecting a photon at a given position behind, e.g., a diffraction grating could in principle be calculated, but it involves practically intractable matrices. Such a calculation is classified as a #P-hard problem, its cost increasing exponentially with the number of variables. For a set of ordinary bosons, the matrix function to compute is the permanent, a kind of determinant in which you forget to apply the sign changes; for Gaussian states of light (squeezed photons), it is a weirder function called the Torontonian (so named because its proponents were from Toronto, Canada), which amounts to computing 2^N determinants of submatrices of a 2N×2N matrix. Already for a few tens of bosons, Aaronson and Arkhipov showed that there is no classical shortcut for these impossibly long calculations. In their first Science paper, the Hefei group used N=50 indistinguishable single-mode Gaussian squeezed states (generated by a multiply split laser beam) as input, injected them into a 100-mode ultralow-loss interferometer to produce entangled states, and sampled the output with 100 high-efficiency single-photon detectors. Note that the unitary transformation that entangles all 50 photons is a 100×100 matrix, whose “Torontonian” requires computing about 10^15 determinants of 100×100 submatrices: even if each determinant took just 1 second (ehm…), the calculation on a single classical CPU would require about 32 million years… By obtaining up to 76-photon coincidences in about 200 seconds, they measured a sampling rate about 10^14 times faster than state-of-the-art classical simulation strategies running on supercomputers. In the recent PRL paper, the interferometer was upscaled to a record 144 modes, giving up to 113-photon coincidence events, which increases the Hilbert-space dimension to 2^144 ≈ 10^43 and the sampling advantage over a classical computer to about 10^24. The key point here is that no classical algorithm can be practically used to verify this calculation, therefore the “supremacy” is undisputed.
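To give a feel for why these matrix functions blow up, here is a sketch of mine of the permanent computed with Ryser’s formula, whose cost still grows like 2^n terms (the Torontonian behaves analogously, summing ~2^N determinants). The timing line at the end is just the same back-of-the-envelope arithmetic as in the text.

```python
from itertools import combinations, permutations
import numpy as np

def permanent_ryser(A):
    """Permanent via Ryser's formula: ~2^n terms, vs n! for the naive definition."""
    n = A.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            rowsums = A[:, cols].sum(axis=1)      # row sums over the chosen columns
            total += (-1) ** k * np.prod(rowsums)
    return (-1) ** n * total

def permanent_naive(A):
    """Textbook definition: sum over all n! permutations, no sign changes."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

A = np.random.default_rng(1).random((6, 6))
print(np.isclose(permanent_ryser(A), permanent_naive(A)))   # True

# Back-of-the-envelope: ~10^15 determinant evaluations at 1 second each
seconds = 1e15
print(f"{seconds / (3600 * 24 * 365):.1e} years")   # ~3.2e7, the ~32 million years above
```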

Have we then entered the new age of quantum technology domination? Are we going to see more and more quantum devices penetrating our daily life, such as the latest “time crystal”, with all of their quirks of uncertainty, entanglement, teleportation and the like? Of course, the most cited example of the “weirdness” of quantum mechanics is the famous Schrödinger paradox of the cat that would be simultaneously alive and dead. The “paradox” has indeed a direct connection with the notion that the state of a quantum bit remains undetermined until measured. Hundreds of pages, entire books, have been written on this subject, trying to explain the paradox in simple yet logical terms. I wonder whether they ever succeeded in letting the layman and laywoman grasp at least some idea of what the paradox means. I even doubt that a consensus explanation has been agreed upon by my respectable colleagues. I can give you my own view, though. In one of the last scenes of The Good, the Bad and the Ugly by Sergio Leone, Clint “Blondie” Eastwood and Eli “Tuco” Wallach are seen in a cemetery, digging out of a grave a bag with $200,000 worth of Confederate gold. The extreme realism of Sergio Leone required a real skeleton to be put in the grave, which was procured in a singular way: the bones shown next to the two actors belonged to a former Spanish actress, who had asked in her last will to be able to continue her career even after her death.

Now think about it… was she dead and/or alive in that scene?
