Looking beyond quantum supremacy

Striking progress modelling molecular vibrations with a quantum simulator brings the abstract quest for quantum supremacy into context. Is this a pointer to medium-term commercial applications?

The world is waiting for a first demonstration of quantum supremacy, a computation not practical on any conventional computer. However, behind that headline will lie a number of questions, not least what it tells us about the timeline for commercially useful applications to emerge. Many of the revolutionary benefits of quantum information processing require full fault-tolerant quantum computing (FTQC), but that is still many years away. What might be possible during the era of noisy intermediate-scale quantum (NISQ) technology?

True quantum supremacy

Headlines often focus on quantum speed comparisons with the best known equivalent algorithm running on a conventional supercomputer (Fact Based Insight calls this quantum advantage). However, this obscures the key point that quantum computers run completely new quantum algorithms unavailable to conventional computers. For true quantum supremacy, theorists also require a mathematical proof that the existence of an efficient conventional algorithm would contradict other widely believed mathematical results (specifically the ‘collapse of the polynomial hierarchy’). A conventional computer can never catch up.

When Caltech professor John Preskill coined the term in 2012, there was already a realisation that a demonstration of quantum supremacy might not have to wait for FTQC to be perfected. Instead, a range of intermediate approaches has been proposed, including: boson sampling; instantaneous quantum polynomial-time (IQP) circuits; quantum approximate optimisation algorithm (QAOA) circuits; random quantum circuits; quantum annealing; and others.

These differ in terms of the hardware they require and the tightness of the mathematical supremacy proof they offer. Perhaps more importantly, they also differ in the future uses and applications they point towards. While one approach may claim the glory of being first, a long road remains.

This journey is not without risk. Few quantum scientists doubt the long-term transformative worth of quantum technology. However, many harbour concerns that short-term hype will lead to a backlash of disillusionment if expectations for progress are not reasonably managed (20).

Google Bristlecone and random quantum circuits

Bristlecone quantum processor. Credit: Google

Many expect that Google will announce an initial demonstration of quantum supremacy based on their new Bristlecone processor and the task of sampling the output of random quantum circuits they have defined (21), sketched in code below:

  1. Take a 2D array of qubits (e.g. 7×7 or 6×12 lattice) with nearest neighbour connectivity
  2. Apply a Hadamard gate to each qubit to create a superposition
  3. Apply two qubit (CZ) gates to entangle neighbouring qubits
  4. Apply a randomly chosen gate (T, X^1/2 or Y^1/2) to each qubit
  5. Repeat steps 3 & 4 to add layers of depth to the circuit

The best known conventional simulation is exponentially hard in the number of qubits and circuit depth. For a 7×7 lattice, a depth of 40 layers of gates is anticipated to be sufficient to demonstrate quantum supremacy. A ‘cross-entropy’ benchmark test is proposed to establish the validity of the output.
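
To make the recipe concrete, here is a minimal numpy sketch of the steps above on a toy 2×2 lattice, ending with a simplified cross-entropy style score. The lattice size, depth, all-neighbours CZ pattern and scoring are illustrative simplifications, not Google's actual benchmark code.

```python
import numpy as np

rng = np.random.default_rng(0)
ROWS, COLS = 2, 2       # toy lattice; Google propose 7x7 or 6x12
N = ROWS * COLS

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
T = np.diag([1, np.exp(1j * np.pi / 4)])
X_HALF = 0.5 * np.array([[1 + 1j, 1 - 1j], [1 - 1j, 1 + 1j]])   # sqrt(X)
Y_HALF = 0.5 * np.array([[1 + 1j, -1 - 1j], [1 + 1j, 1 + 1j]])  # sqrt(Y)

def apply_1q(state, gate, q):
    """Apply a single-qubit gate to qubit q of the statevector."""
    psi = np.tensordot(gate, state.reshape((2,) * N), axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, q1, q2):
    """CZ is diagonal: flip the sign where both qubits are |1>."""
    psi = state.reshape((2,) * N).copy()
    idx = [slice(None)] * N
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def neighbours():
    """Nearest-neighbour pairs on the ROWS x COLS lattice."""
    pairs = []
    for r in range(ROWS):
        for c in range(COLS):
            q = r * COLS + c
            if c + 1 < COLS:
                pairs.append((q, q + 1))
            if r + 1 < ROWS:
                pairs.append((q, q + COLS))
    return pairs

# Step 2: Hadamards create an even superposition.
state = np.zeros(2 ** N, dtype=complex)
state[0] = 1.0
for q in range(N):
    state = apply_1q(state, H, q)

# Steps 3-5: alternate entangling and random single-qubit layers.
for _ in range(8):
    for q1, q2 in neighbours():
        state = apply_cz(state, q1, q2)
    for q in range(N):
        state = apply_1q(state, [T, X_HALF, Y_HALF][rng.integers(3)], q)

probs = np.abs(state) ** 2  # ideal bitstring distribution

# Simplified cross-entropy style score: samples from a faithful device land
# preferentially on high-probability bitstrings, scoring lower (better)
# than uniform noise would.
samples = rng.choice(2 ** N, size=1000, p=probs)  # stand-in for device output
print("device-like score :", -np.mean(np.log(probs[samples])))
print("uniform-noise score:", -np.mean(np.log(probs)))
```

At four qubits the ideal distribution can of course be computed exactly; the point of the 7×7, depth-40 proposal is precisely that it cannot.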

This algorithm is suited to the gate-model superconducting qubit hardware that Google is developing. The theoretical and quantitative work they have already prepared allows them to present their proposed calculation as a true demonstration of quantum supremacy (24). They also make a strong defence that this is not just a piece of PR, but a legitimate engineering milestone showing that they really can operate their device at the demanding levels of coherence and fidelity required. When achieved, it will rightly be celebrated as a landmark scientific achievement on one of the mainstream roads towards FTQC.

There is a danger, however, that the public, or even educated business opinion, will fail to engage with the nature of the random circuits problem. Simulating such an output appears very abstract and far from any useful application. Pointing out that it is similar to a laser speckle pattern may intrigue physicists, but won’t be meaningful to most.

Boson sampling revisited

Integrated photonics chip. Credit: N. Matsuda at NTT

Boson sampling is another much-discussed approach to demonstrating quantum supremacy (in fact the original 2010 proposal of Aaronson & Arkhipov predates the coining of the term). Our ability to control individual photons is one of the key enabling technologies that has kick-started the second quantum revolution. It was therefore natural that an early idea for demonstrating quantum supremacy should be based on photonic hardware (though proposals do exist to implement boson sampling with other quantum technologies).

The algorithm depends on measuring (sampling) the statistical distribution of the output produced when multiple indistinguishable photons move through a network of optical pathways where they interfere quantum mechanically. Computing the final pattern of photons expected to emerge is intractable for a conventional computer.
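
The source of that intractability is that each output probability involves the permanent of a submatrix of the interferometer's unitary transfer matrix, and computing permanents is #P-hard. Below is a minimal sketch of the arithmetic, assuming a randomly generated unitary as a stand-in for a real interferometer; the mode counts and mode choices are invented for illustration.

```python
import numpy as np
from itertools import combinations

def permanent(a):
    """Permanent via Ryser's formula -- O(2^n) terms, still exponential."""
    n = a.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            total += (-1) ** k * np.prod(a[:, cols].sum(axis=1))
    return (-1) ** n * total

def random_unitary(m, rng):
    """Haar-random unitary (QR trick) standing in for an interferometer."""
    z = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

rng = np.random.default_rng(1)
m = 6                            # 6 optical modes (illustrative)
U = random_unitary(m, rng)
in_modes = [0, 1, 2]             # one photon injected into each of these

# Probability of one photon emerging in each of modes 1, 3 and 5: the
# squared permanent of the corresponding 3x3 submatrix of U.
out_modes = [1, 3, 5]
amp = permanent(U[np.ix_(out_modes, in_modes)])
print("P(1,3,5) =", abs(amp) ** 2)
```

Each output pattern needs its own permanent, and the cost of each permanent doubles with every added photon, which is the origin of the conventional intractability.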

Much of the academic literature deals with the hard, dry task of proving the mathematical basis on which such a linear optical processor can claim quantum supremacy. However, it has proved challenging to implement a device of the scale required for such a demonstration. Photonics is currently benefiting from a new generation of improved single photon sources, single photon detectors and the silicon-based fabrication of miniaturised optical chips. This is revolutionising our ability to design programmable devices of greater power. However, the 50-90 photon scale required to demonstrate quantum supremacy remains tantalisingly beyond the capability of the current generation of devices. In particular, further progress is required to reduce the impact of photon loss.

Narrowly defined, boson sampling may seem like just another artificial problem. However, it has recently been shown that strong links exist to very real applications with industrial relevance.

Bosons and fermions are the two basic classes into which all fundamental particles fall. Multiple fermions cannot occupy the same quantum state at the same time (which is why everyday objects have substance: cups don’t fall through tables). Bosons, on the other hand, are allowed to pile up together (this is the basis of how a laser works).
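
This statistical difference shows up directly in two-particle interference at a 50:50 beam splitter: two-particle amplitudes combine via the matrix permanent for bosons and via the determinant for fermions. A toy calculation (using one common convention for the beam splitter unitary) shows identical photons always exiting together, while identical fermions never do:

```python
import numpy as np

# 50:50 beam splitter unitary acting on two modes (one common convention).
bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

# One particle enters each input port. Amplitude for one particle at each
# output port: permanent for bosons, determinant for fermions -- the sign
# flip on the second term is the whole story.
perm = bs[0, 0] * bs[1, 1] + bs[0, 1] * bs[1, 0]
det = bs[0, 0] * bs[1, 1] - bs[0, 1] * bs[1, 0]
print("P(coincidence) bosons  :", abs(perm) ** 2)  # 0.0 -- photons bunch
print("P(coincidence) fermions:", abs(det) ** 2)   # 1.0 -- they anti-bunch
```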

Importantly, many other quantum systems of interest are often analysed in terms of emergent quanta (quasiparticles) that share the statistical properties of bosons. Examples include high temperature superconductors, superfluids and molecular vibrations. A recent set of results from Anthony Laing’s group at the University of Bristol, published in Nature (Sparrow, Martín-Lopéz, et al. (25)), illustrates just how interesting these applications may be.

Molecular quantum vibrations

The idea of using tuned lasers to control chemical reactions by selectively breaking chemical bonds is a long-standing one (i.e. shake the molecule in just the right way and it will fall apart at exactly the place you want). A successful technique would have numerous potential applications, from cheaply catalysing industrial chemical reactions to the precision preparation of biopharmaceuticals.

Putting this into practice has been difficult, however. Energy from a laser pulse spreads rapidly as vibrational energy across the targeted molecule (it falls apart, but not necessarily at the place you want). Quantum superposition and indeterminacy make this process impossible to model conventionally beyond simple cases and approximations. Active control of precisely how the molecule is excited over multiple calibrated pulses is required on ultra-fast timescales. Only limited progress has been possible using adaptive feedback control (trial-and-error) (26). Furthermore, as quantum control techniques advance it may be possible to use isolated quantum states of light for molecular control. The task of modelling these interactions is even more intractable for conventional computers.

In their recent paper, Sparrow, Martín-Lopéz, et al. (25) show how their linear optical processor can be used to simulate a selection of illustrative cases:

  • Vibrational dynamics for four-atom molecules – including H2CS, SO3, HNCO, HFHF, N4 and P4
  • Energy transfer in the biomolecule N-methylacetamide (NMA), the simplest molecular model of the peptide bond in proteins
  • Vibrational relaxation in water (H2O)
  • Identification of dissociative pathways for ammonia (NH3) to augment adaptive feedback control

Alternative vibrational modes of SO3 and HNCO (blue & black). Credit: Sparrow, Martín-Lopéz, et al.

The molecular vibrations obey the same bosonic statistics as the photons in the linear optical processor. This is an example of an analogue quantum simulator – a device set up as a direct model of the system under study. While any FTQC device (when available) could do this calculation, a simulator has the advantage in the short and medium term if it can exploit a similarity in the underlying physics or mathematics of the problem.
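
To make the analogy concrete: in the harmonic approximation a molecule's vibrations are coupled oscillators, and their single-excitation dynamics is exactly the kind of mode-mixing unitary a programmable photonic chip can enact on optical modes. Below is a minimal numpy sketch of that correspondence for two coupled local modes; the frequency and coupling values are invented purely for illustration and are not taken from the paper.

```python
import numpy as np

# Two coupled local vibrational modes in the harmonic, single-excitation
# approximation (parameters invented for illustration).
omega, g = 1.0, 0.1
H = np.array([[omega, g],
              [g, omega]])

# Normal modes just accumulate phases; transform back to local modes.
evals, V = np.linalg.eigh(H)

def evolve(t):
    """Single-excitation evolution operator in the local-mode basis."""
    return V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T

# One vibrational quantum starts in local mode 0 and beats back and forth --
# the kind of energy transfer the photonic chip tracks with single photons.
psi0 = np.array([1.0, 0.0])
for t in np.linspace(0.0, np.pi / (2 * g), 6):
    p = np.abs(evolve(t) @ psi0) ** 2
    print(f"t={t:6.2f}  P(mode 0)={p[0]:.3f}  P(mode 1)={p[1]:.3f}")
```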

This is not quantum supremacy. At this scale the modelled calculations can be reproduced conventionally. However, these results are a powerful indication of potential future applications.

Actions for Business

Quantum supremacy is not a single milestone, but one that will be repeated on multiple parallel pathways. Many approaches are pursuing the same ultimate goal of FTQC, but the short and medium term possibilities they support will differ. Those seeking early commercial applications should pay attention to where particular technologies and device configurations have short- and medium-term advantages in the niche they are pursuing. In particular, quantum simulators could have a big impact during the NISQ era. Fact Based Insight believes that their importance is currently often overlooked in popular coverage.


David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn