Quantum Algorithms Outlook 2020

Contrary to many reports, there are set to be immediate applications even for early devices. Some will be controversial. In other areas, however, timelines look over-hyped. 2020 will be a key year to prove the naysayers wrong.

Will clients see early value?

Despite all the fuss, quantum processors actually run more slowly than conventional processors. It’s only the unique quantum algorithms they support that promise significant advantages.

Since the 1990s, quantum algorithms have been developed and adapted to address a wide range of problems. When combined with powerful enough hardware they typically promise significant (quadratic) speedups, and in some cases remarkable (exponential) ones.

Much work in 2019 has focused on separating what requires large scale FTQC versus where quantum advantage might be achieved with early NISQ devices. The most promising ideas for the next 2-5 years are typically hybrid classical-quantum heuristic approaches requiring trial and error to develop.

For an overview of recent work read Quantum Software – over-hyped but underestimated.

Intriguingly, early movers are showing that in some cases a ‘quantum-inspired’ approach is all it takes, even on pre-supremacy and conventional hardware.

Cryptography – already something new?

Much comment following Google’s quantum supremacy announcement welcomed the achievement of an engineering milestone, but concluded that ‘of course there is no immediate application’.

Jessica Pointing (Stanford) interviews Scott Aaronson (UT at Austin) at Q2B

It is important to realise that this is not what John Martinis (Google) or Scott Aaronson (UT at Austin) is saying. Both point to the potential for Sycamore-era devices to provide provably random numbers (aka certified public random numbers). This is different from the service QRNG devices offer. QRNGs are useful, but we must still trust the manufacturer to have built an honest device (perhaps supported by assurance testing and certification) and we must still trust its operator not to cheat.

Aaronson’s radically new proposal uses a variation of the Google supremacy algorithm to allow us to prove that the random numbers produced are from a verifiably quantum distribution. Crucially, this can be done remotely and openly. This allows new applications where we need to certify publicly (without having to trust any central party) that a number is random. Modest use cases would appear to include lotteries and audit. The potential application of this technique in proof-of-stake cryptocurrencies could be much more significant.
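
Aaronson’s protocol itself is still unpublished, but the flavour of the verification step can be illustrated with the linear cross-entropy benchmark used in Google’s supremacy experiment: samples drawn from the genuine quantum distribution score near 1, while classically spoofed uniform samples score near 0. The sketch below uses a made-up stand-in for a random-circuit output distribution, not a real circuit simulation:

```python
import numpy as np

def linear_xeb(ideal_probs, samples):
    """Linear cross-entropy benchmark: F = 2^n * mean(p(x_i)) - 1.
    F ~ 1 for samples from the ideal quantum distribution,
    F ~ 0 for uniformly random (classically spoofed) samples."""
    n = int(np.log2(len(ideal_probs)))
    return (2 ** n) * np.mean([ideal_probs[s] for s in samples]) - 1

rng = np.random.default_rng(0)
n = 10
# Stand-in for a random-circuit output distribution (Porter-Thomas-like).
p = rng.exponential(size=2 ** n)
p /= p.sum()

honest = rng.choice(2 ** n, size=20000, p=p)  # device sampling the ideal distribution
spoof = rng.integers(0, 2 ** n, size=20000)   # classical uniform guessing

print(round(linear_xeb(p, honest), 2))  # close to 1
print(round(linear_xeb(p, spoof), 2))   # close to 0
```

The certification idea is that scoring well on this statistic is (conjecturally) classically hard, so a high score is evidence the samples really did come from a quantum device.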

The Blockchain Trilemma – Bitcoin hasn’t taken over the world because in the end the proof-of-work consensus algorithm on which it depends does not scale well and leads to transaction settlement times and costs inconsistent with mass adoption (the environmental impact of its excessive energy use is also obscene). Proof-of-stake is a conceptually attractive alternative approach. However, practical implementations have always had to compromise across a ‘trilemma’ of security, scalability and decentralisation challenges. Facebook’s controversial Libra cryptocurrency proposes a ‘permissioned’ proof-of-stake scheme. However, ‘permissioned’ means that there is inevitably a central group of operators that, unlike Bitcoin miners, cannot easily escape regulatory oversight; hence the successful political pressure on early Libra backers to withdraw. The underlying purpose of these blockchain consensus protocols is simply to randomly determine who has the right to add the next block of transactions to the distributed ledger. Publicly provable random numbers are potentially the key missing resource needed to allow a truly streamlined permissionless scheme.

It has been common for quantum gurus to muse that ‘the most important applications of quantum computers will be ones that nobody has thought of yet’. They will feel vindicated if the first significant application turns out to be one that literally no one had thought of until Google started messing around with random quantum circuits.

Quantum Simulation – the killer app

The ability to perform calculations in quantum chemistry, physics and materials design is the original, and for many still the slam dunk killer application for quantum computers. This may sound niche to the lay person but it shouldn’t. Chemistry can be dull, but it’s vitally important for almost every aspect of the modern world, from pharmaceuticals to industrial processes to renewables technologies. The only reason we don’t routinely use such quantum chemistry calculations is that their native quantum-complexity is beyond conventional computers.

Ryan Babbush, Google

In 2019 Google presented detailed work on what it would take to perform calculations on a large molecule of significant interest. FeMoco (Fe7MoS9C) is the active site within the nitrogenase enzyme responsible in nature for the catalytic conversion of nitrogen gas into ammonia (fertilizer). If we could better understand FeMoco then potentially we could improve the energy-intensive Haber process used for the industrial production of fertilizer. This would not just be a commercial breakthrough, but also one with very positive implications for feeding the planet and limiting climate change. Google’s work currently shows how an error-corrected device of about 1 million qubits could be used to perform relevant calculations. Unfortunately, even if further progress brings that down to 100,000 qubits, this approach still requires FTQC.

It’s no surprise that much work is going into what might be achieved in the NISQ era. Promising techniques pursued in 2019 included the hybrid quantum-classical VQE; error mitigation (e.g. measuring and extrapolating out the effect of errors); and stabiliser techniques (e.g. spotting errors that violate physics in the system being simulated).

Variational Quantum Eigensolver (VQE) is a technique that can be used to estimate molecular ground state energies. Various groups are working to extend this to excited states, opening up applications in predicting optical spectra and reaction rates [49]. Riverlane have extended VQE into a Variational Quantum Deflation algorithm for excited state calculations. CQC have shown how symmetries can be exploited to allow calculations for the CH2 radical. IBM has leveraged simplifications inspired by conventional computational chemistry techniques to execute trial calculations for the LiO2 dimer.
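
To give a flavour of the VQE loop itself, here is a minimal sketch: a classical outer loop tunes an ansatz parameter to minimise the energy expectation of a toy one-qubit Hamiltonian. The Hamiltonian and ansatz are made-up stand-ins for illustration, not a real molecular problem:

```python
import numpy as np

# Toy Hamiltonian in the Pauli basis: H = Z + 0.5 X -- a made-up stand-in
# for a qubit-mapped molecular Hamiltonian.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def energy(theta):
    # One-parameter ansatz: |psi(theta)> = Ry(theta)|0>
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi  # expectation value <psi|H|psi>

# Classical outer loop: a simple grid search stands in for the
# gradient-based optimisers used in practice.
thetas = np.linspace(0, 2 * np.pi, 2001)
vqe = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H).min()  # exact ground-state energy
print(round(vqe, 4), round(exact, 4))  # the two agree
```

On real hardware the expectation value is estimated by repeated measurement rather than computed exactly, which is where NISQ noise and the error mitigation techniques above come in.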

As things stand, commercially useful calculations in quantum chemistry and materials design applications are a possibility but not a certainty in the NISQ era.

Scenario Simulation – a flexible analytic tool

Monte Carlo simulation methods are used frequently in scientific and analytic applications and in particular in the Financial Services industry.

Quantum Amplitude Estimation (QAE) can be used to provide a (quadratic) speedup over traditional Monte Carlo simulation techniques. In 2019 IBM demonstrated how QAE can be used to execute common financial portfolio tasks including pricing a security and calculating the common risk metrics VaR and CVaR [83]. Impressively, the pricing demonstration was run on a real 5-qubit Yorktown processor. While not yet out-performing conventional processing, this experiment does point to specific applications that businesses will readily understand.
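
For intuition on what QAE is speeding up, here is the classical baseline: a plain Monte Carlo pricer for a European call option with made-up parameters. Classically the estimation error shrinks as 1/√N in the number of samples; QAE targets the same estimate with error shrinking as 1/N:

```python
import numpy as np

# Classical Monte Carlo pricing of a European call under a lognormal model.
# Parameters are illustrative assumptions, not taken from the IBM paper.
rng = np.random.default_rng(1)
s0, strike, vol, rate, t = 100.0, 105.0, 0.2, 0.01, 1.0

def mc_price(n):
    z = rng.standard_normal(n)
    s_t = s0 * np.exp((rate - 0.5 * vol**2) * t + vol * np.sqrt(t) * z)
    payoff = np.maximum(s_t - strike, 0.0)
    return np.exp(-rate * t) * payoff.mean()

# Error shrinks roughly 10x for every 100x more samples (1/sqrt(N));
# a QAE version would need only ~10x more quantum samples for the same gain.
for n in (10_000, 1_000_000):
    print(n, round(mc_price(n), 3))
```

The quadratic advantage only pays off once quantum sample rates and circuit depths are competitive, which is why improved NISQ devices are the thing to watch here.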

Again realisable benefits are not yet proven, but they may be accessible with improved NISQ devices.

Bob Sutor (IBM) comments “This is a great time for businesses to get quantum ready. Work with your team to identify challenges in your current business that depend on known bottlenecks like Monte Carlo simulation. Let your team learn how quantum helps and build the know-how to seize quantum advantage as improved hardware becomes available.”

Optimisation – a wide area of opportunity

Optimisation problems are a very general area where quantum algorithms promise significant speedups (though typically quadratic rather than exponential ones). Interest in this area is strong because of the obvious fit with a very wide range of business problems, from logistics to portfolio management, from feedstock planning to managing telecoms networks.

Several well-understood algorithms promise wide applicability (e.g. Amplitude Amplification, including Grover’s algorithm). Unfortunately, these require that the input data be stored in QRAM, which threatens to introduce non-trivial input/output bottlenecks. Little commercial activity has yet addressed the challenge of engineering QRAM; most place it on a timeline similar to FTQC.
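
Grover’s algorithm itself is easy to simulate at small scale. The sketch below runs the oracle/diffusion iteration as plain statevector arithmetic, finding a marked item among 256 in roughly (π/4)·√256 ≈ 12 oracle calls versus ~128 expected classical queries; the QRAM challenge is precisely about loading real-world data into such a superposition:

```python
import numpy as np

# Grover's algorithm as statevector arithmetic: amplitude amplification
# finds a marked item in O(sqrt(N)) oracle calls, versus O(N) classically.
n = 8                    # qubits
N = 2 ** n
marked = 42              # the item the oracle recognises (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition over N items
iterations = int(np.pi / 4 * np.sqrt(N))  # ~12 for N = 256
for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip the marked amplitude
    state = 2 * state.mean() - state      # diffusion: invert about the mean

print(iterations, round(state[marked] ** 2, 3))  # success probability ~1
```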

Again hybrid quantum-classical approaches are being considered for the NISQ era. Promising techniques pursued in 2019 included Quantum Annealing (QA) and the Quantum Approximate Optimisation Algorithm (QAOA).

Quantum Annealing (QA) seeks to solve a particular class of (binary) optimisation problems. The D-Wave System 2000Q uses this approach for optimisation problems and also for many other problems that can be mapped onto this model. Some pilot studies, such as the recent successful field trial by Volkswagen routing buses through Lisbon traffic, look close to routine operational deployment [73]. The success of this approach has spurred competition from quantum-inspired conventional systems such as Fujitsu’s Digital Annealer and Toshiba’s SBM.
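
Annealers minimise problems expressed in QUBO form (minimise xᵀQx over binary vectors x), and the mapping of a business problem onto that form is the key modelling skill. Here is a tiny made-up instance, brute-forced classically as a stand-in for the hardware:

```python
import itertools
import numpy as np

# A tiny made-up QUBO: diagonal terms reward selecting a variable,
# off-diagonal terms penalise selecting adjacent pairs together.
Q = np.array([[-1,  2,  0,  0],
              [ 0, -1,  2,  0],
              [ 0,  0, -1,  2],
              [ 0,  0,  0, -1]], dtype=float)

# Exhaustive search over all 2^4 binary vectors stands in for the annealer,
# which samples low-energy states of exactly this objective.
best = min(itertools.product([0, 1], repeat=4),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(best, np.array(best) @ Q @ np.array(best))  # a minimiser, energy -2
```

Real problems have thousands of variables, where exhaustive search is hopeless and the annealer (or a quantum-inspired digital competitor) earns its keep.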

Quantum Approximate Optimisation Algorithm (QAOA) is a technique designed to tackle (binary) optimisation on gate-model NISQ devices. Recently IBM and Barclays have collaborated to show how QAOA can be generalised to problems that include binary variables, continuous variables and constraints such as inequalities [84]. This potentially opens up a wider range of business problems. The fascinating example identified by Barclays is the back office financial services task of optimising financial settlements processing.
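
A minimal gate-model sketch of plain binary-variable QAOA (not the IBM/Barclays generalisation): depth p=1 QAOA for MaxCut on a triangle graph, simulated as a statevector, with a coarse grid scan over the two angles standing in for the classical optimiser:

```python
import numpy as np

# p=1 QAOA for MaxCut on a triangle (a made-up toy instance).
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
N = 2 ** n

def cut_value(bits):
    return sum(1 for i, j in edges
               if ((bits >> i) & 1) != ((bits >> j) & 1))

cost = np.array([cut_value(b) for b in range(N)], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def mixer(beta):
    # e^{-i beta X} applied to every qubit
    u = np.cos(beta) * np.eye(2) - 1j * np.sin(beta) * X
    full = np.array([[1.0]], dtype=complex)
    for _ in range(n):
        full = np.kron(full, u)
    return full

def expected_cut(gamma, beta):
    psi = np.full(N, 1 / np.sqrt(N), dtype=complex)  # |+>^n
    psi = np.exp(-1j * gamma * cost) * psi           # cost layer (diagonal)
    psi = mixer(beta) @ psi                          # mixing layer
    return float(np.real(np.abs(psi) ** 2 @ cost))

# Coarse grid scan stands in for the classical outer-loop optimiser.
best = max(((g, b) for g in np.linspace(0, np.pi, 40)
                   for b in np.linspace(0, np.pi, 40)),
           key=lambda ang: expected_cut(*ang))
print(round(expected_cut(*best), 3))  # max cut of a triangle is 2
```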

Clients already seem to be benefiting just from the challenge of thinking about their optimisation problems in new ways. Whether we will see provable quantum hardware advantage in the NISQ era is again not a certainty.

Machine learning – hot but unproven

Quantum machine learning is considered by many to be a major opportunity area. However again, the most often cited algorithms (e.g. HHL) require the availability of QRAM and so are probably out of immediate reach.

The opportunities currently most discussed for NISQ era applications are hybrid quantum-classical training, particularly where it might be beneficial to recognise uniquely quantum correlations in datasets. Where might such datasets be found? Intriguingly this may be just the challenge we face when working with sampled output from variational algorithms such as VQE, or in controlling and interpreting the output of quantum sensors.

In 2019, Rigetti have emphasised their strategy of providing an execution environment specially tailored for iterative execution of hybrid quantum-classical variational algorithms on near term devices.

Variational Quantum Linear Solver (VQLS) is a new algorithm for solving linear systems (a common task in machine learning) developed by members of the Rigetti community.

Rigetti have also been working with reinforcement learning to train an AI to create quantum programmes for other applications such as combinatorial optimisation. Trials with an Aspen quantum processor have demonstrated variational quantum programmes that are shorter and more robust to noise.

There are many exciting ideas in quantum machine learning. However, the extent to which value can be delivered by NISQ devices is not proven.

Quantum winter

FTQC certainly promises remarkable business and societal value. Some quantum-inspired value is already being seen by early adopters now. At least some early quantum applications seem set to follow. However there remains a danger that in the context of the hype the actual bottom-line value delivered in the NISQ era will disappoint.

This could be a long wait. Some have speculated that the sector will have to endure a quantum winter while early investors wait to see returns. In the absence of progress such a chill could suddenly make it difficult for emerging players to secure seed and follow-on investment.

To watch in 2020

Provably Random Numbers – Google has bought the IP rights for Aaronson’s (still unpublished) protocol on a non-exclusive basis. Will Google use one of its new quantum devices to launch a remotely certifiable public randomness service? Unlike the NIST randomness beacon, we wouldn’t have to trust the operator.

Proof-of-quantum blockchain – Will someone work up Provable Random Numbers into a fully-fledged proof-of-stake blockchain proposal? Cryptocurrencies are a potential rival to sovereign fiat currencies and people of good conscience differ markedly on whether that is a good or a bad thing. Expect controversy.

Hybrid quantum-classical heuristics – Variational techniques such as VQE, QAE and QAOA show promise. Expect to see algorithms being demonstrated on real devices for increasingly non-trivial cases. Watch out for specific indications of the scale of calculation and required NISQ device specs to address commercially relevant problems.

Quantum annealing – D-Wave’s strategy has a characteristic focus on user-led trials, now standing at 200+, with several now near production-ready: traffic routing optimisation (Volkswagen), factory automation routing (Denso), hotel recommendations (Recruit Group). D-Wave Leap is in pole position to be the first quantum cloud service to host a quantum algorithm in routine operational business use.

Quantum machine learning – Like conventional machine learning this field will likely proceed very much through trial and error. Watch out for experimentation with increasingly powerful early processors and environments making a difference.

Quantum memory – This key part of the quantum hardware architecture has so far seen little open commercial activity. Startups Bra-Ket and ColdQuanta have been associated with aspirations in this area. Could we see an announcement that puts this, and the algorithms that depend on it, on a firm roadmap?

Computational complexity frontier – This remains a notoriously difficult area. Recent claims of a proof of the long-standing Aaronson-Ambainis conjecture, that [exponential] quantum speedups require structure, have been withdrawn. Research into computational complexity remains very active. Any big breakthrough on a foundational question such as BQP vs NP-complete, BQP vs NP∩co-NP, or even P vs NP itself might baffle non-experts but have big implications for the quantum sector.


David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn
