Quantum software – over-hyped but underestimated

The advantages a quantum computer can offer depend on the quantum algorithms it can run, and also on how easily we can get data in and out of the system. Today’s hardware is still well short of the universal quantum dream and, in truth, the most often discussed benefits are still many years away. However, the promise remains revolutionary and commercial activity is already gathering pace across the many types of software that will be required to run these machines.

It has become common for quantum computing startups to describe themselves as ‘full stack’ quantum computing companies. This means that in addition to the hardware, they are also working to develop the software layers required to execute a program end-to-end:

Applications – Programs that use one or more quantum algorithms to implement the required calculation. Often grouped by the nature of the underlying computation being performed: Cryptography, Simulation, Optimisation or Machine Learning.

Languages & Libraries – Programming languages, optimising compilers and support libraries that make it easier to implement standard quantum algorithms.

Error Correction – ultimately, to move beyond the NISQ era to that of FTQC, real devices must implement error correcting protocols. These ‘codes’ are mathematical in nature and form a unique feature of the quantum software stack.

Device Control – optimised signal routines that directly control the quantum device. This is likely to be a key driver of high fidelity quantum operation for the foreseeable future.

Richard Jozsa (Univ. of Cambridge), Steve Brierley (Riverlane), Noah Linden (Univ. of Bristol), Ashley Montanaro (Univ. of Bristol)

The recent, inaugural Quantum Computing Theory in Practice (QCTIP) conference, attracting leading speakers from around the world, was an opportunity to understand the state of the art across these areas [49]. It takes more than just one discipline to drive this field forward; Ashley Montanaro of the organising committee was particularly pleased to point to the “even mix of physicists, mathematicians, computer scientists and innovators” among the conference’s 150 delegates. The strong US quantum information community was represented in force.

Iordanis Kerenidis has memorably described quantum machine learning as “over-hyped but underestimated”. Fact Based Insight thinks that this sentiment could extend across the full quantum software stack.

Applications

Cryptography

Eleni Diamanti, CNRS

The ability of future quantum computers to break many of the security protocols on which the Internet currently depends is (unfortunately) the first application of quantum computers that most people hear about.

The Quantum Fourier Transform is a key underlying quantum primitive, and is used in Shor’s algorithm to solve a special case of the hidden subgroup problem. This allows (almost) exponentially faster solutions of integer factorisation and discrete logarithm problems. Current public key cryptographic protocols are based on the assumed hardness of these problems.
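
A helpful way to demystify this primitive: mathematically, the QFT is just the discrete Fourier transform applied to the 2^n amplitudes of an n-qubit state vector; the quantum advantage comes from acting on exponentially many amplitudes at once. A minimal numpy sketch, simulating the state vector classically for illustration only:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n_qubits."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

# Apply the QFT to a 3-qubit computational basis state |101>.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[5] = 1.0
qft_state = qft_matrix(n) @ state

# Cross-check: up to sign and normalisation conventions, this is the
# classical DFT of the amplitude vector.
assert np.allclose(qft_state, np.fft.ifft(state) * np.sqrt(2 ** n))
```

Of course, a real quantum computer never holds the 2^n amplitudes explicitly – the circuit implements this transform with only O(n²) gates, which is the source of the speedup exploited by Shor’s algorithm.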

Hartmut Neven (Google) told QCTIP about Google’s latest estimate of what a ‘realistic’ machine might achieve: an RSA 2048-bit key could be broken in 7 hours using a 23 million superconducting qubit machine. Fortunately, this is still a long way beyond the capabilities of any current device. The world can easily adapt to this challenge as long as it adopts quantum safe cryptography in good time.

NISQ cryptography?

Google clearly believes it is now close to the more immediate goal of demonstrating quantum supremacy. For some time it had seemed that though this would be a milestone scientific achievement, the obscure nature of the random quantum circuits problem being used to demonstrate supremacy would be of no commercial relevance.

Remarkably Google stands ready to crown their achievement by turning it into a genuine (though modest) commercial service: the provision of certifiable public random numbers. This is a success at many levels and getting Scott Aaronson’s stamp of approval on the new offering is the quantum information equivalent of getting the Pope (or at least a high ranking cardinal) to officiate at the christening.

Simulation

The founding conception for quantum computers was the desire to be able to efficiently simulate physical systems that rely on the laws of quantum mechanics (e.g. in chemistry, materials science or fundamental physics).

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”  Richard Feynman, 1981

Hamiltonian simulation is a natural quantum primitive. In quantum mechanics, the Hamiltonian describes the physical properties of the system (cf. the equations of motion in classical physics). It’s widely believed (though not strictly proven) that if you are trying to simulate a physical system that naturally incorporates the exponential complexity of superposition and entanglement, then your computer hardware had better also have those resources.

Andrew Childs, Univ. of Maryland

Andrew Childs (QuICS) took QCTIP through the formal quantum algorithms that are set to drive this area in the long term. A range of techniques such as Product Formulas (Trotter Steps), Taylor Series and Quantum Signal Processing (Qubitization) are increasingly well understood and promise ground-breaking results. The commercial impact could be profound in industrial sectors built on research in materials science, or where chemistry is key, such as pharmaceuticals.
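
Product formulas are the easiest of these techniques to illustrate. The idea is to approximate evolution under H = A + B by alternating many short evolutions under A and B separately. A minimal sketch with a toy two-term, single-qubit Hamiltonian (the matrices and times are illustrative assumptions, far from a real simulation target):

```python
import numpy as np
from scipy.linalg import expm

# Two non-commuting Hamiltonian terms (Pauli X and Z, as a toy example).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = X + Z
t = 1.0

exact = expm(-1j * H * t)           # exact time evolution e^{-iHt}

# First-order Trotter: approximate e^{-i(X+Z)t} by r alternating steps.
for r in (1, 10, 100):
    step = expm(-1j * X * t / r) @ expm(-1j * Z * t / r)
    trotter = np.linalg.matrix_power(step, r)
    err = np.linalg.norm(trotter - exact, 2)
    print(f"r = {r:4d}  ->  error = {err:.2e}")   # shrinks as O(t^2/r)
```

On a quantum computer each Trotter step becomes a short block of gates, so the same trade-off appears as circuit depth versus accuracy.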

Ryan Babbush, Google

As an example, take FeMoco (Fe7MoS9C) the active site within the Nitrogenase enzyme responsible in nature for the catalytic conversion of nitrogen gas into ammonia (fertilizer).  If we could better understand FeMoco then potentially we could improve the energy intensive Haber process used for the industrial production of fertilizer. This would not just be a commercial breakthrough, but also one with very positive implications for feeding the planet and limiting climate change. Ryan Babbush (Google) took QCTIP through a detailed scheme for modelling FeMoco on a scaled-up quantum computer.  The key bottleneck step in this algorithm (creating magic states – see below) would require a 1 million qubit device running for one month.

Such a ‘small’ fault tolerant quantum computer should be available in advance of the ‘large’ fault tolerant quantum computers often referenced in popular discussion. However, even such devices are still a long way off. Babbush makes reasonable assumptions about the target specs of future machines, but the assumed gate fidelities are above those yet achieved even with small superconducting qubit devices, while the assumed gate speeds are far faster than those achieved with trapped ion devices. Fabricating devices, and the control environments they need, at this scale is a challenge the field is only beginning to explore.

NISQ simulation?

Steve Brierley, CEO of Riverlane, likes to joke “quantum software can be a bit like rushing out to buy the latest video game, but without checking the packaging to see what specs are required to run it”. Riverlane are focussed on developing quantum algorithms for new materials, catalysts and ultimately pharmaceuticals discovery. At QCTIP they and others presented work on algorithms tailored to what might be achieved on the much more modest NISQ era machines. Promising techniques include hybrid classical-quantum approaches such as the Variational Quantum Eigensolver (VQE); extensions of VQE for excited states; error mitigation (e.g. measuring and extrapolating out the effect of errors); and stabiliser techniques (e.g. spotting errors that violate the physics of the system being simulated).
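
In outline, VQE is a simple loop: a classical optimiser tunes the parameters of a short quantum circuit so as to minimise the measured energy of the resulting state. A minimal sketch with the ‘quantum’ part simulated classically (the toy Hamiltonian and one-parameter ansatz below are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize

# Toy one-qubit Hamiltonian; a real VQE would target a molecular Hamiltonian.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.5 * Z + 0.3 * X

def ansatz(theta):
    """One-parameter trial state R_y(theta)|0>. On hardware this would be
    a parameterised circuit run on the NISQ device."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    psi = ansatz(params[0])
    return np.real(psi.conj() @ H @ psi)   # expectation value <psi|H|psi>

# The classical optimiser drives repeated (quantum) energy evaluations.
result = minimize(energy, x0=[0.1], method="COBYLA")
print("VQE energy:", result.fun, " exact:", np.linalg.eigvalsh(H)[0])
```

The appeal for NISQ machines is that each circuit is shallow; the noise burden is shifted onto the number of repetitions the classical loop demands.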

These computational methods are not the only possible avenues for commercially relevant progress in the NISQ era. The presence of Anthony Laing (Univ. of Bristol) at QCTIP and the presentation by Juani Bermejo-Vega (Freie Univ. Berlin) remind us of the promise of analogue quantum simulators. This area has received less popular attention (perhaps because it is further from the traditional computing paradigm) but real devices have already produced striking results [25].

Optimisation

Real world decisions often entail trading off projected benefits and costs across complicated data and constraints. Mathematically these are known as optimisation problems. This type of problem is ubiquitous in business:

  • sales & operations planning, detailed scheduling
  • maintenance scheduling, dispatching & routing
  • financial portfolio management, trade settlement, risk management and pricing
  • network demand response
  • inventory & supply chain management
  • campaign management & ad placement

Quantum computing looks set to make a major contribution here, though the outlook is nuanced.

Amplitude Amplification, of which Grover’s search algorithm is one variation, is a key underlying quantum primitive. It offers a quadratic speedup over conventional algorithms with wide potential applications.
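
A statevector sketch makes the quadratic speedup concrete: roughly (π/4)√N oracle calls suffice to find one marked item among N unstructured possibilities, versus ~N/2 classical queries (illustrative classical simulation only):

```python
import numpy as np

# Statevector sketch of Grover search: N = 2^n items, one marked index.
n, marked = 5, 13
N = 2 ** n
state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~O(sqrt(N)) steps
for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip marked amplitude
    mean = state.mean()
    state = 2 * mean - state              # diffusion: inversion about mean

print(f"{iterations} iterations; P(marked) = {abs(state[marked])**2:.3f}")
```

Running this gives a success probability close to 1 after just 4 iterations for N = 32, where a classical search would expect to check 16 items.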

Ronald de Wolf & Joran van Apeldoorn (CWI)

Ronald de Wolf took QCTIP through the main algorithms that quantum computing brings to optimisation problems. Techniques address both discrete optimisation problems (shortest path, max-flow in a network, satisfiability) and continuous optimisation problems (gradient descent, linear programming and its generalisation, semidefinite programming).

The Brandão-Svore algorithm for solving semidefinite programs is a good example of speedups that can be achieved by exploiting the correspondence of an abstract mathematical problem to the natural maths of quantum Gibbs states.

Simon Benjamin, Oxford Univ.

Overall the speedups we currently understand are at best quadratic (rather than exponential). However, given the wide commercial applicability of these problems this potential advantage is of major significance. We are even able to offer some speedup for some NP-hard discrete optimisation problems such as the 3-SAT variation of the famous ‘satisfiability’ problem. Continuous optimisation is also notable for its importance in many machine learning applications. Work continues to identify further speedups.

There is however an important catch to implementing these techniques. The data being processed is assumed to be stored in QRAM. Conventional data is written into this structure and can then be retrieved and queried in quantum superposition. The problem is implementing such a structure on noisy hardware without introducing a processing overhead that would counteract the quantum advantage being sought. Relatively little experimental work has been done on QRAM and most researchers assume that it will not be available for any significant scale of data in the NISQ era.

NISQ optimisation?

An immediate opportunity in the NISQ era is to seek particular problems that don’t require lots of conventional data as an input (e.g. the required data can be specified as a formula, or is a quantum sub-routine of a larger problem).

Again hybrid classical-quantum approaches are being investigated. The Quantum Approximate Optimisation Algorithm (QAOA) is a leading example (emphasised for example by QC Ware at the recent Q2B conference). Speedups for such heuristic approaches are unproven and need investigation by trial and error.
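
In outline, a QAOA layer alternates a phase separation step (encoding the cost function) with a mixing step, and a classical optimiser tunes the two angles. A minimal depth-one sketch for MaxCut on a toy three-node graph, simulated classically (the graph, starting parameters and optimiser choice are illustrative assumptions):

```python
import numpy as np
from itertools import product
from scipy.optimize import minimize

edges = [(0, 1), (1, 2), (0, 2)]          # MaxCut on a triangle (toy problem)
n = 3

# Diagonal cost: number of cut edges for each of the 2^n bitstrings.
cost = np.array([sum(b[i] != b[j] for i, j in edges)
                 for b in product([0, 1], repeat=n)], dtype=float)

X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

def mixer(beta):
    """e^{-i beta X} on every qubit, built as a Kronecker product."""
    rx = np.cos(beta) * I - 1j * np.sin(beta) * X
    U = rx
    for _ in range(n - 1):
        U = np.kron(U, rx)
    return U

def neg_expected_cut(params):
    gamma, beta = params
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
    psi = np.exp(-1j * gamma * cost) * psi     # phase-separation layer
    psi = mixer(beta) @ psi                    # mixing layer
    return -(np.abs(psi) ** 2 @ cost)          # minimise => maximise cut

res = minimize(neg_expected_cut, x0=[0.4, 0.4], method="COBYLA")
print("expected cut:", -res.fun, " best possible:", cost.max())
```

Whether such shallow layers outperform good classical heuristics on problems of commercial size is exactly the open empirical question.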

Another example of a potential heuristic approach is Quantum Annealing, either in algorithmic form or as implemented by a dedicated device such as those made by D-Wave Systems. Theoretical speedups are unproven, but hardware continues to improve and early adopters are learning useful lessons on how to approach their business optimisation problems in new ways.

Machine Learning

Aram Harrow, MIT

Conventionally powered AI is already a major driver of today’s tech industry. It is no surprise that the quantum majors have sought to emphasise the potential of quantum computing and in particular quantum machine learning to power this area.

The underlying toolbox for quantum machine learning includes the Harrow-Hassidim-Lloyd (HHL) algorithm for solving linear equations, Quantum Distance Estimation for data clustering and Quantum Tomography for efficiently extracting classical data from a quantum state.

Aram Harrow (MIT) also spoke at QCTIP. HHL is a good example of the tantalising promise and challenge of quantum algorithms. In certain cases it promises an exponential speedup for solving the common problem Ax=b (given a matrix A and a vector b, find the solution vector x). However, not only must the input data be in a quantum state, the ‘solution’ is output not as a conventional list of numbers but as a quantum state.
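
A toy example makes the readout caveat concrete. What HHL prepares is the normalised solution vector encoded in quantum amplitudes; reading out every entry would destroy the speedup, so only global properties such as expectation values are efficiently accessible (classical arithmetic below, for illustration only):

```python
import numpy as np

# The linear system HHL targets: A x = b, with A Hermitian (toy 2x2 here).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])

x = np.linalg.solve(A, b)                  # classical solution vector

# HHL does not output x itself: it prepares the quantum state |x>,
# whose amplitudes are x / ||x||.  Only global properties of x, such
# as an expectation value <x|M|x>, can be read out efficiently.
x_state = x / np.linalg.norm(x)
M = np.array([[1.0, 0.0], [0.0, -1.0]])    # some observable of interest
print("amplitudes of |x>:", x_state)
print("efficiently readable quantity <x|M|x>:", x_state @ M @ x_state)
```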

Iordanis Kerenidis, QC Ware

Iordanis Kerenidis (PCQC & QC Ware) took QCTIP through the main algorithms that quantum computing brings to machine learning. The Recommendation System of Kerenidis and Prakash was the leading example of an algorithm that offered an exponential speedup over the best known conventional algorithm for a real world machine learning problem. The field took a hit in 2018 when Ewin Tang discovered an improved conventional algorithm that removed this formal advantage. However, Kerenidis points out that for many practical problem sizes the quantum algorithm is still set to be much faster in operation (scaling as 10³ rather than 10³³, at least in the theoretical worst case). Other interesting application examples include handwritten digit (MNIST) classification, Quantum Neural and Tensor Networks, and Q-Means clustering.

Again, a major catch is that these algorithms require QRAM. Kerenidis emphasises the positive “QRAM is no more difficult to build than a quantum computer”. This may be true, but relatively little experimental effort has yet explored this problem.

NISQ machine learning?

Hartmut Neven, Google

Against this background, Hartmut Neven’s keynote address to QCTIP faced the challenge of pointing to where we could look for early wins from quantum machine learning. In contrast to the wide survey talks and meticulous presentations of specific results that had preceded him, Neven sought to motivate areas where the field can search for opportunities. Interesting areas include:

Training formulated as a hard optimisation problem: coherent quantum resonances potentially provide a mechanism to avoid the trap of local minima. This might avoid the need to pre-clean training data sets. This is supported by work suggesting that there is indeed a region where quantum annealing algorithms can outperform classical annealing [46] (though many problems small enough to be addressed by NISQ hardware may also be solvable by brute force).

Quantum Adversarial Networks: the challenge of avoiding ‘false positives’ from trained neural networks is of increasing relevance, particularly when one considers applications that need to resist attacks from ‘bad actors’. Potentially the unique sample generating properties of quantum devices can assist in training networks against such attacks.

Quantum Neural Networks: rather than just accelerating the training of a conventional neural network, where could a Quantum Neural Network have an advantage?  This could be the case if we are trying to discover a correlation that cannot be described by conventional probability theory. We know nature has a quantum character, are there practical datasets where this might matter?  Neven flags that the output of quantum sensors could be an example where post-processing by natively quantum networks has application.

In Quantum Computing Since Democritus [19], Aaronson emphasises a subtle and profound point. If we want laws of nature that go beyond conventional probability theory (where probabilities must sum to 100%), the only other mathematically consistent system turns out to be quantum mechanics (where the squared magnitudes of probability amplitudes must sum to 100%). Quantum mechanics is thus the inevitable mathematical framework to which we are forced.

Despite the absence of mathematical proof, both Neven and Kerenidis convey an optimism that in the end quantum machine learning will be relevant in the NISQ era. This is built on an understanding that conventional machine learning is itself an empirical science – we don’t have algorithms that provably work, instead we proceed by trial and error. Perhaps it’s reasonable to proceed in the same way for quantum machine learning.

Languages & Libraries

Developing quantum software is hard. Having the most productive coding languages and support libraries to streamline the process is of rapidly increasing importance. Winning the battle for influence and control over the standards in this area is set to confer a long term commercial advantage.

IBM have enjoyed leadership in this area due to the pioneering success of the IBM Q Experience, and it has been noticeable at events [44, 48, 49] that presenters regularly tag the compatibility of their code with OpenQASM or forthcoming inclusion in Qiskit releases.

With plans for a major new ‘quantum data centre’, and putting emphasis on its NISQ friendly Cirq libraries, Google is set to drive a major shake-up of this segment of the field. The promotional videos shown at QCTIP indicate a drive to capture the ‘educational’ part of the market – those taking their first steps in quantum software.

Microsoft also presented at the QCTIP industry session. They emphasise Q# as a custom designed high level language and the benefits of integration within the well-established Visual Studio development environment. This seems in keeping with Microsoft’s ‘long timescale’ approach to the quantum sector: its topological qubits lag behind other technologies now, but could be revolutionary in time. It is also on these longer-term horizons that the benefits of high level languages are likely to shine.

Xanadu are a startup focussed on developing continuous variable photonic quantum computing. They took the opportunity at BQIT [48] to emphasise their Strawberry Fields platform and also PennyLane, a QML library that takes its inspiration from PyTorch, the popular platform for conventional deep learning.

Error Correction

By definition, NISQ hardware is noisy. To move beyond this era and achieve the fault tolerant quantum computation (FTQC) required for larger calculations, either radically better hardware is required or error correcting protocols must be implemented. Key considerations include the fidelity threshold above which error correction can successfully operate and how to ensure a universal set of quantum operations can still be implemented.

Threshold

A key trigger for the explosion in quantum computing activity over the last two years has been the fact that the fidelity achieved by leading qubit technologies (at least in small experiments) has exceeded the 99% threshold requirement of a popular family of error correcting protocols, the Surface Code.

Better codes with a lower threshold requirement are highly desirable. The more we can operate above threshold, the smaller the number of physical qubits required to form a single logical qubit.
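
A back-of-envelope calculation illustrates why this matters so much. Taking commonly quoted rules of thumb for the Surface Code – a logical error rate falling as roughly 0.1(p/p_th)^((d+1)/2) at code distance d, and roughly 2d² physical qubits per distance-d logical qubit – we can see how the overhead collapses as physical error rates improve (illustrative constants, not exact engineering figures):

```python
# Rough surface-code overhead estimate (illustrative constants only):
# logical error rate ~ 0.1 * (p / p_th)^((d+1)/2), ~2*d^2 physical qubits.
P_TH = 1e-2          # ~1% threshold commonly quoted for the surface code
TARGET = 1e-12       # logical error rate needed for a long algorithm

def qubits_per_logical(p_phys):
    d = 3
    while 0.1 * (p_phys / P_TH) ** ((d + 1) / 2) > TARGET:
        d += 2       # code distance stays odd
    return d, 2 * d * d

for p in (5e-3, 1e-3, 1e-4):
    d, q = qubits_per_logical(p)
    print(f"p = {p:.0e}: distance {d}, ~{q} physical qubits per logical qubit")
```

With physical error rates only just below threshold, thousands of physical qubits are needed per logical qubit; an order of magnitude of extra fidelity headroom cuts that dramatically.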

Naomi Nickerson, PsiQuantum

Naomi Nickerson presented to QCTIP the latest ideas from the PsiQuantum Fault Tolerance team.  We are used to thinking of one of the attractive features of the Surface Code as being the convenient 2D geometry of its implementation. Nickerson points out that it’s more accurate to think of this as a 3D fault tolerant channel, where ‘time’ plays the role of the third dimension. Equivalent structures can be defined using more complex 3D unit cells, some of which appear to hold the promise of better error thresholds.

PsiQuantum is a quantum computing startup focussing on silicon photonic technology. In contrast to the circuit based model of quantum computing typically assumed by superconducting qubit or trapped ion devices, or the continuous variable photonic approach of Xanadu, PsiQuantum are developing all-photonic measurement based quantum computing. The basic resources of this approach are highly entangled cluster states, which can be manipulated to perform the equivalent of qubit gate operations. PsiQuantum has been renowned for its secrecy but is now rumoured to be in the process of raising its total funding to $130m, which would put it very much in the big league of quantum computing startups. Academic peers speak well of the high quality of the team that has been assembled. We can expect to see more as PsiQuantum continues to emerge from the shadows.

Magic States

The popular Surface Code protocols (and most similar approaches) come with the significant downside that they don’t allow direct implementation of the full set of quantum gates required to provide an advantage over conventional computers.

The mathematical explanation of this is that the Gottesman-Knill theorem tells us that quantum gates from the Clifford Group (such as those stabilised by the Surface Code) can be efficiently simulated on a classical computer. For a more physical intuition regarding what is going on, consider that the ‘stabilised’ states only give us access to a finite set of qubit settings. With nothing else added, this limits the complexity that can be built-up in the quantum calculation.

Earl Campbell, Univ. of Sheffield

The leading proposed solution for implementing the missing operations, completing the set of available gates, is to separately prepare special qubit states and inject them into the system. These whimsically named ‘magic states’ are perfectly feasible, but they are required in very large numbers.

The exact number of magic states required depends on the overall algorithm to be computed and the specific gates it requires, but they can quickly come to dominate the total qubit count of the system. In the future we may see dedicated hardware for this problem; however, recent work has emphasised the algorithmic aspects of producing (noisy) magic states and then ‘distilling’ them to the required level of fidelity.
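
The scale of the issue can be sketched with the standard 15-to-1 protocol of Bravyi and Kitaev: each distillation round consumes 15 noisy T states and outputs one whose error is roughly 35p³. A rough model (ignoring the qubit and time cost of the factory itself):

```python
# 15-to-1 magic state distillation (Bravyi-Kitaev): each round consumes
# 15 noisy T states and outputs one with error ~35 * p^3.
def distill(p_in, target=1e-12):
    p, rounds = p_in, 0
    while p > target:
        p = 35 * p ** 3
        rounds += 1
    return rounds, 15 ** rounds    # raw states consumed per output state

for p in (1e-2, 1e-3):
    rounds, raw = distill(p)
    print(f"p = {p:.0e}: {rounds} round(s), {raw} raw states per magic state")
```

Starting from 1% error, three rounds and over 3,000 raw states are needed per distilled state; better input fidelity removes a whole round, which is why factory optimisation attracts so much attention.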

Daniel Litinski, Freie Univ. Berlin

At QCTIP, Daniel Litinski (Freie Univ. Berlin) presented an approach to implementing Surface Code schemes that is evocative of how we can expect to see the complex underlying maths abstracted and simplified into the standard operations of a software layer (Litinski uses the term ‘game’). A feature of this approach is that it allows us to understand the trade-off between qubit count and cycle time for magic state creation.

It is indicative of the importance of this area that when Ryan Babbush was presenting his work on FeMoco simulation, a significant part of the discussion focused on the optimisation of the required magic state factory (the approach based on CCZ states he proposes is 4× faster and requires 3× fewer qubits).

Device Control

On-chip microarchitecture, control chipsets and firmware are an often forgotten part of the conventional computing stack. Given the imperative of coaxing the highest possible levels of fidelity from the quantum hardware, Fact Based Insight believes that these will be a much more important part of the quantum computing landscape, now and for the foreseeable future.

Speaking at QCTIP, Marcus da Silva (Rigetti) chose to emphasise the work Rigetti have done in understanding how to mitigate electronic control noise when working with superconducting qubits. By using a parametric approach to driving their qubits, they are able to implement fast gates without undermining qubit coherence lifetimes.

Joel Wallman, Quantum Benchmark

Joel Wallman (Quantum Benchmark) underlined the challenge of understanding noise and error characteristics in real devices, particularly the subtle effects of crosstalk between qubit control signals. Randomized Benchmarking is a technique for characterising noise in a scalable way. Error mitigating control techniques that can then be applied include spacing gate operations to minimise crosstalk and Randomized Compiling (alternating approximately equivalent circuits to reduce coherent noise effects).
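
At its core, Randomized Benchmarking fits the average survival probability of random Clifford sequences of length m to a decay F(m) = A·pᵐ + B; because state preparation and measurement (SPAM) errors are absorbed into A and B, the fitted p yields a SPAM-insensitive average gate error. A minimal sketch on synthetic data (the decay constant and noise level are invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Randomized benchmarking fits survival probability of random Clifford
# sequences of length m to F(m) = A * p^m + B.
def rb_decay(m, A, p, B):
    return A * p ** m + B

rng = np.random.default_rng(0)
lengths = np.array([1, 5, 10, 20, 50, 100, 200])
true_p = 0.98                                  # synthetic 'device' decay
data = rb_decay(lengths, 0.5, true_p, 0.5) + rng.normal(0, 0.005, lengths.size)

(A, p, B), _ = curve_fit(rb_decay, lengths, data, p0=[0.5, 0.95, 0.5])
avg_error = (1 - p) / 2                        # single-qubit Clifford error
print(f"fitted p = {p:.4f}, average error per Clifford ~ {avg_error:.2e}")
```

The scalability comes from the fact that only sequence lengths and average survival rates are needed, not full tomography of the device.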

Q-CTRL are another company active in this space. At the recent APS March Meeting [47] they unveiled their latest control-engineering inspired ‘firmware’ for qubit optimisation. Q-CTRL are part of the IBM Q network, and together with IBM have published the IBM OpenPulse specification that extends the capabilities of Qiskit in the device control layer.

How these functions will ultimately be spread across cryogenic control electronics, firmware or clever compiler optimisation is still not clear. Whatever the implementation, the IP leverage in making these systems work will be very valuable!

Actions for Business

Businesses unsure of how quantum computing will change their industry:

  • Do we understand the mapping between potential quantum computing applications and business operations along our value chain? Are the impacts likely to be incremental or disruptive?
  • For the specific types of problem in my sector, are the potential applications of quantum computing likely to be for the long term, or might progress be possible in the NISQ era?
  • What are my competitors doing? How and on what timescale do investors expect us to respond?
  • Do I have the right internal staff skills and external partners to seize these opportunities and respond to new threats in good time?
  • Do I need to take action now to future-proof any of my long term projects?

(For an innovative approach to these questions, check out how leading companies such as Airbus are engaging with the quantum software sector.)

Investors seeking opportunities in quantum software:

  • There is no guarantee that significant commercial applications will emerge in the NISQ era. Investors need to be ready to wait a long time for returns.
  • There is no guarantee that applications will not emerge in the NISQ era! Executive teams need to be ready to adapt to emerging opportunities or risk being eclipsed by more nimble rivals.
  • Investors need levers to help them manage expert executive teams. Encouraging continued dialogue and constructive criticism from the academic community is one tool; promoting engagement with early real-world commercial opportunities is another.
  • This stuff isn’t easy. Perhaps more than any other sector, the expertise of extremely bright individuals will be at an absolute premium. Business and financial models need to reflect this.

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn

Comments

David Shaw:

    Jiang et al. of Tsinghua University, Beijing have recently announced the ‘experimental realization of 105-qubit random access quantum memory’ [50]. This isn’t the same as the QRAM referenced in this article, but it remains a very striking result.
