The central role of quantum software

Quantum thought leader Scott Aaronson reminds us that the challenge of achieving quantum advantage depends not just on breakthrough quantum hardware but also on mastering revolutionary new quantum software.

Scott Aaronson is a professor of Computer Science at the University of Texas at Austin, and the founding director of its Quantum Information Center. An expert in computational complexity theory, Aaronson is a world-respected quantum thought leader, perhaps best known for his popular book ‘Quantum Computing Since Democritus’. In a recent wide-ranging interview with Y Combinator (the US seed funding specialist), Aaronson covered key themes in quantum information processing, illustrating both the challenge and the short and long term importance of this work in realising practical benefits from quantum computers.

How quantum computers really work

Popular accounts of quantum computing often focus on the ‘spooky’ property of quantum superposition allowing us to ‘compute all answers in parallel’. However, this is at best a gross simplification. Worse, it can actively mislead us into thinking that quantum software is straightforward to realise. Yes, the quantum computer computes all paths in parallel, but when you take a readout just one alternative is randomly returned (the wave function collapses and the cat is simply either dead or alive). This would be easy to programme, but would almost never be useful. Aaronson gives an eloquent (and accessible) explanation of the key physical principle that actually unlocks the potential of these systems: superimposed quantum states interfere with each other (like ripples creating patterns on the surface of a pond). By careful design – of what we call a quantum algorithm – we can control the interference pattern and so affect the probability of the ultimate readout. A successful calculation results when we can manipulate the system so that the solution we are seeking is output with high probability.
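The interference mechanism Aaronson describes can be sketched in a few lines of plain Python. This is an illustrative toy, not a real quantum SDK: applying a Hadamard-style operation twice to a single qubit makes the amplitudes for the |1⟩ outcome cancel, so the readout returns |0⟩ with certainty rather than a 50/50 coin flip.

```python
import math

# Toy model of a single qubit: a pair of amplitudes (a0, a1) for |0> and |1>.
# Probabilities come from the squared magnitude of each amplitude.

def hadamard(state):
    """Hadamard-style operation: mixes the two amplitudes."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

state = (1.0, 0.0)       # start in |0>
state = hadamard(state)  # equal superposition: measuring now gives 50/50
state = hadamard(state)  # the two paths to |1> cancel (destructive interference)

probs = [abs(a) ** 2 for a in state]
print(probs)  # probability is concentrated back on |0>
```

The point of the sketch is the second Hadamard: the two computational paths leading to |1⟩ carry opposite signs and cancel, while the paths to |0⟩ reinforce. Designing a quantum algorithm is, in essence, arranging this kind of cancellation at scale so the desired answer dominates the readout.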

Fact Based Insight believes that this distinction is worth understanding. Many businesses are beginning to see the great opportunity that quantum computing brings. However, there is a danger of focussing on just the hardware without understanding that it is this unique new software that unlocks applications. Quantum algorithm and wider quantum software skills are, and will remain, a scarce commodity, and companies wanting to gain early benefits will first need to build these capabilities within their teams and partner networks.

A near term application for Google’s random quantum circuits

While many people are excited that we may soon see a demonstration of quantum supremacy, the rather abstract nature of the random quantum circuits demonstration problem proposed by Google has seemed uninspiring to some. Fortunately Aaronson is working on adapting this into a genuinely useful real world application – a verifiable public source of random numbers.

Random numbers are important. Almost all forms of cryptography consume random numbers as an input (though what we often use today are pseudo-random numbers generated by software). Gambling machines require random inputs, otherwise casino owners can face attack by maths-minded criminals. Some applications specifically require ‘public’ random numbers – a number that multiple parties can all have confidence was truly randomly generated. Applications could include national service or jury selection, or selecting election returns to audit. Perhaps even more interestingly, public random numbers could be used in ‘proof of stake’ cryptocurrency schemes. These seek to replace the inefficiencies (e.g. excessive mining energy consumption) that many see as a drawback in conventional ‘proof of work’ cryptocurrencies such as Bitcoin.
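The distinction between pseudo-random and truly random is worth making concrete. A software generator is deterministic: anyone who knows the seed can reproduce the entire “random” sequence, as this short sketch using Python’s standard library shows.

```python
import random

# Two pseudo-random generators started from the same seed produce
# exactly the same sequence - there is no genuine randomness here,
# only unpredictability for someone who does not know the seed.
rng1 = random.Random(42)
rng2 = random.Random(42)

seq1 = [rng1.randint(0, 9) for _ in range(5)]
seq2 = [rng2.randint(0, 9) for _ in range(5)]
print(seq1 == seq2)  # True: identical seed, identical "random" sequence
```

This is why pseudo-random numbers are adequate for many uses but problematic where trust matters: a party who controls (or has backdoored) the generator can predict its output.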

The issue of trust can be fraught. In 2006 NIST promoted Dual_EC_DRBG, a standard for pseudo-random number generation. However, following the memos leaked by Edward Snowden, many see strong circumstantial evidence that the NSA deliberately introduced a backdoor into this algorithm. Quantum processes are intrinsically random, and a number of companies have developed hardware-based QRNG devices. However, these retain the limitation that others must trust the operator of the device to publish the output honestly.

Aaronson’s scheme for verifiable public random numbers plans to implement a statistical test to confirm that published output from the Google random quantum circuits algorithm is truly random. If successfully realised, this would be an ideal real world application for near term NISQ hardware.
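To give a flavour of what a statistical test on published output looks like, here is a deliberately naive sketch: a frequency (monobit) check on a bit string. To be clear, this is not Aaronson’s protocol – his scheme verifies consistency with the output distribution of the quantum circuit itself, which is far subtler – the sketch only illustrates the general idea of testing published bits against a statistical expectation.

```python
from collections import Counter

def frequency_test(bits, tolerance=0.05):
    """Naive monobit check: in a truly random bit string we expect the
    fraction of '1's to be close to one half. Returns True if the
    observed fraction is within `tolerance` of 0.5."""
    counts = Counter(bits)
    n = len(bits)
    return abs(counts["1"] / n - 0.5) <= tolerance

# A balanced string passes; a degenerate all-ones string fails.
print(frequency_test("0110100101" * 100))
print(frequency_test("1" * 1000))
```

Real randomness test suites apply many such statistics at once, and a verification scheme for quantum-generated randomness must additionally rule out a classical imposter – which is exactly where the hardness of simulating random quantum circuits comes in.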

The long term challenge of mastering complex quantum data

A long term research theme for Aaronson has been how information can be efficiently extracted from complex quantum states. The amount of data that a quantum computer can encode rises exponentially with the number of entangled qubits. This seems to create powerful data handling capabilities, but it’s not easy to work with such complexity, even in theory. If we take such quantum data as an input, how can we process it in a realistic number of steps? The destructive nature of the quantum measurement process compounds the challenge – whenever we take a readout the input state is altered or even destroyed. Worse still, quantum states cannot simply be cloned, but instead duplicates must be created from scratch. Learning about a quantum state by repeating measurements on its duplicates is possible, but only if the number of copies required is practically manageable.
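The exponential growth is easy to quantify. A full description of an n-qubit state needs 2ⁿ complex amplitudes, so the classical memory required doubles with every added qubit (the sketch below assumes 16 bytes per amplitude, i.e. two 64-bit floats).

```python
# Classical memory needed to store a full n-qubit state vector:
# 2**n complex amplitudes at an assumed 16 bytes each.

def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.6f} GiB")
```

At 30 qubits the state vector already occupies 16 GiB; at 50 qubits it is millions of gibibytes. This is why directly surveying a quantum state, rather than cleverly extracting just the information we need, so quickly becomes hopeless.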

It has long been understood that quantum state tomography (exactly surveying a quantum state by repeated measurement) is a task that rapidly becomes unfeasible for all but the simplest states (the record is just 8 qubits, for which 656,000 measurements were required).  However, Aaronson’s most recent results show that the specific task of discovering how the state will behave for any subset of measurements (its shadow) does remain possible. This shadow tomography makes use of ‘gentle’ quantum measurements that minimise the disturbance of the input state, and a protocol that limits the number of copies required.

This may seem abstract, but it has direct implications for future quantum copy-protection and quantum money schemes. More generally, as businesses start to work with quantum computers, particularly for data intensive applications such as quantum search, quantum optimisation and quantum deep learning, they will likely find that a key bottleneck for the speedups they are seeking is the preparation of the quantum input data required by the algorithm. In a very real sense, theoretical advance in efficient software handling of this data is every bit as important as the continuing development of quantum computing hardware.

P vs NP in the real world

Computational complexity theory deals with the formal comparison of how hard different classes of problem are to solve. Confirming the conjecture that two such classes, P and NP, are distinct is one of the most famous open problems in mathematics. While many experts share the intuition that P ≠ NP, there is no formal proof. This matters because almost all of modern maths-based cryptography (and therefore cyber security) is built on the assumption that this conjecture holds. Separately, this point is closely related to proving that there is a class of problems where quantum computers will always enjoy an advantage over conventional devices. Again this is conjectured, but there is no mathematical proof. So how should we feel when we implicitly base business decisions on the assumption that P ≠ NP?

Quantum Technology is an interdisciplinary field: mathematicians, computer scientists, physicists, engineers (and increasingly business) all have key roles to play. Aaronson understands the different cultures, perspectives and lexicons involved. In explaining how we should think about the open P vs NP problem, Aaronson makes a joke that sets out a useful insight: “If we were physicists, we would have declared P ≠ NP to be a ‘law of nature’ and give ourselves Nobel Prizes … and later if it turned out that in fact P = NP we could give ourselves more Nobel Prizes for the law’s overthrow”. Mathematicians and physicists are indeed two peoples separated by a common language. If your business is looking for a long term quantum safe cyber security strategy and you are trying to evaluate the relative merits of PQC and QKD, then this is one leg of the debate that could be coming to your boardroom.

Overall, Y Combinator’s interview with Aaronson strongly reminds us of the key role that quantum information processing theory and the resulting quantum software will play, alongside quantum hardware, in the coming revolution.

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn.