Introduction to Quantum Information Processing

Quantum computers offer no advantage in and of themselves. Their true potential is only realised when they run new quantum algorithms.

Conventional computers have benefited from 50 years of continuous development. The speed of their underlying logic gate operations will remain significantly faster than that of quantum gate operations for the foreseeable future. Take a conventional programme, translate it to run on a quantum computer, and it will almost certainly run more slowly.

Why all the fuss then? The answer to this, and really the entire point, is that quantum computers can run new quantum algorithms that solve problems either entirely unfeasible on conventional computers, or at least in many fewer steps.

Computational Complexity Theory

How quick is quick? Computational complexity theory is the branch of theoretical computer science that formally studies how hard computational problems are to solve. It is a remarkable field in its own right, providing a rich framework for analysing problems in both comparative and absolute terms.

Using these techniques it has been shown that certain quantum algorithms provide remarkable speedups over their best known conventional counterparts.

The time it takes for an algorithm to finish often scales with the size of the input, e.g. finding the factors of a number N digits long, or searching an unsorted list of N items.

Exponential Time – the steps required rise as k^N, where k is a constant (e.g. 2^N). This rapidly becomes very, very large as N increases.

Polynomial Time – the steps required rise as N^k, where k is a constant. Even if k is large, this is never as long as exponential time once N is sufficiently big.

Sub-linear Time – forms such as √N and log(N) increase considerably less rapidly even than a direct linear relationship with N.

A quantum algorithm provides an Exponential Speedup if it moves a problem from exponential to polynomial time. Colloquially: this often turns an impossible problem into a feasible one.

A quantum algorithm provides a Quadratic Speedup if it reduces the number of steps required to roughly the square root of the conventional count. Colloquially: not nearly as powerful as an exponential speedup, but still often a very big difference in practice.
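To make these growth rates concrete, here is a minimal Python sketch (illustrative values only) tabulating the step counts each class implies as N grows:

    import math

    # Step counts implied by each complexity class as input size N grows.
    for n in [10, 20, 40, 80]:
        print(f"N={n:>3}:  log2(N) ~ {math.log2(n):6.1f}   "
              f"sqrt(N) ~ {math.sqrt(n):6.1f}   "
              f"N^2 = {n**2:6d}   2^N ~ {2**n:.2e}")

At N = 80 the exponential column already exceeds 10^24 steps, while the polynomial column remains a trivial 6,400.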

However, creating quantum algorithms is far from easy. A quantum computer can hold large amounts of data in superposition, but we can’t simply ‘compute all possible answers in parallel’. When we take an output, the superposition collapses back to just a single conventional state. The essential trick is to manipulate quantum interference effects so that they steer the collapse onto the right value. That sounds hard, because it is hard.
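To see interference in miniature, the following numpy sketch (a toy illustration, not a full algorithm) applies a Hadamard gate twice to a single qubit. The first application creates an equal superposition, but on the second the two paths to |1> carry opposite signs and cancel, steering the state back to |0> with certainty:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    state = np.array([1.0, 0.0])                  # qubit starts in |0>

    state = H @ state   # equal superposition: amplitudes (0.707, 0.707)
    state = H @ state   # the |1> amplitudes cancel: back to (1, 0)

    print(np.round(state, 6))  # [1. 0.] - destructive interference at work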

20 years of research has already produced a basic set of quantum algorithms.

Quantum Algorithms

Quantum simulation

Simulating the physics of quantum systems was the problem that first led Richard Feynman to propose quantum computers in 1982.

Though quantum mechanics is thought to describe nature (gravity excepted) in precise detail, the calculations required to make predictions on a conventional computer scale exponentially with the complexity of the system being simulated. In quantum chemistry, for example, anything beyond the simplest molecules requires approximations in practice. It is perhaps intuitive that a quantum computer offers an exponential speedup for problems of this sort.
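The exponential cost is easy to quantify: the full quantum state of n interacting spin-1/2 particles needs 2^n complex amplitudes. A back-of-envelope Python sketch (assuming 16 bytes per complex amplitude) shows how quickly exact simulation outruns conventional memory:

    # Memory needed to store the full quantum state of n spin-1/2
    # particles on a conventional computer, at 16 bytes per amplitude.
    for n in [10, 30, 50]:
        amplitudes = 2 ** n
        bytes_needed = 16 * amplitudes
        print(f"{n} particles: 2^{n} = {amplitudes:.2e} amplitudes, "
              f"~{bytes_needed / 1e9:.2e} GB")

Fifty particles already demand tens of petabytes, while a quantum simulator would hold the same state in fifty qubits.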

The scientific community often references applications for quantum simulation in fundamental scientific research and that is of course potentially ground-breaking. However, there are a large number of nearer term commercial areas that can also expect to see direct impact.

Over the last 60 years, the average cost of developing a new medicine has increased steadily and pharmaceutical companies now typically invest $1b in each new drug that reaches the market, over a process that takes around 10 years (7). Techniques from quantum chemistry, molecular dynamics and free energy perturbation theory are already used, but these promise to be significantly enhanced by the ability to undertake direct calculations:

  • Characterising and understanding molecular targets for new pharmaceutical drugs (e.g. within genomic mutations in a cancer)
  • Screening active molecules as drug candidates (e.g. by understanding shape, folding and likely molecular bonding modes)
  • Designing alternative molecular structures for delivery (e.g. overcoming inhibiting mechanisms)

More widely we are already entering a golden age of materials science. Quantum simulators are an enabling tool to take this further. Detailed understanding of material structures should drive innovation in a number of areas (8):

  • Enhanced energy density and thermoelectric performance (e.g. batteries, solar cells, fuel cells)
  • High temperature superconductivity (e.g. maglev transport, power transmission)
  • Enhanced alloys (e.g. corrosion resistance, aerospace)
  • More efficient chemical processes and catalysts (e.g. fertilizers, refining)

Naturally, quantum simulators could be very relevant to this area. However, how best to apply these techniques is not yet well understood; learning from early experience will be essential.

Quantum Fourier Transform

Fourier analysis is a long established and widely used technique in many data intensive fields, e.g. signal analysis, pattern recognition, image processing, and engineering. Where it can be applied, the quantum Fourier transform provides an exponential speedup over the best known conventional equivalent, the fast Fourier transform.
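In outline, the quantum Fourier transform applies the same unitary matrix as the discrete Fourier transform (up to sign convention), but directly to the 2^n amplitudes of an n-qubit register, using O(n^2) gates where the classical FFT needs O(N log N) operations on N = 2^n points. A numpy sketch of the matrix involved (the quantum circuit itself is not simulated here):

    import numpy as np

    n = 3                     # qubits in the register
    N = 2 ** n                # amplitudes it holds
    omega = np.exp(2j * np.pi / N)

    # The unitary the QFT implements: entry (j, k) is omega^(j*k) / sqrt(N).
    QFT = np.array([[omega ** (j * k) for k in range(N)]
                    for j in range(N)]) / np.sqrt(N)

    state = np.zeros(N, dtype=complex)
    state[1] = 1.0                        # example basis state |001>
    print(np.round(QFT @ state, 3))       # Fourier-transformed amplitudes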

Shor’s algorithm is the most discussed use of this technique, employing it to solve integer factorisation and discrete logarithm problems much faster than the best known conventional algorithms. This has attracted much attention because the ‘difficulty’ of these problems is at the heart of the security of current public key cryptography.
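The quantum core of Shor’s algorithm is period finding: locating the period r of f(x) = a^x mod N, which the QFT does efficiently. The surrounding steps are classical, as this Python sketch shows for N = 15, with a brute-force stand-in for the quantum step:

    from math import gcd

    def period(a, N):
        # Classical brute-force stand-in for quantum period finding.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    N, a = 15, 7
    r = period(a, N)                   # r = 4 for a = 7, N = 15
    if r % 2 == 0:
        p = gcd(a ** (r // 2) - 1, N)  # gcd(48, 15) = 3
        q = gcd(a ** (r // 2) + 1, N)  # gcd(50, 15) = 5
        print(f"period {r}: {N} = {p} x {q}")

For cryptographically sized N the while loop would take exponential time; that is precisely the step the quantum computer replaces.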

Many other variations of this technique have been researched.

Quantum Search

This is known more technically as amplitude amplification. It is intuitive that searching through an unsorted list conventionally takes time that scales directly with the size of the list N. Can we make use of the compressed form of the qubit data to improve on that? Remarkably, it turns out that the answer is yes.

Grover’s algorithm allows this search to scale as only √N steps. Technically this is only a quadratic speedup, but if the value of N is large it is still a very significant difference.
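A small numpy simulation (a state-vector sketch, not real quantum hardware) shows the mechanics: each Grover iteration flips the sign of the marked item’s amplitude and then reflects all amplitudes about their mean, and after roughly (pi/4)sqrt(N) iterations the marked item is found with near certainty:

    import numpy as np

    n_qubits = 8
    N = 2 ** n_qubits          # 256 items
    marked = 42                # the index we are searching for

    amps = np.full(N, 1 / np.sqrt(N))          # uniform superposition
    iterations = int(np.pi / 4 * np.sqrt(N))   # ~12 for N = 256

    for _ in range(iterations):
        amps[marked] *= -1                 # oracle: flip marked amplitude
        amps = 2 * amps.mean() - amps      # diffusion: reflect about mean

    print(f"{iterations} iterations; "
          f"P(find item) = {amps[marked] ** 2:.3f}")

Twelve queries against an average of 128 for a conventional linear scan; at N of a billion the gap becomes roughly 25,000 steps versus 500 million.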

This basic approach also brings benefits to many related problems that can incorporate search as a sub-step.

Quantum Sampling & Optimisation

Sampling probability distributions is an important technique for many conventional algorithms that depend on a statistical approach. Applications include random walk and Monte Carlo simulation techniques in wide ranging fields from Financial Markets to Engineering to Internet page ranking.
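The baseline matters here: classical Monte Carlo error falls only as 1/sqrt(M) with M samples, so halving the error costs four times the work, whereas quantum techniques such as amplitude estimation are expected to achieve 1/M convergence. A classical Python sketch of that baseline, estimating pi by random sampling:

    import numpy as np

    rng = np.random.default_rng(0)

    # Classical Monte Carlo: error falls as 1/sqrt(M) in sample count M.
    # Quantum amplitude estimation targets 1/M - a quadratic speedup.
    for M in [10**3, 10**5, 10**7]:
        xy = rng.random((M, 2))
        inside = (xy ** 2).sum(axis=1) < 1     # points inside unit circle
        pi_hat = 4 * inside.mean()
        print(f"M = {M:>10,}: pi ~ {pi_hat:.4f}, "
              f"error {abs(pi_hat - np.pi):.1e}")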

Brandao & Svore (9) have proposed the use of a quantum sampling algorithm to enable a quadratic speedup for solving semidefinite programmes. Semidefinite programming is a powerful generalisation of the more widely known linear programming approach. This computational technique has wide applicability in finding the optimal solution (strategy) across a complex system. Commercial applications include process industries such as Oil Refining and Food & Beverages, and also widely in Logistics.

Big Data, Machine Learning and AI

Big data is defined as data sets so large and complex that conventional data processing techniques struggle to deal with them: too big for computer memory, too big for Excel, not clean enough for easy processing, too disparate for ordinary structured databases. Given the proliferation of data generated by modern society, finding ways to interpret and extract commercial insight from this data has naturally become a pressing area of business interest.

The anticipated ability of quantum computers to encode vast amounts of data makes them an attractive potential fit for big data processing applications. However as noted earlier, just holding the data is of limited use. The key challenge is to identify the algorithm or combination of algorithms that can process the data efficiently. Quantum Fourier transform, quantum search and quantum sampling all offer promise.

Machine learning deals with techniques that automate the task of gaining insight from data. The learning or training process is often data processing intensive. Many real world examples of commercial interest involve processing big data streams (e.g. medical diagnosis, natural language processing, and consumer preference). Others may involve simple inputs but require an exponentially expanding working space for evaluations. Again these seem attractive areas where quantum computing can add unique value.

Composite quantum algorithms offering speedups for deep learning, perhaps the highest profile machine learning technique, have been proposed, as have speedups for a number of other established techniques such as K-means clustering, principal component analysis, support vector machines and classifier training. However, efficiently preparing the required input training data in quantum memory (QRAM) remains an open research challenge.

Man vs Machine

In 1997 IBM’s Deep Blue demonstrated that a computer could play chess well enough to beat world champion Garry Kasparov.

In 2016 AlphaGo succeeded with the computationally more difficult game of Go, beating world champion Lee Sedol.

In 2017 AlphaGo Zero took this approach to a new level. Rather than being trained using thousands of recorded human games, it taught itself ‘tabula rasa’ (without foreknowledge) based simply on the rules of Go and its own internal simulations.

These historic cases are based on conventional computing. However the exponential nature of these learning problems makes them a likely target for quantum algorithms.

Artificial Intelligence is the wider field of devices and systems that make decisions for themselves. Simple AI is already commonplace; more complex AI is the source of much active research. The potential application of quantum computing in machine learning already makes it a likely key enabler for future advances in AI.

Some believe that the role of quantum phenomena is even more deeply embedded as a requirement for emergent AI. Existing work has often made use of classical neural networks, but some recent studies have proposed a quantum basis for the behaviour of neuronal microtubules. The implications of this remain controversial (10).

Role of Business

In the early 1940s, Alan Turing and an elite group of code breakers at Britain’s Bletchley Park developed pioneering electromechanical and electronic computing machines because they anticipated their value in cryptanalysis. Breaking German codes, including Enigma, went on to make a major contribution to the Allied victory in the Second World War. However, even this talented group did not appreciate the wider uses to which computers would later be put, and the way they would change and reinvent whole business sectors.

Similarly now, the early public discussion of quantum computing has been dominated by its headline application against cryptography. It is already clear that its capabilities go well beyond that. Its potential will probably only really be understood as entrepreneurial businesses also engage in the journey.

This comes with a challenge, however. The skills required to fully appreciate this new field encompass not only quantum science, mathematics and computing, but also new sub-disciplines within those fields that were not even taught in general degree-level courses 10 years ago. Even now, many companies struggle to apply existing big data analytic techniques optimally. Established experts may initially be sceptical of the relevance of new approaches.

Early movers and those seeking advantage will need to think hard about how to bring the right blend of skills to their organisation and how to encourage innovation in these new areas.

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn