Introduction to Quantum Computers

An immediate challenge for the interested business observer is how to interpret the true state of progress against a background of hype and corporate spin.

All conventional computing is ultimately based on data stored as digital bits, each a 1 or 0, and their manipulation via logic gates. The great power of modern computing comes from the fact that these operations can be performed by transistors embedded in large numbers within integrated circuits, what we call microchips.

Crucially, for many years micro-fabrication technology has allowed a consistent scale-up in how many transistors can be crammed into one device, continually improving performance – an effect we know as Moore’s Law. The first commercial Intel microprocessor, the 4004 in 1971, had just 2,300 transistors; a top-end desktop machine can now have over 2 billion. However, this period has now reached its technological end-point.

Beyond the Digital Era

Quantum computing replaces the bits and logic gates of conventional computing with qubits and quantum gates. These use the novel properties of quantum mechanics to implement new types of quantum algorithm. This turns out to be a big deal. In the digital era, a number which is N binary digits long can encode N pieces of information. A system of N qubits can in principle encode up to 2^N bits of information (2 multiplied by itself N times) in superposition.
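To make that scaling concrete, here is a minimal sketch in Python (using the numpy library; the names and values are ours, purely for illustration) of how the state of an N-qubit register is described. A classical register of N bits holds one value at a time; describing the quantum register takes 2^N complex amplitudes, one per possible classical value.

    import numpy as np

    N = 3  # three qubits

    # The state storing the definite value 000: amplitude 1 at
    # index 0, amplitude 0 everywhere else.
    state = np.zeros(2**N, dtype=complex)
    state[0] = 1.0

    # An equal superposition assigns the same amplitude to all 2**N
    # basis states (normalised so the probabilities sum to 1).
    equal = np.full(2**N, 1 / np.sqrt(2**N), dtype=complex)

    print(len(equal))                 # 8 amplitudes for N = 3
    print(np.sum(np.abs(equal)**2))   # ~1.0, the total probability

At N = 50 this amplitude vector already has over 10^15 entries, which is the root of the ‘quantum supremacy’ threshold discussed below.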

Qubits – Quantum bits are the basic information store of a quantum computer. A conventional computer stores a ‘bit’ of information as 1 or 0. A qubit stores a controlled combination of 1 and 0 simultaneously in superposition.

Quantum Gate – At its lowest level, a conventional computer processes data via logic gates. Quantum gates perform additional unique operations only possible on qubits.

Quantum Algorithm – Conventional computers run programs which apply a mathematical algorithm to data. Quantum gates are used to implement unique quantum algorithms.

Universal Quantum Computer – An idealised perfect quantum computer, reliable and programmable to perform any quantum algorithm. Several conceptual models have been shown to be equivalent in principle: the circuit model (gate model), topological model, and adiabatic quantum computer.

Quantum Simulator – A simpler, specialised quantum computing device optimised to solve problems in simulating physics, materials science or chemistry. By extension these optimisation and sampling techniques can also be applied to other commercial problems of interest.
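To bring the first three definitions together, the short Python sketch below (illustrative only, again using numpy) prepares a qubit storing a definite 0, then applies a Hadamard gate, one of the standard quantum gates, leaving the qubit in an equal superposition of 0 and 1.

    import numpy as np

    # A single qubit is a pair of complex amplitudes (a, b) for the
    # basis states 0 and 1, with |a|^2 + |b|^2 = 1.
    ket0 = np.array([1, 0], dtype=complex)   # a definite 0

    # The Hadamard gate is a 2x2 unitary matrix.
    H = np.array([[1,  1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    # Applying a gate is a matrix-vector multiplication.
    psi = H @ ket0

    # Measuring psi yields 0 or 1 with 50% probability each.
    print(np.abs(psi)**2)   # [0.5, 0.5]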

To bring this to life, consider that one human genome sequence contains about 1.5 gigabytes of data. 1 byte is 8 bits, so that is approximately 2^34 bits – meaning just 34 logical qubits would be enough to hold a representation of the complete sequence. An entire human body of 100 trillion cells might contain 150 zettabytes, approximately 2^80 bits, so 80 qubits would suffice.
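The arithmetic here is simply the 2^N relationship run in reverse: to index n bits of data in superposition takes roughly log2(n) qubits. A quick check in Python (decimal giga- and zetta- prefixes assumed for illustration):

    import math

    genome_bits = 1.5 * 8 * 10**9    # 1.5 GB at 8 bits per byte
    body_bits = 150 * 8 * 10**21     # 150 zettabytes

    print(math.ceil(math.log2(genome_bits)))   # 34 qubits
    print(math.ceil(math.log2(body_bits)))     # 80 qubits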

However, a business observer would naturally be cautious: “People have been talking about quantum computers for many years. Can such a device really be built now? Could it be scaled up reliably? Could we even do anything with data stored in this weird way?”

Understanding what current headline claims of 17, 50 or 2000 qubits really mean is the focus of the next section.

Universal Quantum Computers

Building a quantum computer is not easy. Exquisite engineering is required to keep the system isolated from its environment and operating in the quantum domain but still under precise active control.

An ideal device would have stable, long-lived qubits. It would be fault tolerant and offer a full set of quantum gates, all operating at high speed and high fidelity (low error rate). Crucially it would be easy to scale up the number of qubits actively available for use in computations.

Qubits

Groups around the world are seeking to harness a variety of different underlying qubit technologies, each with unique strengths and weaknesses; the leading contenders are surveyed below.

Estimates of the number of computational qubits required to complete sample tasks vary, but typical numbers include:

  • 50-60 qubits for ‘Quantum Supremacy’, a quantum computer able to complete a (carefully selected) calculation infeasible on any conventional computer
  • 100-150 qubits to tackle calculations in quantum chemistry
  • 4000 qubits to break existing public key encryption standards

But are these qubits the same as the ones often mentioned in press releases?

Current headline claims almost always refer to physical qubits realised in one of several leading technologies (2)(3).

Superconducting Circuits – The technology recently featured in announcements by IBM, Google and Intel. Gate fidelities above the Surface Code threshold for error correction have been demonstrated. Fast gate times compensate for more limited qubit coherence lifetimes. The qubits need to be cooled to ultra-low cryogenic temperatures, but the ability to use established micro-fabrication techniques is considered a major plus in terms of likely scalability.

Trapped Ions – One of the most mature technologies, with superior qubit lifetime and gate fidelity performance, again above the Surface Code threshold. Doesn’t need cryogenic cooling, but does need a vacuum. Recent advances in microwave based control hold promise to make this technology even more appealing by providing new options for scalable modules.

Silicon Spin – More precisely, electron spins within a quantum dot embedded in silicon. This is an area that has really surged forward in the last 12 months. The benefits of CMOS fabrication and microwave-based control promise great scaling characteristics, though cryogenic cooling is required. Long qubit lifetimes have been reported, and a 2-qubit logic gate demonstrated.

Photonics – This technology requires no cooling and has a natural affinity with modern optical communication technology. Advances in single photon nanowire technology and integrated waveguide fabrication are positive developments. Fidelities demonstrated are only slightly lower than other techniques.

NV Diamond – Long-lived qubits encased in a material with many favourable properties. No need for cooling or vacuum. The ability to precisely etch the required nitrogen-vacancy defects into diamond has significantly expanded the potential of this technology. However, current fidelities are significantly under threshold.

Topological – Much less developed in terms of physical realisation, the theoretical study of the required quasiparticles holds out the promise of very long lived high fidelity qubits. Microsoft has backed research into Majorana quasiparticles as part of its ‘full stack’ approach to quantum information processing.

Fault Tolerance

However, to scale up a reliable quantum computer, physical qubits need to be encoded into logical qubits to correct errors and allow fault tolerant performance of quantum gates. This is not simply an engineering challenge. Indeed, a superficial understanding of quantum mechanics, with its inherent indeterminacy, might lead to the presumption that error correction isn’t possible at all.

This problem has been studied for over two decades and crucial progress has been made. A number of theoretical error correcting codes have been proposed, each with differing requirements for implementation.

A basic requirement is that the fidelity of the underlying physical qubits, for both single and two qubit operations, exceeds a threshold; below this threshold, the error correcting code can’t keep up. The threshold requirements of different codes range from an unrealistic 99.99998% to a much more practical 99%.

The underlying encoding of physical to logical qubits also differs, spanning ratios from 5:1 to 17:1 to 49:1 and upwards. However, a simple ratio glosses over the vital issue of which quantum gates are supported for fault tolerant execution.
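To give a feel for how threshold and encoding ratio interact, the Python sketch below uses a widely quoted back-of-envelope model of Surface Code style scaling. The functional form is standard, but the prefactor and the qubit-count formula are illustrative assumptions rather than figures from any single paper.

    # Rough model: the logical error rate falls exponentially with code
    # distance d once the physical error rate p is below threshold.
    P_TH = 0.01   # ~1% error threshold, i.e. 99% fidelity

    def logical_error_rate(p, d):
        # Widely quoted approximation; the 0.1 prefactor is illustrative.
        return 0.1 * (p / P_TH) ** ((d + 1) / 2)

    def physical_qubits(d):
        # Rough count of physical qubits per logical qubit (data plus
        # measurement qubits) for distance d; again an approximation.
        return 2 * d * d

    for d in (3, 5, 11, 25):
        print(d, physical_qubits(d), logical_error_rate(1e-3, d))

The pattern to note: modest improvements in physical fidelity buy exponential improvements in logical reliability, but only by spending quadratically more physical qubits per logical qubit.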

Surface Code

A key factor in the recent surge in commercial investment in quantum computing is that the practical ability of the experimenters to manipulate physical qubits now overlaps with the theoretical requirements for a leading error correcting code: the Surface Code (4).

The Surface Code combines attractive features: a fidelity threshold of just 99%, which is already exceeded by several current qubit technologies, and a simple 2D geometry requiring only neighbouring qubits to interact.

However, the Surface Code (and many similar codes) has a significant weakness: it does not allow direct implementation of a full set of quantum gates, and the missing gates are in fact essential to providing a computational advantage over a conventional computer.

Magic States

The leading proposed solution for implementing the missing operations, and so completing the available set of gates, is to separately prepare special qubit states and inject them into the system. These whimsically named ‘magic states’ are perfectly feasible, but they are required in very large numbers.
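For the technically curious, the most commonly cited missing gate is the T gate, and the corresponding magic state is easy to write down. A brief Python sketch (illustrative only):

    import numpy as np

    # The T gate, the standard example of a gate that codes like the
    # Surface Code cannot apply directly.
    T = np.array([[1, 0],
                  [0, np.exp(1j * np.pi / 4)]])

    # The matching magic state is what T produces from an equal
    # superposition; a magic state factory prepares many copies and
    # purifies them through repeated rounds of 'distillation'.
    plus = np.array([1, 1]) / np.sqrt(2)
    magic = T @ plus

    print(magic)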

The exact number of magic states required depends on the overall algorithm to be computed and the specific gates it requires, but they can quickly come to dominate the total qubit count of the system. A recent estimate of the resources required to break existing public key cryptography is at least 6 million physical qubits, most dedicated to operating as a Magic State Factory (5).

6 million qubits is a lot, but it is perfectly realistic to see this as a scale-up of current technology over time.

Quantum Simulators

While universal quantum computers can implement any quantum algorithm, they aren’t the only approach to solving real world problems.

Quantum simulators use the same technologies as universal quantum computers, but do not necessarily need the same qubit properties, error correction or quantum gates. The only real requirement is that they work in practice to solve the problems for which they are intended. A number of disparate approaches are possible under this heading.

Adiabatic Optimisation – Set up as a physical analogue of the problem to be solved. Ultra-cold atoms in optical lattices are one possible implementation of this approach.

Quantum Annealing – A form of adiabatic optimisation carried out at higher temperature. The limitations this compromise brings are not fully understood. Higher temperature means a balmy 0.015 Kelvin in this case. A toy example of the kind of problem these devices target is sketched after this list.

Specialised Quantum Processors – Small-scale quantum computers, not necessarily fault tolerant.
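As a flavour of what ‘programming’ an annealer means in practice: problems are typically encoded as a QUBO (quadratic unconstrained binary optimisation), minimising a quadratic function of binary variables. The toy Python sketch below uses an arbitrarily chosen problem, solved by brute force purely for illustration; an annealer explores the same energy landscape physically.

    from itertools import product

    # A tiny QUBO: minimise the energy over binary variables x[0..2].
    # The coefficients below are arbitrary, chosen for illustration.
    Q = {
        (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # linear terms
        (0, 1): 2.0, (1, 2): 2.0,                  # couplings
    }

    def energy(x):
        return sum(w * x[i] * x[j] for (i, j), w in Q.items())

    # Brute force over all 2**3 assignments; an annealer would search
    # this landscape via quantum dynamics rather than enumeration.
    best = min(product((0, 1), repeat=3), key=energy)
    print(best, energy(best))   # (1, 0, 1) with energy -2.0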

Many such devices can be expected in the era of noisy intermediate-scale quantum (NISQ) devices, before fault tolerant quantum computation (FTQC) becomes possible.

Quantum simulators have a natural affinity with the application of simulating problems in quantum science. However that will not be their only application. We can expect to see different technologies and individual devices specially adapted to specific problems. We are likely to see parallel streams of progress rather than a single ‘winning platform’.

D-Wave Systems’ family of quantum annealing devices, most recently the D-Wave 2000Q, have been the first commercially available machines. Sometimes controversial press comment stems more from comparing these to what they do not pretend to be – universal quantum computers. Equally, we should not extrapolate assumptions about what these devices can do, as their theory is not fully understood. Observing real success in commercial applications is key.

Timeline

Developing this technology is not easy, but recent headlines reflect very real progress. It is no longer tenable to question whether a large scale quantum computer will ever be built; it is only a matter of time.

  • Now: Pilot and prototype devices are already with us. One of Google, IBM or Intel is likely to win the race to demonstrate quantum supremacy shortly. D-Wave Systems’ customers include Lockheed Martin, Volkswagen and Google/NASA.
  • 3-6 Years: The EU’s Quantum FET Flagship foresees basic applications for quantum simulators within the next 6 years, significantly ahead of large scale quantum computers (6).
  • 10-15 years: Large scale fault tolerant devices able to break current encryption standards. Sir Peter Knight, co-leader of the UK’s Blackett review into quantum technologies suggests we should be ready for this in 10 years; Simon Benjamin of Oxford University estimates 8-12 years with current investment, or 5-7 if carried out as a priority by a state-level actor (4).
David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn.