Quantum error correction – from geek to superhero

Quantum computers can only realise their full potential by defeating the noise that plagues their delicate qubits. The field of quantum error correction offers some increasingly mature solutions. But it is also still a rapidly evolving discipline, and innovation here, as much as in hardware, will shape the future quantum computing roadmap.

If 50-70 qubits are thought to be sufficient to demonstrate quantum supremacy, why do we expect to need so many more in practice? Indeed, groups around the world are working towards the long-term goal of quantum computers containing millions of qubits [1,35,53].

The problem is noise. Quantum systems are typically very sensitive to external interference. Many groups will push to see what can be done with NISQ devices, trying to complete their calculations before noise overwhelms them. But to realise the full potential of quantum software we need a mechanism to prevent errors piling up uncontrollably in longer calculations – we need fault tolerant quantum computation (FTQC).

Conventional computers have error correction built into their very conception by being digital rather than analogue. Calculations are arranged over a series of clock cycles, and after each step the circuit hardware kicks each bit back to the nearer of 0 or 1 [58].
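
As a rough illustration of this classical picture (a toy sketch of our own, not a description of any specific hardware), the snippet below snaps noisy analogue voltages back to clean logic levels and adds a simple three-bit repetition code decoded by majority vote:

```python
# Toy sketch: how a classical digital system tolerates analogue noise.
# Each clock cycle the hardware snaps a noisy voltage back to the nearer
# of 0 or 1, and a simple repetition code with majority voting can correct
# occasional bit-flips on top of that.
import random

def restore(voltage):
    """Snap an analogue voltage back to the nearer logic level (0 or 1)."""
    return 1 if voltage >= 0.5 else 0

def majority_vote(bits):
    """Decode a classical repetition code by majority vote."""
    return 1 if sum(bits) > len(bits) // 2 else 0

# Encode a logical 1 as three copies, add analogue noise, then recover it.
logical_bit = 1
noisy_copies = [logical_bit + random.gauss(0, 0.2) for _ in range(3)]
recovered = majority_vote([restore(v) for v in noisy_copies])
print(noisy_copies, "->", recovered)
```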

Quantum mechanics is intrinsically different. Physical qubits are not just 0 or 1, but an analogue superposition of both, and they are entangled with the other qubits involved in a calculation. They cannot simply be cloned. If we naively try to check and correct their values, the quantum calculation could collapse prematurely.

Andrew Landahl, Sandia National Labs

As Andrew Landahl (Sandia National Labs) points out “We don’t actually have the quantum equivalent of the digital bit yet, for that we need a working logical qubit”.

Quantum computing was first conceived by Richard Feynman in 1982, but it only really started to attract mainstream attention following the revolutionary work of 1996 that showed reliable FTQC was in principle possible.

Much subsequent work has explored a variety of approaches. In a typical scheme, noisy physical qubits are encoded into a robust logical qubit using a quantum error correcting code (QECC). Carefully designed stabilizer checks are used to detect and correct errors without disturbing the logical qubit state, and fault tolerant operations are used to perform quantum gates on the logical qubits. Families of protocols compete in terms of how practical they are to implement and the processing overhead they introduce.
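
As a minimal illustration of the stabilizer idea (a toy example of our own using the three-qubit bit-flip code, far simpler than the protocols discussed below), the checks reveal where an error happened without ever reading out the encoded logical value:

```python
# A toy QECC: the three-qubit bit-flip code. The two stabilizer checks
# (Z-parity of qubits 0,1 and of qubits 1,2) reveal where a bit-flip
# occurred without revealing the encoded logical value itself.
STABILIZERS = [(0, 1), (1, 2)]

def syndrome(flips):
    """Parity of bit-flips seen by each stabilizer check (0 means no error seen)."""
    return tuple((flips[a] + flips[b]) % 2 for a, b in STABILIZERS)

# Map each syndrome to the single-qubit correction it implies.
CORRECTIONS = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

flips = [0, 1, 0]                 # a bit-flip error on the middle qubit
s = syndrome(flips)               # -> (1, 1), pointing at qubit 1
if CORRECTIONS[s] is not None:
    flips[CORRECTIONS[s]] ^= 1    # apply the correction
assert flips == [0, 0, 0]         # the encoded state is restored
print("syndrome:", s, "-> corrected")
# (Unlike the surface code, this toy code does nothing about phase errors.)
```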

There has been a surge in interest in quantum computing over the last two years. Much of this has been driven by the fact that the ability of leading experimentalists to build and control would-be quantum devices has started to overlap with the requirements of one leading family of protocols: a topologically based QECC known as the surface code.

Surface Code – logical qubits are built from multiple physical qubits in a 2D lattice.

  • Works for physical qubits and gates with errors below a threshold of about 1%.
  • Interactions are only required between ‘neighbouring’ qubits in the 2D lattice (see the simplified layout sketch after this list).
  • A wide variety of quantum gates can be implemented efficiently, though the set falls one gate short of what is needed for a universal set of quantum gates.
  • The missing gate can be reinstated by a technique known as magic state injection, though this introduces a significant overhead (notably the distillation of the magic states themselves) and is usually considered the likely bottleneck for practical calculations.

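The sketch below illustrates the layout idea in deliberately simplified form: data qubits sit on the vertices of a small 2D grid and each check qubit touches only its four nearest neighbours. A real surface code also needs careful boundary conditions and alternating X- and Z-type checks, both ignored here.

```python
# Deliberately simplified sketch of the surface code layout idea: data
# qubits on the vertices of a 2D grid, each check (ancilla) qubit sitting
# in a plaquette and interacting only with its four neighbouring data
# qubits. Boundary conditions and the X/Z check distinction are omitted.
L = 4  # grid of L x L data qubits

def data_index(row, col):
    return row * L + col

# Each interior plaquette check touches the four data qubits at its corners.
checks = []
for r in range(L - 1):
    for c in range(L - 1):
        checks.append([data_index(r, c), data_index(r, c + 1),
                       data_index(r + 1, c), data_index(r + 1, c + 1)])

print(f"{L * L} data qubits, {len(checks)} checks, "
      f"each touching {len(checks[0])} nearest neighbours")
# -> 16 data qubits, 9 checks, each touching 4 nearest neighbours
```
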
In much popular coverage, the surface code is implicitly assumed as the ‘standard model’ of quantum computation. While this approach demonstrates the feasibility of FTQC, the number of physical qubits required for useful calculations is prodigious: 4096 logical qubits are theoretically enough to perform the integer factorisation required to break the 2048-bit RSA public keys used today for Internet security, but typical estimates have put the size of the surface code based quantum computer required to implement this at perhaps six million physical qubits [5].

It is important to realise that the exact number of qubits required depends on a detailed interplay of the assumed hardware specs (fidelity, interconnectivity, gates supported) and the capabilities of the error correcting protocols these support. Each protocol has its own characteristics (logical cell size, ancilla qubit requirements, decoding speed). The vast majority of the physical qubits required come from the needs of the error correction, not the underlying quantum algorithm.
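
As a back-of-the-envelope illustration of that interplay (a sketch of our own, using a commonly quoted heuristic for surface code logical error rates and illustrative parameter values rather than the published estimates in [5]):

```python
# Back-of-the-envelope sketch of why error correction dominates the qubit
# count. It uses a commonly quoted heuristic for the surface code logical
# error rate, p_L ~ 0.1 * (p/p_th)^((d+1)/2), and roughly 2*d^2 physical
# qubits per logical qubit at code distance d. All parameter values below
# are illustrative assumptions, not the published estimates cited above.

def distance_needed(p_phys, p_th, p_logical_target):
    """Smallest odd code distance d meeting the target logical error rate."""
    d = 3
    while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

p_phys, p_th = 1e-3, 1e-2        # assumed physical error rate and threshold
logical_qubits = 4096            # as quoted for factoring RSA-2048
target = 1e-15                   # assumed per-operation logical error budget

d = distance_needed(p_phys, p_th, target)
physical = logical_qubits * 2 * d ** 2
print(f"distance {d}: roughly {physical:,} physical qubits, "
      "before any magic state factories are counted")
```

With these assumed numbers the estimate lands in the millions of physical qubits, in line with the scale quoted above, and small changes to the assumed fidelity or protocol move the answer substantially.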

Daniel Gottesman, Perimeter Institute & Quantum Benchmark

Even as hardware groups around the world improve the size and performance of their prototypes, the capabilities of quantum error correcting protocols also continue to progress markedly. Reflecting on his own and others’ recent work, Daniel Gottesman (Perimeter Institute and Quantum Benchmark) observes “There is no reason why we couldn’t see a big breakthrough on QECC techniques”.

Fact Based Insight believes that this is an area that deserves close attention from those with a commercial interest in quantum technology. Just as much as a hardware breakthrough, significant developments in error correction techniques could disrupt current assumptions about the timeline and roadmap to commercial applications. Different hardware architectures will not fare equally with such developments. When we consider just how central error correction has been to our very conception of a digital computer, we can see that a significant change in thinking could be very disruptive to our current assumptions about how the future sector will look.

Quantum Error Correction 2019 (QEC19) [59] was a good opportunity to catch up with what has been going on at the cutting edge of this dynamic field. This community takes a wide view across the techniques being adopted:

  • Pushing the state of the art with the Surface Code
  • Engineering for error correction compatibility
  • New codes for new architectures

(For a non-mathematical overview of the concepts behind FTQC, please read our introduction)

Pushing the state of the art with the surface code

The increasing finesse of surface code approaches was much on display at QEC19. Daniel Litinski (Freie Univ. Berlin) illustrated his ‘board game’ inspired approach to abstracting surface code operations. This gives a welcome sense that the complex maths is giving way to what increasingly feels like the abstraction of a software stack layer.

Litinski’s recent work has used these techniques to focus on the challenge of creating practical layouts to optimise the handling of magic states. Previous studies, assuming less efficient methods, have often chosen to neglect everything other than magic state production when estimating resource requirements. Litinski’s approach suggests that layouts devoting only about a third of their qubits to magic state production may have utility, significantly less than often previously assumed.

Ryan Babbush, Google

This rapid continuing refinement of magic state algorithms was also evident in the work Ryan Babbush (Google) presented on the simulation of FeMoco. As recently as April, Babbush’s best estimate for this simulation was a 1 million qubit device running for a month. With improved techniques, magic state production is now estimated to take just 1 week. Babbush thinks further significant improvement is possible, though estimation of the other elements of the algorithm now requires more attention.

FeMoco (Fe7MoS9C) is the active site within the Nitrogenase enzyme responsible in nature for the catalytic conversion of nitrogen gas into ammonia (fertilizer). If we could better understand FeMoco then potentially we could improve the energy intensive Haber process used for the industrial production of fertilizer. This would not just be a commercial breakthrough, but also one with very positive implications for feeding the planet and limiting climate change. Fact Based Insight sees ample space for a mission to be defined on this or a related simulation to capture public imagination around the positive potential of quantum computing – better to save the planet than break the Internet!

The experience of Babbush and his team hints at several wider messages: we can expect to conduct commercially valuable calculations in quantum chemistry using error corrected devices of fewer than 1 million qubits, perhaps as few as 100,000 qubits. Insight into conventional techniques in computational chemistry is also at a premium. As Babbush points out, it’s important to take onto the quantum device only the part of the calculation that really needs the speed-up. Early movers such as quantum software startup Riverlane (also represented at QEC19) are moving to secure such talent within their growing teams.

Physicists often try to remove bias from their experimental noise. However, for the purposes of quantum error correction a known bias may be something that can be exploited. In many practical quantum devices noise is biased towards phase (rather than bit-flip) errors. David Tuckett (Univ. of Sydney) showed how a modified surface code could be tailored around a known bias. Recent work suggests that such techniques could move the threshold for fault tolerant computation to around 5%.
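
To make the idea of bias concrete, here is a small sketch of our own (not Tuckett’s construction): it samples Pauli noise in which phase (Z) errors dominate bit-flip (X) and Y errors by a bias factor eta, the regime in which a tailored code only needs to work hard on one error type.

```python
# Illustrative sketch: sampling Pauli noise with a strong bias towards
# phase (Z) errors. eta is the ratio pZ / (pX + pY); the total error
# probability is p. Values below are illustrative assumptions.
import random

def sample_pauli_error(p, eta):
    """Return 'I', 'X', 'Y' or 'Z' for total error rate p and bias eta."""
    p_z = p * eta / (eta + 1)
    p_x = p_y = (p - p_z) / 2
    r = random.random()
    if r < p_z:
        return "Z"
    if r < p_z + p_x:
        return "X"
    if r < p_z + p_x + p_y:
        return "Y"
    return "I"

random.seed(0)
counts = {"I": 0, "X": 0, "Y": 0, "Z": 0}
for _ in range(100_000):
    counts[sample_pauli_error(p=0.05, eta=100)] += 1
print(counts)  # Z errors vastly outnumber X and Y at bias eta = 100
```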

Engineering for error correction compatibility

For a long time the study of quantum information processing was a purely theoretical endeavour, and we still don’t have a fully functional logical qubit. However, as things become more practical it’s not surprising that a significant focus is developing on working with errors in the real world.

Maika Takita (IBM) took QEC19 through her experimental work testing a simple code (technically a [[4,2,2]] error detecting code) on IBM’s Q5 device. This has been used to demonstrate stabilizer operations and inform the experimental design for the planned demonstration of a full logical qubit. A key consideration has been a design that suppresses the propensity for cross-talk between qubit gates. The planned logical qubit will require 23 physical qubits in a ‘heavy hex’ layout and is intended to be implemented on a 27 qubit device.
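
The [[4,2,2]] code is small enough to sketch directly. The snippet below illustrates its error detecting property using the standard textbook description of the code (not the circuit-level details of the IBM experiment): its two stabilizers, XXXX and ZZZZ, anticommute with every non-trivial single-qubit Pauli error, so any such error flips at least one check outcome, though with distance 2 it cannot be corrected.

```python
# The [[4,2,2]] code detects (but cannot correct) any single-qubit Pauli
# error: every such error anticommutes with at least one of its two
# stabilizers, XXXX and ZZZZ, flipping that check's outcome.
STABILIZERS = ["XXXX", "ZZZZ"]

def anticommutes(p, q):
    """Two Pauli strings anticommute iff they differ (both non-identity) on an odd number of qubits."""
    odd = sum(1 for a, b in zip(p, q) if a != "I" and b != "I" and a != b)
    return odd % 2 == 1

for qubit in range(4):
    for pauli in "XYZ":
        error = ["I"] * 4
        error[qubit] = pauli
        detected = any(anticommutes("".join(error), s) for s in STABILIZERS)
        print(f"{pauli} on qubit {qubit}: detected = {detected}")
```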

Michael Biercuk, Q-CTRL

Michael Biercuk (Q-CTRL) emphasised the importance of emerging ‘quantum firmware’ in reducing, homogenizing, and virtualising errors at the physical layer and so presenting an optimised profile to quantum error correction routines within the software stack. This promises significantly reduced overheads. Q-CTRL specialises in the application of control engineering and machine learning techniques to the design of quantum gate driver signals. The company sees the interface between such control and practical quantum error correction as a major opportunity. Biercuk summarises “By combining QEC with low-level quantum firmware – instead of treating them as fully independent layers of the stack – we can potentially reduce the number of physical qubits required for QEC encoding by orders of magnitude. That means more hardware can be dedicated to useful computation”. Zurich Instruments (also present as a sponsor of QEC19) are already well known in the field for providing electronic packages suitable for the high precision waveform generation and readout required for such applications.

Simon Benjamin, Oxford Univ. & Quantum Motion

Zhenyu Cai (Univ. of Oxford) described some exciting developments in architectures for silicon spin qubit based devices. With the opportunity to leverage known CMOS fabrication techniques and the promise of dense qubit layouts, such devices may be a path to radically scaling up qubit numbers on a single chip. However, such scaled-up devices face the problem of how to pack in sufficient classical control lines and how to deal with leakage errors (where the active qubit electron is lost). Cai presented a novel hardware solution based on elongated ‘mediator’ dots between a lattice of quantum dot pairs. These provide a practical mechanism for key tasks (such as 2-qubit gate interactions, stabilizer checks and leakage restoration). Intriguingly, this work was partly sponsored by Quantum Motion. This recent startup is still in stealth mode; as it emerges we can expect it to leverage some or all of these techniques. Simon Benjamin (Quantum Motion and Univ. of Oxford) was also present at QEC19, presenting on his parallel work on error resilient variational algorithms using the QuEST platform.

New codes for new architectures

The surface code does have drawbacks, notably the need for magic states and the unwelcome scaling up of overheads at larger sizes. More forgiving error thresholds would also be welcome. To try to do better, a variety of other QECC ideas are being developed.

One notable example is another family of topological codes known as ‘color codes’. Developed by Héctor Bombín (now of PsiQ) and others, these potentially offer several advantages: the qubit overhead is lower, gates can in some cases be performed more quickly and, provided we use 3D color codes, the need for magic state distillation can be avoided.

Unfortunately, color codes have seemed to set a more demanding error threshold than the surface code and to be more difficult to decode efficiently. Aleksander Kubica (Perimeter Institute) showed QEC19 his latest work closing these gaps. Kubica points to new decoders, inspired by their surface code equivalents, that perform on a par. This links well with Bombín’s recent work showing how measurement based quantum computing (a paradigm native to the photonic quantum computing being pursued by PsiQ) can be used to unlock the potential of 3D color codes while still using a physically 2D device architecture.

Naomi Nickerson, PsiQ

Naomi Nickerson (PsiQ) presented further striking details on the 3D cluster state scheme her team have been working on in parallel. She points out that conventional topological QECCs such as the 2D surface code implicitly use time as an additional dimension. Significant work in the field has already gone into evaluating the trade-offs of various 2D unit cell arrangements. However, if we allow ourselves to work with generalised 3D cells (that don’t necessarily line up neatly along the time axis) then it turns out that we have many new interesting possibilities to play with. Remarkably, the PsiQ research has thrown up a number of possible 3D unit cell shapes that appear to offer significantly superior threshold performance over schemes such as the conventional 2D surface code.

PsiQ’s stealthy roadmap? If these threads can be drawn together they could overturn the quantum computing roadmap. Photonic based quantum computing has many potential advantages, not least the strong affinity of its technology with conventional investments in silicon photonics and its natural fit with distributed quantum computing and the quantum internet. Its key drawback has been relatively poor fidelity performance when compared to the requirements of standard approaches such as the surface code.

One of the most tantalising results of recent years has been Daniel Gottesman’s pointer towards new families of LDPC (low-density parity check) QECCs that in theory could have much lower overheads at large scale. Gottesman’s results show that, in principle, protocols that encode large numbers of logical qubits as a single block can support fault tolerant computation with a constant (rather than polylogarithmically increasing) overhead. However, no known code has yet fully met the required characteristics for this approach.
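
To see why constant overhead matters at scale, the toy comparison below contrasts an assumed polylogarithmically growing overhead per logical qubit with a flat one; the functional forms and constants are purely illustrative assumptions of our own, not figures from Gottesman’s work.

```python
# Toy comparison of qubit overhead scaling. The functional forms and
# constants are illustrative assumptions only: a scheme whose
# per-logical-qubit overhead grows polylogarithmically with the size of
# the computation (proxied here by the number of logical qubits k),
# versus an LDPC block code with constant overhead.
import math

def polylog_overhead(k, base=50):
    """Assumed overhead per logical qubit growing as log^2(k)."""
    return base * math.log2(k) ** 2

def constant_overhead(k, base=50):
    """Constant overhead per logical qubit, as promised by LDPC block codes."""
    return base

for k in (10, 100, 1_000, 10_000):
    print(f"k={k:>6}: polylog ~{k * polylog_overhead(k):>14,.0f} physical qubits, "
          f"constant ~{k * constant_overhead(k):>12,.0f}")
```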

Omar Fawzi (ENS de Lyon) presented some of the latest developments in this field, including the existence of quantum expander codes meeting Gottesman’s requirements. Pavel Panteleev (Moscow State Univ.) also presented in this area, pointing out that they had been able to benefit from Huawei’s experience with conventional LDPC codes. This could set the ground for further future developments, though it should be noted that the required error threshold is likely to be significantly more demanding than those being achieved today, and the benefits of this approach will only be applicable to systems of large size.

Toby Cubitt, UCL

Many presentations at QEC19 explored areas beyond conventional ‘qubit model’ quantum computing. Work on how errors can be controlled in such systems has typically lagged the wider field. Several presentations showed that thinking in this area continues to progress. Margret Heinze (TUM) described dynamical decoupling techniques adapted from the field of NMR. Giacomo Pantaleoni (RMIT) discussed techniques for continuous variable quantum computing using GKP states. Interestingly, these approaches are relevant to bosonic systems. This area is potentially an attractive niche for quantum simulators, seen by some as a possible area of commercial activity during the NISQ era.

Simulators are often seen as having to be dedicated to one target system. Toby Cubitt (UCL) showed how a universal quantum simulator can in theory be constructed to simulate the dynamics of any other many-body quantum system. This work also goes some way to justifying that the results of such a simulator can be valid even without error correction.

Quantum sensing applications typically depend on ultra-precise measurement (metrology). Liang Jiang (Yale) demonstrated how QECC techniques can be used (in certain circumstances) to extend the sensitivity of such devices, a theme also discussed by Simon Benjamin. These ideas are particularly relevant given the shorter term horizons on which such devices are being commercialised.

An evolving community

Earl Campbell (Univ. of Sheffield), Dan Browne (UCL)

Quantum error correction is a complex and specialist field. From its roots in 1996, the community attending the Quantum Error Correction conference series has grown from about 90 in Los Angeles in 2007 to 175 in London in 2019.

While the outlook of the conference is global in the best academic tradition, the current make-up of the field is interesting. The strength of the US is very pronounced: even given the non-US venue, about a third of QEC19 delegates were from the US, 50% of invited speakers were from the US, and overall 75% were from the wider anglophone nations of the US, UK, Canada and Australia.

Conference sponsors included such names as Alibaba, Baidu and Huawei, however in contrast to many other international quantum events China-based academics were notable by their absence.

The EU’s existing initiative in this area, QCDA (co-funded via the QuantERA initiative), was strongly represented both in presentations and via Earl Campbell (Univ. of Sheffield) and Dan Browne (UCL) on the conference programme and steering committees. However, participation from continental Europe wasn’t as strong as might have been expected given a European venue. Many, including Fact Based Insight, believe that quantum software overall is an area where funding within the EU’s Quantum Flagship could be usefully strengthened.

Actions for Business

Investors in quantum software businesses should follow quantum error correction progress closely:

  • The nature of the quantum software stack in the medium and long term is still uncertain. There will likely be a variety of very profitable niches, not just in quantum error correction but also in the layers closely inter-related to it such as optimising quantum compilers and quantum firmware.
    • Do you understand how this affects the opportunities and risks for your positioning?
    • Does your team include the right skills to understand the evolving dynamics of this field?

Investors in quantum computing hardware should also understand how this area affects milestones and risks around their development roadmap:

  • What relative priority is being placed on commercial progress during the NISQ era, versus the longer term FTQC roadmap? Is the trade-off in investment and opportunity cost understood?
  • For teams focussed on long term FTQC opportunities
    • Your roadmap will almost certainly include milestones reflecting progress towards achieving fault tolerance. You will probably already have an understanding of the threat posed by other teams and alternative qubit architectures.
    • However, do you also have an understanding of how novel progress in error correction protocols could disrupt your plans? Developments are not likely to boost all architectures equally.
    • Tailored QECCs may make some technologies more suited for a particular specialist application (e.g. magic state preparation, QRAM, module interlinks).

All participants in the wider quantum technology sector should remain aware of how quantum error correction could enhance their activities.

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn