Quantum hardware in the NISQ era

Excitement mounts as we approach a demonstration of quantum supremacy. However, a very long road remains before a fault-tolerant quantum computer can be built.

Quantum hardware announcements continue to come at speed:

  • Google has just announced Bristlecone, a 72-qubit superconducting device (though it remains to be seen whether it can maintain the high-fidelity performance of its previous 9-qubit device)
  • IBM has a 50-qubit superconducting chip (limited test data so far, and the performance of their 20-qubit version has struggled to match that of the Google 9-qubit chip)
  • Intel has a 49-qubit superconducting chip (though as yet untested, and the prior 17-qubit version was itself only ‘unboxed’ last October)
  • Rigetti has a 19-qubit superconducting chip (though their quantum marketing department is not as large as those of the tech majors)
  • Alibaba has an 11-qubit superconducting device (a development that really promises to shake up the race)
  • QuTech and UNSW scientists have separately demonstrated two-qubit gates with spins in silicon (though currently with somewhat limited fidelity), with links to Intel and startup SQC respectively.
  • NQIT scientists have demonstrated trapped-ion qubit gates with speeds improved 20-60× over their previous best (though still slower than superconducting qubits). Startup IonQ is proposing a trapped-ion based device.
  • The Majorana modes underlying Microsoft’s proposed topological qubits have been demonstrated in nanowires (though that approach is still several steps behind rival technologies)
  • Startup PsiQuantum is proposing a photonics-based device.

Later this year many expect Google will achieve a demonstration of quantum supremacy, a computation not practical on any conventional computer. We are about to enter a period where noisy intermediate-scale quantum (NISQ) technology is a reality.
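The “not practical on any conventional computer” claim comes down largely to memory: a brute-force state-vector simulation must store 2^n complex amplitudes for n qubits. A minimal sketch of that scaling (assuming 16 bytes per double-precision complex amplitude; the function name is illustrative):

```python
# Memory required for a brute-force state-vector simulation of n qubits.
# Assumes one complex128 amplitude (16 bytes) per basis state; clever
# circuit-specific simulation techniques can sometimes do better.

def statevector_bytes(n_qubits: int) -> int:
    """Bytes needed to hold the full 2**n_qubits state vector."""
    return 16 * 2 ** n_qubits

for n in (30, 50, 72):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

At around 50 qubits the state vector alone outgrows the memory of the largest supercomputers, which is why chips in the 49-72 qubit range are the focus of supremacy claims.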

This progress is of course exciting. However, it’s sobering to remember just how long a road remains to deliver truly large-scale, programmable, fault-tolerant quantum computation. To win this race, groups will have to deliver across a complex, multi-disciplinary series of milestones. These include:

1.      Create high-quality physical qubits

  • Qubit state preparation and measurement with high fidelity (low error rate)
  • Operation of single-qubit quantum gates with high fidelity
  • Operation of two-qubit quantum gates with high fidelity – the most demanding challenge

A maximum 1% error rate (99% fidelity) is required, but 0.1% (99.9% fidelity) is probably a better goal. Only trapped-ion qubits have exceeded the latter, but superconducting qubits are close behind.
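To see why the 0.1% target matters, consider how independent gate errors compound over a circuit. A back-of-envelope sketch (assuming uncorrelated errors, which real devices only approximate):

```python
# Probability that a circuit runs without any gate error, assuming each
# gate fails independently with the same error rate. Real devices also
# suffer correlated errors, crosstalk and drift, so this is optimistic.

def circuit_success_probability(gate_error: float, num_gates: int) -> float:
    """Chance that all num_gates gates execute without error."""
    return (1.0 - gate_error) ** num_gates

# A modest 1,000-gate circuit at the two error rates discussed above:
for err in (0.01, 0.001):
    p = circuit_success_probability(err, 1000)
    print(f"error rate {err:.1%}: success probability {p:.1%}")
```

At 1% error a 1,000-gate circuit almost never completes cleanly, while at 0.1% it still succeeds roughly a third of the time – hence the push well below 1% before error correction takes over.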

2.      Operate 2D multi-qubit devices

  • At least 2D connectivity between adjacent qubits
  • Consistent worst-case qubit and gate quality

Only superconducting qubits have progressed past a handful of qubits, but all technologies have so far struggled to maintain worst-case quality as they scale up. Once these problems are solved, such a device is likely to demonstrate quantum supremacy.

3.      Apply error correction to achieve fault tolerance

  • Logical qubits protected from physical qubit errors
  • Operation of quantum gates on logical qubits protected from errors

Due to the no-cloning theorem, qubits cannot simply be copied, so error correction is a much more significant challenge than in conventional computers. The mainstream approach assumes an error-correcting protocol from the surface code family, supplemented by the injection of magic states to complete the set of quantum gates that can be implemented.

Topological qubits may have a massive advantage here.
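The scale of the surface code overhead can be illustrated with a common rule-of-thumb approximation, p_logical ≈ 0.1·(p/p_th)^((d+1)/2) for code distance d and threshold p_th ≈ 1%. The prefactor, threshold and qubit-count formula below are rough assumptions for illustration, not figures for any specific device:

```python
# Back-of-envelope surface code overhead. The logical error rate model
# p_L ~ 0.1 * (p / p_th) ** ((d + 1) / 2) and the ~1% threshold are
# rule-of-thumb assumptions; exact numbers depend on decoder and layout.

def logical_error_rate(p_physical: float, distance: int,
                       p_threshold: float = 0.01) -> float:
    """Approximate logical error rate for a distance-d surface code patch."""
    return 0.1 * (p_physical / p_threshold) ** ((distance + 1) / 2)

def distance_needed(p_physical: float, p_target: float) -> int:
    """Smallest odd code distance reaching the target logical error rate."""
    d = 3
    while logical_error_rate(p_physical, d) > p_target:
        d += 2  # surface code distances are odd
    return d

def physical_qubits_per_logical(distance: int) -> int:
    # A rotated surface code patch uses roughly 2 * d**2 physical qubits
    # (data plus measurement qubits); exact counts vary by layout.
    return 2 * distance ** 2

# Overhead at the 0.1% physical error rate discussed above:
d = distance_needed(p_physical=0.001, p_target=1e-15)
print(f"distance {d}: ~{physical_qubits_per_logical(d)} physical qubits per logical qubit")
```

Even at the 0.1% physical error target this implies on the order of a thousand physical qubits per logical qubit, before counting the magic state factories mentioned above.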

4.      Develop a universal device architecture

  • Modular control electronics
  • Comprehensive instruction set architecture
  • Efficient device microarchitecture

Silicon spin qubits may offer more flexible microarchitecture options if the technology can be proven.

5.      Scale up the device

  • Qubit numbers (whether monolithically on-chip or by coherently linking separate devices)
  • Control electronics
  • Environmental protection (e.g. cryogenics, vacuum etc)

Silicon spin qubits may have an easier time here, at least while it is possible to stay within a single device. Trapped-ion devices also have the significant advantage of requiring only mild cooling. Superconducting qubits, however, face the challenge of scaling within the constraints of ultra-cold dilution refrigeration.

6.      Optimise specialist sub-components

  • Magic state production (an anticipated bottleneck for approaches that use error correction)
  • QRAM (quantum RAM is a crucial issue in key areas of interest, e.g. search and deep learning)
  • Photonic links to a quantum internet

A wider range of qubit technologies may find advantage in these niches.

All of the above are very significant challenges in their own right. Other potential roadblocks are likely to emerge as progress continues. Leading groups have plans for how they want to address many of these, but in summary they are still wrestling with how to deliver adequate quality on multi-qubit devices.

There are potential wild cards. If Microsoft’s topological qubits can deliver on their promise of ultra-high fidelity, they may radically reduce the massive error-correction overhead others face. If silicon spin qubits can retain high fidelities at temperatures in the 1-4 kelvin range, then that is a warm summer day compared to the 10-20 millikelvin at which current devices operate. If a prototype of one of the NQIT trapped-ion devices (Q20:20 engine or Sussex blueprint) can simultaneously demonstrate high fidelity, scalable control and multi-device coherence, the relative ease of requiring only a hard vacuum rather than a cryogenic environment could bring the other advantages of ions back to the fore.

The coming years are not likely to be purely technical and inward-looking, however. A wide variety of NISQ-era strategies are available for companies seeking to gain experience and advantage with early quantum devices. The battle to dominate software in the NISQ era, in particular, is likely to be lively.

It’s easy to see why many judge the end goal of a fault tolerant universal quantum computer as many years, perhaps decades away. We can expect plenty of action in the meantime.

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn