Quantum supremacy – a new era?

Google claims quantum supremacy – the ability to perform a calculation beyond the reach of conventional computers. This is a landmark scientific, mathematical and technical achievement. But as many, IBM among them, point out, we should all beware of hype. The road to general purpose quantum computing remains a long one.

Google has published a paper in the prestigious scientific journal Nature claiming that it has demonstrated the theoretical concept of quantum supremacy [70]. The announcement had long been prepared by Google on the academic conference circuit, and was more recently hotly anticipated following a leak of the draft paper. Some, including IBM, have sought to question the significance of the result and cautioned over the hype surrounding the use of the quantum supremacy term. However, Sir Peter Knight, Chair of the UKNQT programme, speaking on the BBC, commented “I think this is a real breakthrough. We have been expecting the John Martinis group to make this announcement for some time. I find [the paper] compelling”. Scott Aaronson, a peer reviewer of the Nature paper and long-standing quantum supremacy expert, has written in the New York Times underlining the significance of the milestone Google has achieved.

Google Sycamore Processor

Sycamore Credit: Google

The term ‘quantum supremacy’ was coined by Caltech professor John Preskill in 2012. It deliberately emphasises not just speed, but that quantum computers are running algorithms that are radically superior to those available on conventional machines. It has long been thought that 50+ qubits were enough in principle to demonstrate quantum supremacy provided that they could be sufficiently well controlled. Google appears to have been able to achieve this using a 53 superconducting qubit system codenamed Sycamore rather than their larger 72-qubit Bristlecone system.

For those new to quantum read an Introduction to Quantum Technology

However, the actual calculation that Google has demonstrated is, in fairness, of quite limited scope. What Google’s supremacy algorithm does is generate random numbers, but in a specifically controlled way so that the results follow a characteristic, verifiable distribution. Sycamore can do this calculation in 3 minutes 20 seconds, a task that the paper claims would take today’s most advanced supercomputer 10,000 years to complete.

For more on Google’s approach and its limitations read Looking beyond quantum supremacy.

Better conventional algorithms, tricks and improved hardware can help future classical computers speed up. IBM have published a response that significantly takes the shine off Google’s announcement. A key constraint for conventional computers seeking to directly replicate quantum calculations is that the amount of RAM required rises exponentially with the size of the problem. When the conventional computer runs out of memory it must trade off against alternative approaches that take much longer to execute. In a real tour de force by their algorithms team, IBM have shown that in this case they can circumvent the RAM problem by using hard drive storage (and other tricks) for parts of the calculation. Rather than 10,000 years, IBM claim they can perform the calculation conventionally in just 2.5 days (and with the possibility of further speed-ups to follow).
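The memory wall is easy to illustrate with a back-of-the-envelope calculation. A direct state-vector simulation of an n-qubit device must store 2^n complex amplitudes; assuming 16 bytes per amplitude (a common double-precision complex format), the totals grow very quickly:

```python
# Illustrative arithmetic (not from the paper): memory needed to hold
# the full state vector of an n-qubit device, assuming one 16-byte
# complex amplitude per basis state -- 2**n amplitudes in total.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes required to store all 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 53):
    pib = state_vector_bytes(n) / 2**50  # convert bytes to pebibytes
    print(f"{n} qubits: {pib:,.6f} PiB")
```

At 53 qubits this is 2^57 bytes, i.e. 128 PiB – far beyond the RAM of any supercomputer, which is why IBM’s team had to reach for hard drive storage.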

At 3 minutes 20 seconds, the Google processor is still over 1000 times faster than the IBM result. The scientific and engineering significance of Google’s progress is not really affected, and quite possibly the Sycamore chip will never be caught in practice by a conventional device. However, the term ‘quantum supremacy’ has also been about capturing people’s imagination, and this will sound much less impressive to the public. Google may very well have known that to really put the issue to bed they needed a larger 72-qubit demonstration. Unfortunately their Bristlecone device doesn’t appear to have performed well enough, so far, to achieve this. Nevertheless, Scott Aaronson highlights the underlying significance of Google’s achievement: “we’re comparing ~5×10⁹ quantum gates [Google] against ~2×10²⁰ floating point operations [IBM] – a quantum speedup by a factor of ~40 billion”.
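Both comparisons are simple to check:

```python
# Back-of-the-envelope checks of the two figures quoted above.

# Wall-clock: 2.5 days (IBM, classical) vs 3 minutes 20 seconds (Sycamore).
classical_seconds = 2.5 * 24 * 3600   # 216,000 s
quantum_seconds = 3 * 60 + 20         # 200 s
wall_clock_ratio = classical_seconds / quantum_seconds
print(f"wall-clock ratio: {wall_clock_ratio:.0f}x")   # 1080x -> 'over 1000 times'

# Operation count: ~5e9 quantum gates vs ~2e20 floating point operations.
speedup = 2e20 / 5e9
print(f"operation-count speedup: ~{speedup:.0e}")     # ~4e10, i.e. ~40 billion
```

The wall-clock gap of roughly a thousand is what the public will hear; the operation-count gap of tens of billions is the comparison Aaronson argues actually matters.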

Fact Based Insight would summarise as follows. Google have indeed demonstrated a key milestone they long ago set out to reach. This is a very significant scientific and engineering achievement and it meets the mathematical spirit of John Preskill’s original test – the calculation cannot be efficiently simulated on a conventional computer (even if it fails to completely knock it out of the park). IBM have reminded us just how close we can still run with the reservoir of know-how and brute force behind conventional hardware. This will help us all not to be misled by hype surrounding the ‘supremacy’ term.

Hype notwithstanding, demonstrating a calculation that formally overturns the extended Church-Turing thesis, a central tenet of conventional computing, is a seminal event. Google’s paper also provides experimental support for the ‘digitization of errors’, a key assumption in the theory of quantum error correction and arguably the last remaining real scientific doubt over whether large scale quantum computers can in principle be built. Assuming these results stand the test of time, Fact Based Insight believes they will ultimately earn project leader John Martinis a Nobel Prize.

The NISQ era

Success in demonstrating quantum supremacy means that we have finally truly entered the NISQ era (another term popularised by John Preskill). However what these early quantum devices will be able to achieve remains far from clear.

For a review of the short and long term prospects read Quantum software – over-hyped but underestimated.

Remarkably, there is a short-term commercial application to which Google’s approach may be well suited – the provision of certified public random numbers. Random numbers are important for many commercial applications, from encryption and cybersecurity to lotteries, gaming and election auditing. However, we normally have to trust the provider of the random number to be behaving honestly, a significant limitation even for existing QRNG devices. Based on his work at Univ. of Texas at Austin, Scott Aaronson has proposed a novel application for Google’s early quantum hardware that allows random numbers to be ‘publicly certified’ through a challenge and verify process. Many commentators, even seasoned ones, are missing that this is a new cryptographic resource.
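The challenge-and-verify idea can be caricatured in a few lines. This is a deliberately toy model, not Aaronson’s actual protocol – the function names, the response format and the stand-in server are all illustrative assumptions. In the real scheme the verifier spot-checks the server’s samples against classical simulation of the challenge circuits; here only the timing check, which forces the server to answer faster than any classical spoof could, is modelled.

```python
# Toy sketch of challenge-and-verify certified randomness.
# NOT Aaronson's protocol -- a schematic of the flow only.
import hashlib
import secrets
import time

def issue_challenge() -> bytes:
    # A fresh, unpredictable challenge (in reality: a random circuit).
    return secrets.token_bytes(32)

def server_respond(challenge: bytes) -> bytes:
    # A real server would run the circuit on quantum hardware and return
    # measurement samples; this hash merely stands in for that output.
    return hashlib.sha256(challenge + b"samples").digest()

def verify(challenge: bytes, response: bytes,
           deadline_s: float, elapsed_s: float) -> bool:
    # The verifier accepts only if the response (a) arrived before the
    # deadline, ruling out slow classical simulation, and (b) is well
    # formed. The real protocol also scores the samples statistically.
    return elapsed_s <= deadline_s and len(response) == 32

t0 = time.time()
ch = issue_challenge()
resp = server_respond(ch)
print("certified:", verify(ch, resp, deadline_s=1.0,
                           elapsed_s=time.time() - t0))
```

The point of the design is that the entropy is certified by physics and timing, not by trust in the provider – which is precisely what makes it a new cryptographic resource.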

For a detailed discussion of one potential application of certified public random numbers read Quantum enhanced blockchain – sooner than you think.

One often discussed future application of quantum computers is to break the public key encryption on which Internet security currently depends. Fortunately, this is expected to be beyond NISQ era devices. Google’s success does not change that.

For an explanation of why business should still act now read A current business threat from future quantum computers.

For a review of the world’s preparations for this threat read Quantum safe cryptography – waiting to save the world.

Peak quantum hype

The next few months will see quantum computing at the very top of its hype cycle. Google’s Hartmut Neven has pointed out that we are currently seeing quantum computing power grow at a ‘double exponential rate’. Google’s paper moots ‘Neven’s law’ as a replacement for the exponential increase in computing power we have traditionally seen under ‘Moore’s law’. However, taken naively this could be very misleading.
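A toy calculation shows why ‘double exponential’ growth, taken naively, is so striking (illustrative numbers only, not Google’s data): if qubit counts grow roughly exponentially over time, and the classical cost of simulating a device grows exponentially in its qubit count, then that cost compounds as roughly 2^(2^t).

```python
# Toy comparison of Moore's law vs a naive reading of Neven's law.
# Illustrative assumptions only: units of t are arbitrary 'periods'.

def moore(t: int) -> int:
    # Moore's law: classical power doubles each period.
    return 2 ** t

def neven(t: int) -> int:
    # Neven's law (naive form): doubly exponential growth.
    return 2 ** (2 ** t)

for t in range(1, 6):
    print(f"t={t}: moore={moore(t):,}  neven={neven(t):,}")
```

After just five periods the doubly exponential curve has reached the billions while the singly exponential one is at 32 – which is exactly why headline extrapolations from such a curve should be treated with caution.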

To realise their transformative potential we need to radically scale-up today’s devices just so they can operate stably for general calculations – we need to achieve FTQC. Just how many more qubits are required is not certain, but it is likely to be at least in the hundreds of thousands and probably in the millions. That’s a long way from 53.

The long road to FTQC

Scaling up is a very significant challenge. In reality all quantum hardware groups have struggled to maintain the fidelity of qubit gate operations as they have moved up even to today’s very modest devices (a point that seems to have been confirmed again by Google’s experience with Bristlecone). In practice, controlling ‘cross-talk’ between qubits on these devices is clearly a major issue.

In addition, all currently proposed quantum computing architectures face their own specific medium term scaling challenges:

  • Superconducting qubits – how to scale beyond the confines of the dilution refrigerators required for ultra-low temperatures?
  • Trapped ions – how to manage the proliferation of lasers required for their high performance gates; or how to match that high performance with global field microwave gates?
  • Silicon spin qubits – how to deliver gates of sufficient fidelity while maintaining this technology’s promise of densely packed nanoscale architectures?
  • Photonic qubits – how to deal with photon loss in their alternative scheme of measurement based quantum computing?
  • Topological qubits – how to catch up with a promising idea, but one that still lags well behind in demonstrating practical qubit operations?

It’s important to realise that this is a challenge as much for the full software stack as for the hardware. As Google’s paper itself emphasises “engineering quantum error correction will need to become a focus of attention”.

For a review of some of the leading software approaches read Quantum error correction – from geek to superhero.

For a review of some of the other hardware approaches and what’s going on across the wider quantum technology sector read Quantum Outlook 2019.

Beating quantum winter

Some, such as William Hurley (Strangeworks founder and CEO), fear that the current hype surrounding quantum supremacy will lead to a ‘quantum winter’ for startups and researchers, where funding cools following over-inflated promises. Many are looking to ambitious government quantum programmes and patient deep tech investment to escape this danger. Opportunities in the quantum safe cryptography and the quantum imaging & sensing sectors offer an important bridge to early commercial payback.

For more on early commercialisation read Beating quantum winter: opportunities further up the value chain.

Despite the long road that lies ahead, the whole Google team deserves our congratulations on their remarkable achievement.

Actions for Business

  • Google’s announcement does nothing to accelerate the long timelines typically mooted before quantum computers break current Internet security. However prudent businesses should already have reviewed the specific time horizons of the threats their data and infrastructure face.
  • Businesses that will one day be significantly impacted by the unique new potential of quantum computing, including financial services, pharmaceuticals, automotive and logistics, aerospace and defence, and artificial intelligence, should be planning how to build the required skills within their future teams and partner networks.
  • Partnering and co-funding academic teams is set to be a useful tool in developing contacts for independent technology evaluation work and as a future workforce pipeline.
  • Investors in quantum computing hardware should continue to support a wide diversity of platforms. This will be a long race and it is far from over. Watch out in particular for startups offering enabling parts of the solution with cross platform applicability.
  • Investors should not neglect other opportunities in the wider quantum technology value chain, particularly quantum imaging and sensing and quantum safe cryptography, which potentially have complementary and earlier payback profiles.
  • Expect government programmes around the world to vie to make their region the most attractive place for investors to grow quantum businesses and for talented individuals to build quantum careers.
David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn