Quantum enhanced blockchain – sooner than you think?

Future quantum computers will ultimately force blockchain solutions to adopt strengthened quantum safe crypto protocols. Rationally, there remains plenty of time to act, but proponents need to get their transition stories in place to avoid being caught out as we near peak quantum hype.

More positively, the nascent blockchain industry also stands to be among the first to benefit from genuine quantum innovation. Early quantum devices promise to create a completely new cryptographic resource – certified public random numbers.

Blockchain technology uses advanced cryptographic and game theory techniques to make possible a new type of digital technology, the distributed ledger – an immutable record that can be updated and verified without the need for a central authority.

The original and highest profile application of blockchain technology has been in the creation of digital currencies. The hype surrounding Bitcoin and the cryptocurrency bubble of 2017-18 stands alongside the dot-com boom of 1995-2000, the South Sea bubble of 1720 and the tulip mania of 1634-37. But the sceptics of crypto winter should remember that in the end the Internet did reinvent the global economy, the New World was a lucrative trading opportunity and the tulip remains a lovely flower!

The unique technological potential of blockchain goes far beyond cryptocurrencies. Streamlined shared record keeping is finding applications across back offices, supply chains, real estate, healthcare and other sectors. The ability of advanced blockchain applications to embed automated conditional logic (smart contracts) extends its potential scope in financial services, IP rights management and the IoT. In the most expansive version of the vision this leads to a new paradigm for software – a unified world of decentralised applications (DApps) running decentrally across an open network.

Quantum technology is one of the other great disruptive revolutions of the current era. Quantum computing in particular promises to move our capabilities beyond the limitations of binary 1 and 0. Quantum information processing challenges many of the assumptions of the current digital world.

How will blockchain technology and quantum technology interact?

The diversity and innovation of the blockchain sector can make it appear dauntingly complex. Likewise, quantum technology can seem impenetrable to the uninitiated. Popular headlines make claims such as ‘quantum computing could break Bitcoin’. But many blockchain enthusiasts seem less concerned. What is the informed picture?

This briefing examines the potential interaction of these technologies across the four key components that make blockchains work:

  • Cryptographic hash functions – the mathematics that ensures chain integrity
  • Digital signatures – the essential guarantee of transaction authenticity
  • Consensus protocols – the true heart of decentralised blockchain magic
  • Public randomness – a vital underlying resource

Mass blockchain ecosystems also depend on the general security of the Internet, which is itself impacted by the advent of quantum technology. This briefing focuses on the aspects specific to blockchain.

Cryptographic hash functions – unbroken and unbowed

Acting as the workhorse of the blockchain, these mathematical algorithms ensure the integrity of the distributed ledger.

[Figure: Blockchain structure]

Cryptographic hash functions calculate a fixed length output (hash) from a longer variable length input. The hash is effectively a secure and unique ‘fingerprint’ of the input. Crucially, it’s extremely difficult to reverse the calculation (find the input given just the hash) or to find two inputs that generate the same hash.

Blockchains use hash functions extensively and in a variety of ways. Within a block header, the inclusion of the hash value of the previous block’s header is a basic safeguard to the data integrity of the chain. Within an individual block of transactions, a tree structure of hashes (a Merkle tree) is typically used to provide an efficient structure for validation and storage.
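
A minimal Python sketch illustrates both ideas (toy transaction data; real headers also carry fields such as a timestamp and difficulty target, and Bitcoin applies SHA-256 twice):

    import hashlib

    def sha256(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(transactions) -> bytes:
        # Pair up hashes level by level until a single root remains
        level = [sha256(tx) for tx in transactions]
        while len(level) > 1:
            if len(level) % 2:              # duplicate the last leaf if odd
                level.append(level[-1])
            level = [sha256(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

    # Chain integrity: each header commits to the previous header's hash,
    # so altering any historic block changes every hash that follows it.
    prev_header_hash = b"\x00" * 32         # placeholder 'genesis' value
    txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dan:1"]
    header = prev_header_hash + merkle_root(txs)
    print("block header hash:", sha256(header).hex())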

Common hash algorithms used by blockchains today include SHA-256, Ethash, Scrypt and RIPEMD-160.

The good news is that no known quantum algorithm provides a strong (exponential) speedup for attacking the integrity of these algorithms. Quantum Grover search may provide a moderate (quadratic) speedup, but that can be countered for the foreseeable future as long as we use a sufficiently long hash size. The ETSI Quantum Safe working group have estimated that SHA-256 will be safe until at least 2050 [12].
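
The arithmetic behind that reassurance is straightforward – a quadratic speedup only halves the security exponent:

    # Back-of-envelope: Grover search finds a preimage of an n-bit hash in
    # roughly 2^(n/2) evaluations, versus ~2^n classically. Even a halved
    # exponent leaves SHA-256 with a ~128-bit quantum security margin.
    for n in (160, 256, 512):
        print(f"{n}-bit hash: classical ~2^{n}, quantum (Grover) ~2^{n // 2}")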

Some specific applications may be compromised earlier, for example the hashing of account passwords for storage. But this vulnerability relates to the relatively small number of plaintext passwords in actual use [53] and well-constructed blockchain clients should avoid this threat.

A more subtle challenge could be quantum hash calculations unbalancing competition in the blockchain mining market. We’ll return to this issue later when we discuss consensus protocols.

The blockchain community should be alert to any sudden advances in quantum research that further threaten the security of hash functions (we don’t even have conventional proofs of SHA-256’s hardness after all). However most experts regard this as a remote risk. Fact Based Insight isn’t aware of current quantum software work making progress in this area.

Digital signatures – a point of vulnerability

The authenticity and integrity of blockchain transactions are guaranteed by the use of digital signatures.

[Figure: Blockchain transactions]

Digital signatures prove that messages were created by known senders. Typically this uses a combination of public key cryptography and hash functions. From a random seed, two numbers are created, a private key and a public key, with a hidden association. The digital signature is formed using the hashed message and the private key. A recipient can use the public key to verify that the signature is genuine, but not to create new valid signatures. Crucially, it must be extremely difficult to find the private key given just the public key.

Digital signatures are an essential component of blockchain transaction validation. The most common algorithm in use is ECDSA, alongside related elliptic-curve schemes (e.g. EdDSA, BLS).
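
As a rough illustration, here is what signing and verifying looks like using the third-party Python ecdsa package on secp256k1, the curve used by Bitcoin and Ethereum (the transaction payload is invented for the example):

    import hashlib
    from ecdsa import SigningKey, SECP256k1, BadSignatureError

    # The private key is random; the public key is derived from it. Recovering
    # the private key from the public key requires solving the elliptic curve
    # discrete logarithm problem - hard classically, easy for Shor's algorithm.
    sk = SigningKey.generate(curve=SECP256k1)
    vk = sk.get_verifying_key()

    tx = b"send 1.5 coins from address A to address B"
    signature = sk.sign(tx, hashfunc=hashlib.sha256)

    try:
        vk.verify(signature, tx, hashfunc=hashlib.sha256)
        print("transaction accepted")
    except BadSignatureError:
        print("transaction rejected")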

The bad news is that a well understood quantum algorithm, Shor’s algorithm, is known to break most of the schemes of public key cryptography in use today. In essence, with a large enough quantum computer it becomes possible to calculate the private key based on the public key. Blockchain applications are therefore potentially seriously compromised.

The degree of exposure to this threat is subtle and complex. In some set-ups, for example Bitcoin, as long as each address (public/private key pair) is used only once (a process automated by most good wallets), the vulnerable public key is only exposed for the relatively short period between the transaction being announced to the network and it being finalised. Any quantum attack would have to operate within this relatively short window. However, as we look at continuing blockchain innovation (in particular attempts to improve scalability and transaction throughput), not all new protocols are equal in this regard. When encrypted data is stored on-chain by advanced DApps, the required security shelf-life also needs to be considered.

When might a quantum computer large enough to mount an attack on existing public key schemes be built? Estimated timescales vary wildly:

  • Some continue to doubt that such devices will ever be built [52], though this is now clearly a minority view in the community [1, 6, 39, 40, 53].
  • The most detailed Bitcoin-specific academic estimate of which Fact Based Insight is aware indicates an earliest date of 2027+ [51].
  • A more general expert group assessment on behalf of the US NASEM concludes ‘not within the next decade’, implying 2029+ [53].
  • An expert group assessment for the German government has conclusions broadly in line with NASEM, though it points to currently published public data being inadequate for a robust extrapolation of a timeline [41].
  • Most assessments include caveats on the potential impact of “a program where an industrialized nation pours a large part of its research and development activities in a single project comparable to the Apollo program and the Manhattan project” [4,41].
  • To further complicate the picture, the potential of analogue quantum simulators is less well quantified and devices of this type retain the potential to further disrupt timelines [33].

Care must be taken in interpreting the dates above. Most experts would agree that the likely timescale for any such large-scale quantum computer is probably much further off. However, relatively early dates cannot be ignored in the context of prudent risk management.

Fortunately solutions exist to the challenges this presents. Quantum safe cryptography is already a vibrant and advanced field.

The NIST PQC Standardisation process has now entered its second round. The 82 original submissions have been narrowed down to 26 candidates for evaluation over the next 12-18 months. The 9 candidates selected to provide quantum-resistant digital signatures are: CRYSTALS-DILITHIUM, FALCON, GeMSS, LUOV, MQDSS, Picnic, qTESLA, Rainbow and SPHINCS+.

NIST has also just closed a ‘request for comments’ on its intent to approve this year the LMS and XMSS hash-based signature schemes. These quantum-resistant schemes are ‘stateful’, which means implementations must avoid key reuse; they are thus more challenging to deploy and prone to misuse. NIST intends to recommend these schemes only for “a limited range of signature applications, such as code signing”. However, these schemes can also be adapted for use in a suitably designed blockchain.
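
LMS and XMSS are engineered, many-time descendants of the classic Lamport one-time signature. A toy Lamport sketch (illustrative only – not the LMS/XMSS constructions themselves) shows both why security rests solely on the hash function and why state matters: each signature reveals half the secrets, so reusing a key enables forgery:

    import hashlib, secrets

    H = lambda b: hashlib.sha256(b).digest()

    def keygen():
        # 256 pairs of random secrets; the public key is their hashes
        sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def sign(msg: bytes, sk):
        digest = int.from_bytes(H(msg), "big")
        # Reveal one secret per message-hash bit - hence strictly one-time
        return [sk[i][(digest >> i) & 1] for i in range(256)]

    def verify(msg: bytes, sig, pk) -> bool:
        digest = int.from_bytes(H(msg), "big")
        return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

    sk, pk = keygen()
    sig = sign(b"block 42", sk)
    assert verify(b"block 42", sig, pk)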

Mark Pecen of ISARA (and chair of the ETSI Quantum Safe working group) comments “Today’s public key infrastructure is aging out, even in terms of resistance to attack by conventional computers. This may explain the urgency by certain groups to move to quantum-safe algorithms”.

The problem for blockchain developers is that the new quantum-resistant protocols typically come at the expense of significantly increased key/signature sizes and decreased processing efficiency. Evaluating the security of new protocols and protocol variants is also a very specialist task, as a working knowledge of potential quantum attacks is required.

QRL is a notable example of a blockchain that emphasises its existing quantum-resilience based on its use of XMSS signatures.

IOTA, aimed at IoT applications, emphasises its use of WOTS+, a hash-based signature scheme with resistance against known quantum attacks.

HyperCash, focussed on interoperability, also offers quantum-resistant signatures based on a lattice-based Ring CT protocol.

StarkWare is focussed on applications of the ZK-STARK protocol for zero-knowledge proofs. This privacy oriented crypto technology is resistant to known quantum attacks. ZK-STARKs are part of the Ethereum off-chain (layer 2) roadmap.

The blockchain sector is currently locked in an escalating war over the ‘trilemma’ of decentralisation, security and scalability. Many are cautious about losing their immediate competitive edge by moving earlier than necessary to more onerous implementations. While this is strategically understandable, it is a risky game – there are no simple ‘drop-in’ solutions, and dormant holdings, where the owner is absent or has lost their key, are a further complication. Decentralised blockchain communities have not typically found it easy to agree and implement change quickly. Smaller coins could find their ability to respond draining away.

Cryptoslate currently lists 938 separate coins. The vast majority of these are not quantum-safe. Fact Based Insight believes that many never will be. An asteroid is hurtling towards this market. It is still far off, but when it hits it will be a mass extinction event.

Navigating peak quantum hype

Many believe that Google may now be on the verge of demonstrating quantum supremacy with its prototype quantum device. This will be a major scientific and engineering milestone, but it is still very far short of the scale of machine required to break public keys. Such an announcement is a pre-requisite for the early timelines discussed in the previous section; it does nothing to accelerate them. However, this subtlety could easily get lost in the excitement.

Any supremacy announcement will likely drive quantum computing to the peak of its hype cycle. There will be renewed public discussion of the threat such devices pose to cyber security. The threat to sensitive conventional centralised databases is actually often more urgent than that faced by typical blockchain applications. However, cryptocurrency enthusiasts are only too well aware of how fickle sentiment can drive markets – upwards and downwards.

The challenge for competing blockchain proponents will be to ensure their own relative positions are not damaged by sentiment during this period. Ecosystem leaders need to respond to questions on their own roadmap to quantum-resilience. Key points to address are not just the technical shortlist of future protocols, but also robust indications that implementation doesn’t contradict other elements in their scalability roadmap and confidence that their community can move promptly to adoption when the time is right.

Speaking at EDCON 2019 [54], Danny Ryan (Ethereum Core Researcher) confirmed “one of our design goals in Ethereum 2.0 is to have a credible path to being quantum resistant in the 3-5 year time horizon”. Many at EDCON will have been impressed at the community’s ability to maintain focus around the Ethereum 2.0 project. Few however will see it as a ‘simple’ structure. Achieving clarity on how quantum resistance is to be built into this framework must address the technical challenge, but must also be simple enough to communicate to reassure the wider market. This matters not just for Ethereum but also for the ecosystem of subsidiary tokens and DApps it supports.

Consensus Protocols – a community divided

To maintain agreement as transactions are added across a decentralised network, blockchains must implement a consensus protocol.

In a traditional computing system, a central authority imposes version control and user access permissions. What is really unique about distributed ledger technology is that it seeks to maintain a consensus without recourse to these traditional techniques. When a new block of transactions is added to the chain the consensus protocol must work securely even in the presence of a minority of malicious participants who try to cheat the system. The introduction of the proof-of-work concept in 2008 by the pseudonymous Satoshi Nakamoto was the innovative spark that unlocked blockchain’s remarkable rise.

Proof-of-work (PoW) – transactions from across the network are bundled together by ‘miners’ who compete to finalise a new block by solving a difficult maths problem: they must find the right padding text (nonce) to add to the block so that the calculated hash meets a pre-set constraint. The first to manage this task publishes the new block to the network, automatically incorporating a reward for the miner.
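
A stripped-down sketch of that search (toy block data and a deliberately low difficulty; Bitcoin actually double-hashes an 80-byte header) captures the essential asymmetry:

    import hashlib

    def mine(block_data: bytes, difficulty_bits: int) -> int:
        # Search for a nonce so the block hash falls below a target threshold;
        # the smaller the target, the harder the search.
        target = 2 ** (256 - difficulty_bits)
        nonce = 0
        while True:
            h = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(h, "big") < target:
                return nonce    # hard to find, but trivial for peers to verify
            nonce += 1

    # A low difficulty so the toy example runs in seconds
    print("winning nonce:", mine(b"prev_hash|merkle_root|txs", difficulty_bits=20))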

Because mining uses significant computing power, miners are subtly incentivised to focus their efforts only on the current longest valid chain of blocks, or else they risk their effort being wasted. Miners effectively ’vote with their CPU power’ to achieve consensus.

It is instructive to focus on the true role of ‘mining’ in PoW based consensus. In the early days, popular accounts emphasised the role this process played in adding new coins to the market. However, this incentive can ultimately be replaced by transaction fees. The enduring role of mining is that it ensures randomness in terms of who will be able to create the next block and that there is a real cost to playing the game. Both of these features are key to protecting the security of the consensus process.

Problems with Proof-of-Work

The remarkable robustness of PoW has been key to the rise of Bitcoin and other leading cryptocurrencies such as Ethereum, Bitcoin Cash and Litecoin. Over 75% of current cryptocurrency market capitalisation is in coins that use PoW.

In principle quantum computers might disrupt the economics of the mining market. However as the calculation is hash based, they offer only a modest saving in computational steps. Also in this application they are facing highly optimised conventional algorithms and dedicated hardware with clock speeds far faster than any currently envisaged quantum device. Specifically for Bitcoin mining, estimates have typically pushed impacts on the mining market out to 2050 and beyond [53].

However as PoW cryptocurrencies have grown other significant problems have emerged:

Power consumption – mining is competitive and computationally intensive, consuming an enormous and growing amount of power. PwC estimates that in 2018 Bitcoin miners consumed as much electricity as the country of Hungary! This is not only expensive in terms of input costs but also wasteful, as the work being done is otherwise ‘useless’.

Mining concentration – the advantages of scale and specialist hardware (e.g. ASIC mining) have led to a concentration of mining power and high barriers to entry. This threatens the security and perceived fairness of the network.

Scalability – many blockchain networks face the challenge of how to expand the number of transactions per second (tps) they can support across the network. The limitations of PoW are one factor contributing to this problem.

[Image: Climate activist Greta Thunberg meets UK politicians Michael Gove (Con), Layla Moran (Lib Dem) and Ed Miliband (Lab)]

Concerns around these aspects of PoW have been around for several years, but the sheer pragmatism of the protocol has maintained its strong momentum. However Fact Based Insight believes its time is now drawing to a close. A fresh wave of global concern to act to limit climate change is growing. When mainstream politicians begin to talk about the need for us to eat less meat and to move away from petroleum fuelled cars… abandoning PoW cryptocurrency mining starts to look like very low hanging fruit.

To understand a new emerging opportunity where quantum technology may be able to help, we first have to understand the direction that the blockchain sector has already started to pursue in response to these challenges.

Proof-of-Stake

If we try to apply naïve common sense to the consensus problem, we might simply ask ‘why not just choose at random who gets to validate a batch of transactions and create the next block? If they try to cheat then the validator following them can spot the error and correct it’. Proof-of-stake protocols build on this simple principle, but add safeguards to protect against more subtle attacks by malicious participants. Specific approaches vary widely.

Proof-of-stake (PoS) – a broad family of consensus protocols based on choosing the ‘validator’ of the next block randomly across participants with an economic ‘stake’ (in the original idealised conception, simply randomly across all coin holders).
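
The core selection step can be sketched in a few lines (hypothetical validators and stakes; real protocols differ widely in detail). Note that the security of the whole scheme hangs on where the seed comes from – the public randomness problem we return to below:

    import hashlib
    import random

    stakes = {"validator_a": 500, "validator_b": 300, "validator_c": 200}

    def select_validator(beacon_value: bytes) -> str:
        # Seed a deterministic RNG from a shared public random value so that
        # every honest node independently computes the same winner.
        rng = random.Random(hashlib.sha256(beacon_value).digest())
        names, weights = zip(*stakes.items())
        return rng.choices(names, weights=weights, k=1)[0]

    # Probability of selection is proportional to stake (50% / 30% / 20% here)
    print(select_validator(b"round-1-beacon-output"))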

Practical PoS protocols must protect against attack by malicious participants:

Nothing-at-stake – Typically validators must stake a deposit on the blockchain, which they risk losing (being ‘slashed’) if they are caught trying to cheat.

Long range attacks – Dishonest validators must be prevented from starting a completely false chain from a distant point in the past.

Public randomness – a mechanism is required to provide a public and unbiasable way of randomly selecting validators. Many pseudo-random techniques can be biased by ‘block withholding’, ‘stake grinding’ or ‘last revealer’ attacks.

There are many different approaches to PoS typically blending aspects of three key themes:

Chain-based PoS – focusses on the forking rules by which a chain can be demonstrated to be the correct choice, and so dishonest validators can be punished if they choose to build on the wrong choice. This is conceptually elegant, but can be computationally intensive.

Byzantine Fault Tolerant PoS – adds the notion of a committee of validators voting on the proposed block before it is finalised. This adds complexity but has the significant upside of achieving immediate ‘finality’ on the selected block.

Delegated PoS – allows normal participants to appoint delegates who take part in the block creation and validation process on their behalf. This simplifies implementation and improves transaction throughput but at the expense of introducing a degree of centralisation into the network.

Within the current top-20 blockchains by coin market capitalisation, EOS, Binance Coin, Stellar, Cardano, Tron, Dash, Cosmos, Tezos and NEO all use some variation of PoS ideas (though the terminology can vary).

Balancing the trilemma

Emotions run high across blockchain communities concerning the trade-offs implicit in many consensus schemes. Vitalik Buterin (founder of Ethereum) drew raucous cheers from the audience at Blockchain Connect earlier this year – “when a blockchain project claims ‘we can do 3500 tps because we have a different algorithm’, what we really mean is that we are a centralised pile of trash that only works because we have seven nodes running the entire thing!”.

Ethereum is comfortably the second largest coin after Bitcoin and the leading ecosystem for subsidiary tokens and DApps. It has historically been PoW based. However Ethereum has created a buzz with the long awaited launch of testnets for Ethereum 2.0, which includes an initial implementation of its own PoS approach.

Ethereum 2.0 is an expansive vision introducing not just PoS but also ‘sharding’, another scalability-friendly feature. The new PoS Beacon Chain will run alongside the existing PoW 1.0 chain and in later phases provide co-ordination across multiple ‘Shard’ chains. The Beacon Chain will initially use the Casper FFG PoS protocol to lock-in checkpoints with the PoW chain. Later in 2.x we can expect to see this augmented by the more elegant CBC Casper PoS protocol. The Beacon Chain takes its name from its original core function – to provide the random numbers required to underpin the security of these protocols [54].

In honesty, Fact Based Insight had thought that such an ambitious transition was beyond what is possible with a decentralised development. However the Ethereum community seem to be in the process of proving us wrong and we eagerly await the launch of 2.0 phase 0 later this year. Blockchain communities have not previously been notable for their ability to manage change without ‘forking apart’. Assuming Ethereum can indeed complete this launch it will be a positive mark for them and an example to the sector. However the long timescale that has been required, and indeed still remains to run, is a useful marker for others planning significant transition projects, such as the introduction of quantum–resilience.

The Catalyst Network seeks to solve many of these same problems with the benefit of a new ground-up distributed ledger design. Developer Atlas City emphasises the benefit of being able to learn the lessons and avoid the mistakes of earlier platforms. To favour efficiency and scalability the protocol avoids PoW and instead employs a new consensus protocol based on decentralised voting (Fact Based Insight understands this to be a variation of Byzantine fault tolerant PoS principles) [65].

Public Randomness

The reader will have perhaps spotted how the need for random numbers has cropped up repeatedly in this discussion.

  • Randomness is a crucial resource in protecting the security of consensus protocols, and PoS in particular.
  • Random numbers are a crucial ingredient to any form of key based cryptography and associated digital signature protocols.
  • Random numbers are also needed as an input to many software applications (e.g. analytics, lotteries, games, voting audits). Blockchain DApps will be no exception.

But providing ‘public’ random numbers across a decentralised network that may contain malicious participants is itself a challenging problem. The ideal source would produce numbers that are not just unpredictable but also unbiasable and unstoppable. Mathematical algorithms only produce pseudo-random numbers that can be predicted once the seed value is known; existing public services (such as the NIST Randomness Beacon) are points of central trust (anathema to blockchain enthusiasts). Even with a quantum random number generator producing true random numbers, we have to trust both the organisation running the hardware and the validator that chooses to use ‘that particular’ random number. Significant research interest from the blockchain community has started to tackle this problem [55].

Randao is a blockchain based random number service and represents a new generation of randomness solutions based on a multi-party commit and reveal approach: the final random number is formed by combining multiple input random numbers from different participants.
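
The commit and reveal mechanics are simple to sketch (three hypothetical participants):

    import hashlib
    import secrets
    from functools import reduce

    def commit(secret: bytes) -> str:
        return hashlib.sha256(secret).hexdigest()

    # Phase 1 - commit: each participant publishes a hash of their secret
    participant_secrets = [secrets.token_bytes(32) for _ in range(3)]
    commitments = [commit(s) for s in participant_secrets]

    # Phase 2 - reveal: secrets are published and checked against commitments
    assert all(commit(s) == c for s, c in zip(participant_secrets, commitments))

    # The output combines every contribution (XOR), so it is unpredictable as
    # long as at least one participant is honest - and everyone reveals
    output = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)),
                    participant_secrets)
    print("shared random value:", output.hex())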

However, even multi-party approaches are subject to ‘last revealer’ bias, where the last party may withhold their number once they see its potential effect on the outcome. To counter this problem Ethereum 2.0 plans to introduce a new innovation, a Verifiable Delay Function (VDF), to obscure the result of the reveal until it is too late for any party to change their mind and influence the result. To pre-empt players that might seek to gain an advantage in this new game, Ethereum plans to design dedicated ASIC VDF hardware to give away to the community (initial R&D cost is planned to be $15m).
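
A toy delay function conveys the sequentiality idea (illustration only – a true VDF, such as the repeated-squaring constructions Ethereum is evaluating, additionally produces a succinct proof so the output can be verified quickly, which iterated hashing cannot do):

    import hashlib

    def toy_delay(seed: bytes, t: int) -> bytes:
        # t sequential hash iterations: each step needs the previous output,
        # so no amount of parallel hardware can shortcut the wall-clock delay
        x = seed
        for _ in range(t):
            x = hashlib.sha256(x).digest()
        return x

    # Feed the combined reveals through the delay; by the time anyone learns
    # the final output, the window for withholding a reveal has closed
    print(toy_delay(b"combined-reveals", t=1_000_000).hex())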

Certifiable Public Random Numbers

Measurement outcomes on quantum processes produce true random numbers ‘guaranteed by the laws of physics’. Is there a way to overcome the problem of having to trust the operator of such a device and to overcome selection bias? Intriguing new work suggests there may be.

Physicists explain quantum mechanics by pointing to the mysteries of superposition and entanglement [1].

Philosophers ponder its intrinsic indeterminacy and inconsistency with local determinism [10].

Mathematicians however are likely to point out that it’s merely the inevitable framework we get when we want a theory that operates beyond the limitations of conventional L1 norm probability theory [19].

Though quantum measurement outcomes are ‘random’, the sample distributions they create are uniquely ‘quantum’ and it’s believed that they can’t be efficiently simulated on a conventional computer – a point that is at the heart of Google’s proposed quantum supremacy proof [24].

A number of groups have worked on the problem of how to certify quantum randomness, notably including the BCMVV scheme [56]. However recent work by Scott Aaronson (Univ. of Texas at Austin) [40] is particularly interesting as it may be implementable even on the early NISQ devices now coming into operation.

Outline of the Aaronson certified quantum randomness protocol:

[Image: Scott Aaronson (Univ. of Texas at Austin) giving an interview to Y Combinator, the US seed funding specialist]

  1. We remotely request a quantum device to calculate the output of a series of pseudo-random quantum circuits; the device sends back a set of random quantum results
  2. We constrain the response time window so that no conventional device could have simulated a true quantum calculation
  3. We test the statistical properties of a sample of the outputs to verify that a quantum calculation has indeed been performed
  4. We feed the quantum outputs into a conventional randomness extractor to get the required random number

Not only have we produced a public random number, but the test ‘certifies’ its provenance.
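
Step 4 is conventional post-processing. As a simple stand-in for the extractors actually used (which the published outline does not specify), the classic von Neumann trick shows how biased-but-independent bits are distilled into uniform ones:

    import hashlib

    def von_neumann_extract(bits):
        # Classic debiasing: read bits in pairs, output 0 for '01' and 1 for
        # '10', discard '00' and '11'; the output is uniform provided the
        # input bits are independent
        return [b0 for b0, b1 in zip(bits[::2], bits[1::2]) if b0 != b1]

    raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]    # stand-in for certified samples
    print("extracted bits:", von_neumann_extract(raw))

    # Engineering practice often uses a seeded cryptographic extractor;
    # hashing the sample stream is a common (heuristic) stand-in
    print("hashed output:", hashlib.sha256(bytes(raw)).hexdigest()[:16])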

Several barriers remain. No device has yet demonstrated quantum supremacy in practice. Neither have details of the scheme been formally published. Proving that an ‘even more powerful’ quantum device couldn’t fool the test remains a subtle challenge. Commenting on how well the soundness of the scheme has been validated Aaronson says “we have to make a strong computational hardness assumption, but it looks like a plausible assumption”. Google is already talking about offering a commercial certified public random number service based on this approach [49].

Proof-of-Quantum

The point is not that the Aaronson scheme is a drop-in solution to all PoS challenges – it isn’t. But it is notable that even the leading math-wizards at Ethereum have been forced back to a hardware solution to implement VDFs. This seems to indicate that there is plenty of scope within the growing PoS blockchain sector for innovations based on this new cryptographic primitive.

With deep-pocketed players like Google having a strategic interest in demonstrating leadership in the commercial quantum computing race, there are strong forces working to make this happen.

Actions for Business

Businesses investing in blockchain solutions need not be asking for immediate quantum resilience, but they should be seeking a clear transition roadmap:

  • Are the current points of quantum-vulnerability for this specific blockchain understood?
  • Is there a credible shortlist of potential quantum-resistant adaptations for this specific case? Are the outline pros and cons of each understood (e.g. longer key/digital signature sizes)?
  • Where existing or proposed features already claim quantum-resistance, can their security promise be mapped to emerging standards (e.g. the NIST PQC process)?
  • Where a blockchain emphasises specific technical features (e.g. PoS or high transaction throughput) can we justify in broad terms why these are not compromised by potentially required adaptations?
  • How technically complex will implementation work be? If protocol adaptation is required, who has the skills to validate that quantum-resistance hasn’t been compromised?
  • Can we point to a reasonable justification or precedent for how quickly the ecosystem around this particular blockchain will be able to agree and implement change?

Proponents of particular blockchain solutions need to be ready to answer the above questions. They also need to consider their own perspective:

  • What are the disadvantages of moving too soon to introduce future-proof changes? What are the indicators that will trigger a need to start active implementation?
  • How can we draw more individuals with specific quantum skills and experience into our ecosystem? How best can we exploit new opportunities such as certified public random numbers or other innovations that will emerge?
  • How are competing blockchains responding to these challenges? How are traditional centralised data solutions responding? When will it be necessary for us to communicate more on this to stay ahead of market expectations?
  • Can we articulate our roadmap to quantum-resilience in a way that is both simple enough to be understood and technically robust enough to be credible?
  • Are we ready with a ‘fire drill’ response in the face of a potential shock to public confidence?
David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics.

Comments

  1. David Shaw –

     Facebook has announced its support for Libra. This proposes to use a ‘permissioned’ PoS consensus mechanism – a combination of Byzantine PoS and Delegated PoS concepts.


  2. David Shaw –

     Google’s announcement of quantum supremacy based on its Sycamore processor remains exactly on track with the roadmap assumed in this article. The potential early application of NISQ processors in providing certifiable public random numbers for proof-of-stake blockchains is referenced by Scott Aaronson in his recent New York Times article.
