Quantum safe cryptography – the big picture

Recently updated advice on quantum security technologies from the UK NCSC is bluntly sceptical about the promised benefits of quantum key distribution. This seems at odds with the scramble of governments around the world to promote the development of quantum networking technologies. Who is right?

Large scale quantum computers will one day bring many great benefits. Unfortunately they will also break many of the cryptographic protocols on which current Internet and business network security depends.

The world needs quantum safe cryptography and many groups around the world are already working hard on solutions to this challenge. The tools being developed include the maths-based algorithms of post-quantum cryptography (PQC), supplemented by physics-based approaches such as quantum random number generation (QRNG) and quantum key distribution (QKD).

Quantum Cryptography – Quantum states exhibit intrinsic uncertainty; this is the basis for QRNG. Quantum states cannot be cloned, so an eavesdropper cannot simply copy and store quantum communications and remain undetected; this is the basis for basic QKD. Quantum states can be entangled regardless of distance; this unlocks more advanced QKD protocols and wider capabilities in networked quantum computing.

The proponents of QKD point to its unique security promise, guaranteed by the laws of physics. This, they argue, is a natural complement to the traditional security promise of maths-based cryptography, guaranteed by the computational difficulty of certain maths problems.

Physicists and mathematicians have been arguing about the relative merits of QKD and PQC for years. The UK NCSC published its initial view on QKD in 2016 [ ]. It was generally critical of early QKD technology then on offer, pointing to practical vulnerabilities and difficulties in deployment. In 2020 the NCSC published an updated whitepaper [ ]. This drops mention of many of its previous objections and focuses on just one – authentication – presenting it as a significant in-principle objection to all use cases for QKD.

NCSC – The UK National Cyber Security Centre is part of the widely respected GCHQ security organisation. With a heritage that goes back to Bletchley Park’s wartime code breaking success, GCHQ has played a leading role in the development of modern cryptography and is home to many of the best maths brains in the country.

The NCSC seeks to be a trusted, neutral arbiter on all matters of practical cybersecurity practice. However, how easy is it for such an important part of the maths-led cybersecurity establishment to achieve that in this case?

To understand the heart of the NCSC objection over authentication we must delve into considerable detail.

Signatures, keys and encryption

To protect the confidentiality of data being sent over a network we encrypt the data before transmission.

Everyone agrees that perfect ‘information-theoretic security’ can be guaranteed by using a one-time pad (OTP). The old way to do this is the stuff of spy novels. A trusted courier delivers a unique random key to each operative in the field. Each bit of the key is used once and only once in the encryption of the secret message.
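As a minimal sketch of the principle (illustrative Python, with toy values), OTP encryption is just a bitwise XOR of message and key. The same operation decrypts, and the security promise holds only if the key is truly random, at least as long as the message, and never reused:

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with the corresponding key byte.
    # The same function encrypts and decrypts, since XOR is its own inverse.
    assert len(key) >= len(data), "an OTP key must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # fresh random key; must never be reused
ciphertext = otp(message, key)
assert otp(ciphertext, key) == message
```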

The modern networking revolution has been unlocked by us accepting a lesser standard of security. For most applications today we use ‘computational security’ based on the difficulty of solving hard maths problems. This is a very flexible technique, allowing modern cryptographers to develop ‘in-band’ solutions for many of the challenges facing modern computer networks. It’s the current versions of some of these tools that are threatened by the advent of quantum computers.

Private-key (symmetric) cryptography – Both parties must start out sharing the same secret key. Common algorithms used today include AES (encryption) and the SHA family (hashing). These are weakened, but not broken, by the threat from quantum computers. We can compensate by moving to larger key sizes.
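To make the compensation concrete, here is a minimal sketch using the widely available Python `cryptography` package. Grover-style quantum search roughly halves the effective key strength, so AES with a 256-bit key retains a comfortable ~128-bit security margin:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key: ~128-bit post-quantum margin
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, unique per message
ciphertext = aesgcm.encrypt(nonce, b"commercially sensitive data", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"commercially sensitive data"
```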

Public-key (asymmetric) cryptography – Public keys can be shared freely, while each user retains a secret key. The tremendous flexibility this unlocks is central to the convenience of the modern Internet. Unfortunately algorithms in common use today, such as RSA and ECDH, are known to be completely broken by future quantum computers.

Development of new maths-based post-quantum cryptographic protocols thought to be resistant to quantum attack has been underway systematically since 2006. This has proceeded with renewed vigour since 2016 through an evaluation and standardisation process led by the US National Institute of Standards and Technology (NIST). New protocols are being evaluated for public-key digital signatures (PQC DS) and key encapsulation mechanisms (PQC KEM).

NIST PQC process – 26 candidates are currently in round 2 evaluation [ ]. Selection for round 3 is underway, with an announcement expected in the next few months. This final round is expected to complete in 2021 and we can expect draft standards in 2022 with finalisation by 2024. The result is not expected to be a single recommendation, but more a collection of approaches fitted to different use cases: some streamlined to minimise key size and improve performance, some offering the strongest possible security promise.

To understand the potential applications of PQC and QKD it is useful to review end-to-end use cases. These need to deal with three challenges: authentication, key agreement and message encryption.

Authentication – Proving who a message comes from and that it hasn’t been tampered with in transit; for example via a digital signature.

Key Agreement – Securely sharing an encryption key between the parties; for example via a key exchange or key encapsulation mechanism.

Message Encryption – The actual encoding of the data to be transmitted; for example via a block cipher or stream cipher.

QKD itself only provides key agreement. The question is, if we’re going to use maths-based computational security techniques for authentication and/or message encryption, then doesn’t that defeat the point of using QKD for just that step? Michele Mosca (IQC) answers: “No it doesn’t. There are a series of trade-offs between risk and convenience. The computational methods add convenience, but using information theoretically secure key agreement still removes significant risks”.

Use cases

Let’s look at what Mosca is talking about [ ] and walk through a fine-grained version of the different use cases with concrete algorithms. For the moment, following the thrust of the NCSC whitepaper, we’ll put aside concerns about practical hardware vulnerabilities.

1. Current Internet security – typical case

RSA 2048 + ECDH 256 + AES 128

Today a typical protocol suite might combine authentication via RSA (2048 bit key), key exchange via ECDH (256 bit key) and message encryption via AES (128 bit key).

Future quantum computers threaten the security of each underlying protocol in this suite. This future threat already applies today to data intercepted and stored for future decrypt when a sufficiently large quantum computer becomes available.

Dual lock: Though fated to be broken, our implementations of these algorithms are tried and tested. A prudent approach to introducing new protocols is to continue running current ones in parallel during an extended transition.

2. Future all-maths based ‘streamlined’ PQC for the Internet

PQC DS + PQC KEM (c.1100 bytes) + AES 256

To minimise latency and promote backward compatibility we select from the most streamlined PQC digital signature and key exchange protocols. These minimise the increase in key size and processing overheads. AES is retained for final message encryption, but with key size increased to 256 bits.

The long term security of this case rests on the enduring security of sufficiently streamlined PQC KEM protocols.

Google recently ran successful trials of two PQC algorithms as a proof in principle that this approach can work (more on this later). 

3. Future all-maths based high security PQC for critical applications

Strong PQC DS + Strong PQC KEM (c.10,000 bytes) + AES 256

For higher security applications we will prefer to use the PQC protocols with the strongest possible security promise. Significantly larger key sizes and processing overheads are likely to be required.

The long term security of this case rests on the enduring security of the strongest PQC KEM protocols.

Stream ciphers – the length of message that can be encrypted can be extended by using techniques such as AES-GCM. However if we start with a key from a PQC KEM then long term security still rests on the enduring security of the initial key exchange.
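As an illustration of the point (a sketch using the Python `cryptography` package, not a recommendation of a particular construction), AES in CTR mode expands one 256-bit key into an arbitrarily long keystream; everything encrypted under it still stands or falls with that initial key:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)  # one 256-bit key, e.g. from a KEM
encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
# A message far longer than the key: the keystream is expanded on demand.
ciphertext = encryptor.update(b"long running data feed " * 10_000)
```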

4. Fully in-band enhanced security using hybrid PQC/QKD

Strong PQC DS + QKD + AES 256

In this scheme, QKD replaces maths-based key exchange. Initial authentication is still provided by a strong PQC digital signature protocol, retaining the ease of deployment that brings.

We now only have to rely on the security of the PQC DS protocol in real-time during initial authentication. Long term security rests only on the enduring security of AES.

Long term security: It’s important to understand the special physical nature of the QKD security promise. An adversary cannot copy and store the quantum data exchanged during key formation without being detected. If the QKD protocol isn’t broken in real-time, it can’t be retrospectively attacked based on stored data. Note that this is a different and stronger notion than the forward secrecy defined for some maths-based protocols [ ].

Stream ciphers – can’t match this use case, as they would need out-of-band delivery of an initial key.

5. Fully in-band ‘conditional’ perfect security

Strong PQC DS + QKD + OTP

If instead of AES encryption, we use QKD to feed OTP encryption, we can achieve security that removes all long-term dependence on computational hardness assumptions. We only rely on the security of the PQC digital signature protocol in real-time during initial authentication.

The low secret key rates offered by present day QKD systems make this a painfully limited option for now. However it is already a distinctive one, with no dependence on a physical courier to deliver key media and no requirement for subsequent secure key storage. QKD offers the unique promise of a fully ‘in-band’ solution always able to provide fresh key material. This could appeal to low volume users who are keen to avoid reliance on physical couriers (or other physical alternatives for key delivery).

6. Information-theoretic security

ITS DS + QKD + OTP

Conventional maths-based digital signature schemes based on private-key cryptography exist that offer (almost) perfect information-theoretic security [ ]. The drawback of such schemes is that each party must start with a pre-shared secret key.

Having to work with pre-shared keys is a significant constraint. In practice it might represent a physical seed key being built into a device (e.g. a satellite) at manufacture. This solution must compete head-on with what could be achieved using a physical courier or other ‘out-of-band’ solution.

This is ultimately the configuration upon which QKD’s headline marketing claim to offer perfect information-theoretic security rests. It currently faces stiff competition from out-of-band solutions in situations where the security of physical logistics and storage isn’t a concern. Physical couriers can deliver a large volume of key material in a single visit.

Stream ciphers – can provide competition here by being used to expand the pre-shared initial key. However they don’t provide information-theoretic security as they depend on the enduring security of the maths-based pseudo-random extension of the key.

In cases 4, 5 and 6, QKD does seem to offer distinct security improvements by making fine-grained distinctions over the supporting assumptions on which we must rely. How much do these really matter? The NCSC whitepaper invites the casual reader to see them as broadly equivalent. To assess this, again we have to dive more deeply.

Computational security

The tremendous success of modern maths-based cryptography has been driven by the innovation of basing security on computational hardness assumptions. We thus speak about it being computationally infeasible to break the security of the system, rather than it being impossible.

P vs NP

One of the most famous open problems in mathematics is to establish the relationship between two theoretical classes of maths problem [ ]. A resolution to the status of this relationship would have fascinating philosophical implications. It could also have a very practical impact on maths-based cryptography.

P – Problems that can be solved ‘easily’ by a conventional computer (in polynomial time). Colloquially: no good for crypto, as an attacker can solve the underlying problem efficiently.

NP – Problems where, given a known solution, a computer can easily check it (in polynomial time). Colloquially: useful for crypto, if you give me the key I can decode the message efficiently.

NP-complete – the hardest problems in NP. Colloquially: the ideal gold standard for crypto, but difficult to achieve with practical algorithms.

P vs NP – it is a widely believed conjecture that P is not simply the same as NP, which mathematicians write as P ≠ NP. Thus the hardest NP problems can’t easily be cracked by conventional computers. However there is no known mathematical proof of this. Resolving it is one of the famous open Millennium Problems in mathematics.

BQP – Expands P to include problems that can be tackled efficiently by a quantum computer. BQP is conjectured not to include the NP-complete problems, but again this is not proven.

Incompleteness – In the 1930s, while the founding fathers of quantum theory were overthrowing conventional physics, Gödel, Church and Turing were putting in place equally revolutionary ideas in mathematics. Their work showed that certain mathematical statements are undecidable. P ≠ NP is an assertion about how their work applies to real computers running for reasonable amounts of time.
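A toy Python sketch makes the check-easy/find-hard asymmetry at the heart of all this concrete (factoring is believed hard but is not NP-complete; it simply illustrates the pattern):

```python
import math

N = 2027 * 2029  # a small semiprime; imagine a 2048-bit RSA modulus instead

def check(p: int, q: int) -> bool:
    return p * q == N  # the 'NP' direction: verification is one multiplication

def find_factor(n: int) -> int:
    for p in range(2, math.isqrt(n) + 1):  # the search direction: trial division
        if n % p == 0:
            return p
    raise ValueError("no factor found: n is prime")

print(check(2027, 2029))  # True, in microseconds
print(find_factor(N))     # feasible here; hopeless at cryptographic sizes
```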

With P ≠ NP as the starting point, we can start to build crypto systems based on the hardest NP problems we can tame to our needs. However it turns out that P ≠ NP alone is not enough; we also need additional assumptions [ ].

Average vs worst case hardness – it’s no good if our crypto system is hard to crack in just some cases, it has to be hard to crack in almost any case. Many straightforward NP-complete problems fail at this step [ ].

One-way functions – we need an efficient way to uniquely scramble any message. This is the basis for private-key cryptography [ ] and also turns out to be sufficient for public-key digital signatures, and hence in principle for strong PQC DS protocols [ ].

Trapdoor one-way functions – it’s also useful to have one-way functions that are easy to unscramble if we know an additional secret. (More strictly we need a trapdoor predicate). This is the basis for public-key encryption, and hence for PQC KEM protocols [ ].

Additional structure – to make real world PQC algorithms more streamlined (for example allowing smaller key sizes and more efficient processing overheads) we also typically introduce additional mathematical structure to the problem [ ].
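Returning to the one-way function point above, here is a minimal sketch of a Lamport one-time signature in Python, using SHA-256 as the one-way function, to show how far that single ingredient can take us. It is illustrative only: each key pair must sign exactly one message, and practical hash-based schemes add considerable machinery on top.

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()  # the one-way function

def keygen():
    # Two secret random values per message bit; the public key is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(sk[i][b]) for b in range(2)] for i in range(256)]
    return sk, pk

def bits(msg: bytes):
    digest = H(msg)  # sign the 256-bit hash of the message
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret preimage per message bit; hence strictly one-time use.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")  # this key pair must never be reused
assert verify(pk, b"hello", sig)
assert not verify(pk, b"hellp", sig)
```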

The vast majority of computer scientists believe that P ≠ NP, and can perhaps cite support from observed (but unexplained) trends in practical progress on algorithms [ ]. In mathematical terms this remains a conjecture, though leading computer scientist and quantum thought leader Scott Aaronson has a telling quip “If we were physicists, we would have declared P ≠ NP to be a ‘law of nature’ and given ourselves Nobel Prizes … and later if it turned out that in fact P = NP we could give ourselves more Nobel Prizes for the law’s overthrow!”.

But P ≠ NP apart, we also have no proof that hard-to-crack functions exist satisfying the increasingly precise follow-on assumptions listed above. Cryptographers have worked in practice with many candidate problems including integer factorisation, discrete logs, error correcting codes and, more recently, the shortest vector problem in lattices, among others. Some of the fledgling PQC protocols under consideration by NIST, such as Classic McEliece, are well known and have survived attempts to attack them for 30 years. Others, such as the several lattice-based crypto variants, show much wider cryptographic promise but are much newer and less tested.

Mathematical innovation

The advent of quantum computing has focussed attention on the vulnerability of certain traditional computational hardness assumptions. Research into new quantum algorithms continues apace. However it’s important to realise that most experts do not expect the power of such algorithms to be unlimited.

Aaronson-Ambainis – It’s widely believed that the strongest (super-polynomial) quantum speedups only occur for ‘structured problems’ [ ]. In this case ‘structure’ means some underlying regularity (periodicity) within the data. Despite a false alert in 2019, this conjecture remains unproven.

Quantum innovation is not the only consideration. Conventional mathematical innovation is equally relevant. The history of the hidden subgroup problem [ ] is an interesting example of both types of innovation proceeding in parallel.

The Hidden Subgroup Problem
[Definition] Let G be a group, X a finite set, and f : G → X a function such that there exists a subgroup H ≤ G for which f is constant on each coset of H and takes distinct values on distinct cosets. Using information gained from evaluations of f, determine a generating set for H.

[loosely] A group of objects contains sub-groupings that share a common ‘hidden’ periodicity. The problem is to identify their underlying connection.

[very loosely] The numbers 4, 8, 12, 16, 20 … are a ‘hidden’ subgroup within the list of all integers. If I know my four times table I have discovered the secret that connects them.
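A toy classical illustration in Python of the same idea: the function below is constant exactly on the cosets of the hidden subgroup H = 4Z, so collisions in its outputs betray the period. (Shor’s breakthrough was a quantum method that extracts this structure exponentially faster than classical collision hunting at cryptographic sizes.)

```python
from math import gcd

SECRET_PERIOD = 4  # the 'hidden' structure we pretend not to know

def f(x: int) -> str:
    # Constant on each coset of H = 4Z: f(a) == f(b) exactly when b - a is in H.
    return ["red", "green", "blue", "yellow"][x % SECRET_PERIOD]

# Every collision f(a) == f(b) tells us that b - a lies in H;
# the gcd of enough such differences recovers a generator of H.
generator = 0
for a in range(20):
    for b in range(a + 1, 20):
        if f(a) == f(b):
            generator = gcd(generator, b - a)

print(generator)  # 4 - we have 'learned our four times table'
```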

What’s particularly useful is that the same problem exists by analogy for different types of objects (formalised as mathematical groups):

Abelian Group – Integer factorisation and the discrete logarithm problem are at the heart of conventional public-key cryptography (e.g. RSA and ECDH). Each can be reduced to solving the HSP for finite Abelian groups. Shor’s algorithm shows us how to solve this case efficiently using a quantum computer.

Symmetric Group – The long-standing computer science classic problem of graph isomorphism reduces to solving the HSP for the symmetric group. Given the success of Shor’s algorithm in the Abelian case, significant effort has gone into trying those techniques here. No efficient quantum algorithm has been made to work. However graph isomorphism has more recently been ‘solved classically’ by László Babai, who has shown a semi-efficient (quasi-polynomial time) solution [ ].

Dihedral Group – Lattice-based cryptography is a leading contender for PQC protocols. In general it relies on the hardness of the shortest vector problem. This reduces to solving the HSP for the dihedral group. No efficient algorithm, quantum or conventional, is known, but it’s just as young a field as quantum computing itself [ ].

Fact Based Insight suggests the following conclusions to informed commercial observers:

  • A complete breakdown of the P ≠ NP conjecture belongs in the extreme-impact/very-low-probability corner of the risk management grid.
  • A general breakdown of the trapdoor assumption is possible without losing one-way functions overall. Public-key encryption is therefore more exposed to this systemic risk than are public-key signatures. Most would say such a systemic failure was possible but not likely.
  • Any one family of PQC protocols could simultaneously fail (without warning) due to a newly found attack on its underlying problem class. Many would think that over an extended period of time it is indeed quite possible that one or more families will be broken in this way. It is much less likely that all families would fail at the same time.
  • The additional structure included in practice in all PQC protocols will cause some individual protocols to fail. It’s quite probable that a few more will fail before the completion of NIST round 3. It’s quite possible that more will fail in the medium term. The most streamlined PQC protocols typically have the most additional structure and so are more at risk.

Assumptions by use case

It’s pertinent to look again at the crypto use cases we discussed earlier in light of what we have learned (such analysis is not new [ ]).

1. Current Internet security – typical case

RSA 2048 + ECDH 256 + AES 128

Broken: data intercepted and stored today is already vulnerable.

2. Future all-maths based ‘streamlined’ PQC for the Internet

PQC DS + PQC KEM (c.1100 bytes) + AES 256

Depends on enduring hardness of trapdoors and additional structure.

3. Future all-maths based high security PQC for critical applications

Strong PQC DS + Strong PQC KEM (c.10,000 bytes) + AES 256

Depends on enduring hardness of trapdoors and a minimum of additional structure.

4. Fully in-band enhanced security using hybrid PQC/QKD

Strong PQC DS + QKD + AES 256

Depends on the enduring security of one-way functions.

5. Fully in-band ‘conditional’ perfect security

Strong PQC DS + QKD + OTP

Depends on the real-time security of one-way functions.

6. Information-theoretic security

ITS DS + QKD + OTP

‘Almost’ information-theoretically secure.

How quickly the near-term capabilities of practical quantum computers will develop has been the subject of much speculation, but little certainty. When might a quantum computer large enough to compromise current security be built? Where businesses need a single ‘reasonable worst case’ date, Fact Based Insight continues to suggest 2027 (though it’s important to emphasise that the ‘most likely’ date is perhaps 2035 or beyond).

Fact Based Insight believes that PQC will meet the majority of users’ medium term needs via cases 2 and 3. Some users will find it prudent to assume that in the long term (perhaps a rolling 10 year horizon) any single scheme of public-key cryptography will be broken. QKD seems to offer a significantly enhanced and complementary security promise via case 4 where this is desired.

In the very long term (50-100 yrs+), Fact Based Insight believes it’s prudent to assume any scheme of computational security will eventually be out-gunned by some new innovation (quantum or otherwise). Most users will simply migrate again to new arrangements in good time. However if the confidentiality of your data simply must endure, or you simply seek the maximum security promise available now, then specialist applications exist for cases 5 and 6, though ‘out-of-band’ solutions can provide stiff competition.

So far we have set aside reservations about the practical implementation of QKD. But how realistic is that?

QKD and its demons

The original 2016 NCSC whitepaper on Quantum Key Distribution [ ] highlighted the real-world vulnerabilities in the physical devices then on offer, the practical limitations on their deployment and the high costs of the specialist hardware required. Fact Based Insight believes that this was largely fair at the time. But how have things moved on in four years?

Certification

All real-world crypto systems have to worry about practical hardware vulnerabilities.

Side-channel attacks: attacks on their physical hardware implementation rather than the security algorithm itself. Sensitive lab demonstrators and early commercial QKD prototypes proved easily vulnerable to a variety of malicious techniques: beam splitting attacks, detector control attacks, laser damage attacks, Trojan horse attacks and others [ ].

The typical approach for managing against threats of this type is that an international body lays down overall standards, governments/national agencies prescribe which standards must be met for sensitive applications, and independent accredited labs certify hardware against the relevant standards.

Over the last three years the nascent QKD industry has responded to this challenge. Hardware has improved. Standards-setting activity is underway through ETSI, ISO and the ITU-T [ ].

The recent BQIT:20 conference [ ] was an opportunity to review some of the latest results in testing present day QKD equipment.  Vadim Makarov (Quantum Hacking group leader at the Russian Quantum Center) presented his experience from applying the accreditation framework he has developed to four contemporary systems, including examples of those from leading vendors IDQ and QuantumCTek. While it is clear that no system yet gets a completely clean bill of health from Makarov, the process has led to striking progress in closing historic vulnerabilities [ ].

The standard approach proposed by most practical proponents of QKD is that it should be used in parallel with a PQC public-key algorithm. Such ‘hybrid keys’ combine the benefits of ‘computational hardness’ and ‘laws of physics’ security promises. In practical systems these aim to form independent and therefore complementary layers of security. If this is combined with one of our battle-tested present-day protocols, we achieve a ‘triple lock’ on security.
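A minimal sketch of such a key combiner (illustrative Python; the qkd_key and pqc_key inputs are placeholders for the outputs of real QKD hardware and a real KEM): XOR-ing the two keys means an attacker must recover both layers to learn the session key, and the information-theoretic promise of the QKD layer is preserved provided the two inputs are independent.

```python
import hashlib

def combine_xor(qkd_key: bytes, pqc_key: bytes) -> bytes:
    # If either input key remains secret, and the two are independent,
    # the combined key remains secret.
    assert len(qkd_key) == len(pqc_key)
    return bytes(a ^ b for a, b in zip(qkd_key, pqc_key))

def combine_kdf(qkd_key: bytes, pqc_key: bytes) -> bytes:
    # The other common pattern: hash the concatenation through a KDF.
    return hashlib.sha256(qkd_key + pqc_key).digest()
```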

Practical limitations

QKD technologies suffer from significant practical deployment constraints. Early prototype systems required dedicated point-to-point optical fibre connections. They offered very low secret key rates and very limited connection distances. Again, progress has been marked on all of these dimensions.

Commercial offerings are maturing. QuantumCTek’s QKD-POL40 (15 kbps @ 10 dB, 1 kbps @ 17 dB) is used in metropolitan-area QKD networks. The 2000 km Beijing-Jinan-Hefei-Shanghai QKD backbone is thought to use the QKD-POL1250-S for long distance links of up to 89 km between trusted nodes (experts speculate that a throughput of 38 kbps on the longest hop is achieved using two machines in parallel, each at 19 kbps, assuming optical fibre transmission losses of perhaps 0.2 dB/km).

Early mover, Swiss-based IDQ, offers the Cerberis3 (1.4 kbps @ 12 dB). SK Telecom has taken a strategic stake in IDQ and has deployed its QKD kit on the major Seoul-Daejeon section of its backbone network.

In the US, Quantum Xchange is offering the Phio QK service based on IDQ QKD kit and its own point-to-multipoint trusted node technology (it also offers Phio TX, an out-of-band key delivery solution). MagiQ offers its own QPN hardware. Qubitekk recently purchased the QKD patent portfolio of QinetiQ.

QuintessenceLabs are about to launch qOptica, a second generation product based on CV QKD technology (developed partly with funding from the Australian Department of Defence).

The UK Quantum Network has been used since 2018 to demonstrate the sustained operation of ‘metropolitan’ QKD networks in Cambridge and Bristol. In 2019 this was extended with long-distance links, including to the BT-led tech cluster at Adastral Park. QKD was also demonstrated for edge network applications on the UK 5G test-bed in Bristol.

In Spain, Telefónica demonstrated CV QKD in 2018 in partnership with UPM and Huawei. With Deutsche Telekom, Orange and BT, it is now taking part in the EU’s OPENQKD continent-wide test-bed initiative. This project also brings in leading conventional network equipment manufacturers such as ADVA, Mellanox, Rohde & Schwarz and Nokia.

BQIT:20 included updates on leading developments from across quantum communications:

Mariella Minder (Toshiba Europe) presented their latest results with TF QKD. This second generation QKD technology promises to significantly enhance the practical range/secure key rates for QKD [ ]. (The promise of this technique is reflected in the energy leading Chinese QKD groups are separately putting into its development).

Siddarth Joshi (Univ. of Bristol) presented the realisation of an eight user quantum network offering full entanglement between any pair of users [ ]. This is a capability far beyond the basic requirements of first generation QKD. (Only a few years ago this would have been considered an experiment to test quantum fundamentals. It was remarkable to hear it presented and solved as a network engineering challenge).

Daniel Oi (Univ. of Strathclyde) reviewed the significant activity underway to commercialise satellite QKD systems. China remains the only country to have demonstrated a full suite of QKD technologies from space. However CubeSat/micro-satellites look like the most likely path to make such technology viable at scale. Over seven planned development missions in this format are in the public domain. Oi described how the forthcoming QKD CubeSat and a following UK Quantum Comms Hub mission will test a complete satellite/compact ground station combination [ ]. This work builds on collaboration with Singapore’s CQT, including the recent SpooQy-1 component testing mission.

Several presenters discussed progress on quantum memory technologies. In addition to applications in quantum computing, these promise to one day allow genuine quantum repeaters to remove QKD’s dependence on trusted nodes. A highlight was Ralf Riedinger (Harvard) presenting an experimental realisation of memory-enhanced QKD based on a silicon-vacancy centre in a diamond nanophotonic device [ ]. The diamond memory allows the secure relay of the quantum state, enabling second generation MDI-QKD to be performed with a factor 78 improvement in the raw key rate compared to a direct connection.

(BQIT:20 was another great example of a quantum technology conference reinventing itself online during the Coronavirus crisis to spectacular effect.)

Cost

Early QKD equipment has historically been very costly, with simple two-node installations commanding six figure sums. Commercial players now have significant scope to move costs down as the market broadens.

The AQuaSeC consortium, including Toshiba and KETS, is targeting the realisation of a single chip solution able to simultaneously support existing ECDH, PQC and QKD solutions [ ]. The EU’s UNIQORN consortium has similar plans [ ]. Such solutions promise radically lower manufacturing costs. Fact Based Insight expects a factor of 10 reduction in end-user prices in the next 3-5 years.

Fact Based Insight believes that potential early QKD adopters certainly need to be careful in their procurement process. In the absence of actual certification, at least the path and timescale for certification needs to be considered. The deployment constraints and likely future upgrade path available need to be understood.

Ultimately the business case must be considered alongside alternatives. If additional money is to be spent on QKD, might those funds be more effectively spent on conventional security improvements? Only a survey of the user’s own situation can answer that question.

PQC and its demons

From its inception PQC has also wrestled with its own challenges. Here too, progress is on track.

Resource requirements

One of the reasons we use the cryptographic protocols we do today is because they offer relatively modest key sizes, cipher-text sizes, digital signature sizes and processing overheads. Equally importantly, the chaotic hardware environment that is the modern Internet has grown up with a Darwinian respect for these requirements.

Many current sites might use ECDH with a 256 bit key. PQC KEM requires key sizes typically in a range from 1000 to 10,000 bytes (80,000 bits) [ ]. Some have more modest key sizes, but incur much higher processing overheads. The Classic McEliece protocol is exceptionally well tested, but requires key sizes of 262 kilobytes to be post-quantum secure!

PQC Trials – Google recently ran trials of two PQC KEMs within the Internet TLS security layer. NTRU-HRSS (1100 byte key) and SIKE (330 byte key) were chosen from two different protocol families offering contrasting size/performance trade-offs. In both cases the new algorithm was combined with an existing ECDH key to mirror the dual-lock requirement likely in realistic transition strategies [ ].

The encouraging results obtained are a proof in principle that these approaches can work. NTRU-HRSS performed better in the majority of environments tested, though in specific cases SIKE won out. Previous rounds of testing have struggled to get the larger key sizes (c. 10,000 bytes) associated with unstructured lattices to process acceptably. This is a confirmation of the intuition that different PQC algorithms may find different niches.
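The dual-lock pattern itself is easy to sketch. Below, a classical X25519 exchange (standing in for ECDH, via the Python `cryptography` package) is combined with a KEM-derived secret through HKDF. The NTRU-HRSS and SIKE implementations used in the trials are not part of this library, so `pqc_shared_secret` here is just random bytes standing in for the KEM output:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

client, server = X25519PrivateKey.generate(), X25519PrivateKey.generate()
ecdh_shared = client.exchange(server.public_key())  # classical ECDH share
pqc_shared_secret = os.urandom(32)  # placeholder for the KEM-decapsulated secret

# Concatenate both shares and derive the session key: breaking it requires
# breaking both the classical and the post-quantum layer.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid tls handshake",
).derive(ecdh_shared + pqc_shared_secret)
```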

One of the reasons many see ‘structured’ lattice-based protocols as leading contenders in the NIST process is that they offer a relatively attractive trade-off between the competing requirements of size and speed. The transition process for the roll-out of such technology on the Internet or on a larger corporate network would still no doubt be long and challenging in practice.

TLS 1.x – The Internet has already been through waves of security upgrade, from SSL to TLS and multiple versions of the latter. TLS 1.2 was standardised in 2008 but not widely deployed by browsers until 2013-14. The process of upgrade from TLS 1.2 to TLS 1.3 has been painful and is ongoing. Adam Langley (Google) and Nick Sullivan (Cloudflare) have argued for the lessons from this process to be built-in to how we standardise and roll-out PQC protocols [ ].

Patents

The NIST PQC process has deliberately not imposed constraints due to patent or IP rights considerations upfront. IP claims made by groups participating in the process are monitored, but have not so far been a prime consideration for shortlisting.

NIST vs Patents – Eight of the continuing candidates are known to have patents associated with some part of their implementation. In addition, one patent owned by the French government (EP 2537284, priority date 18 Feb 2010) appears to potentially affect a range of code-based and lattice-based schemes with smaller key sizes [ ].

General business observers are often familiar with patents as a business tool and can be surprised by the nature of the debate within the hard-core Internet community. It is important to realise the role that open-access has played in the growth and success of the Internet. Many members of the community view Internet freedom as a new basic right to be protected.

Fact Based Insight expects IP considerations to become increasingly important in the third round of the NIST process. Big money may be at stake, but we don’t expect a final result that materially impacts the essential open-access character of the Internet.

Backdoors

A backdoor is a concealed mechanism within a protocol that allows a third party, such as a government, spy agency or hacker, to bypass its normal security. Some worry that new cryptographic protocols could contain secret backdoors.

Dual_EC_DRBG  – In 2006 NIST promoted this standard for pseudo-random number generation. Following the memos leaked by Edward Snowden many see strong circumstantial evidence that the NSA deliberately introduced a backdoor into this algorithm [ ].

Having interacted with some of the figures leading the NIST PQC initiative Fact Based Insight finds it difficult to believe that they are secretly plotting to subvert the world’s cryptography. More importantly they have designed a transparent international process that would make it maximally difficult to achieve any such subterfuge. It’s not clear what more we could ask of them.

Fact Based Insight believes that the danger of unauthorised backdoors remains a key reason why business adopters should obtain their cryptographic implementations from reputable vendors using algorithmic libraries certified against reputable standards.

PQC early movers

A variety of companies have undertaken real world trials or introduced pilot products based on algorithms still under consideration in the NIST process.

Tech majors are actively involved with particular submissions to NIST such as Microsoft (FrodoKEM, SIKE, Picnic, qTESLA) and IBM (CRYSTALS-Kyber, CRYSTALS-Dilithium). Specialist startups are also active such as Post-Quantum (NTS-KEM), evolutionQ (SIKE) and others.

Amazon has announced support for BIKE and SIKE for connections to AWS. IBM Key Protect supports CRYSTALS-Dilithium, Falcon, SPHINCS+ and Picnic. Google have tested NTRU-HRSS and SIKE for Internet TLS. Due to the experimental nature of these algorithms, in all cases the implementations provide a dual-lock by combining the PQC algorithm with a proven existing algorithm.

In an area forgotten by many casual users, IBM has already developed updated quantum-safe firmware for their enterprise class tape drive (based on their CRYSTALS suite). At the other end of the spectrum, Utimaco have launched a quantum-safe hardware security module aimed at the IoT market. This should be a reminder to the board room of just how diverse the upgrade challenge is for large enterprises.

Notable PQC solution specialists include PQSecure, PQShield, CryptoNext and ISARA. All emphasise the growing importance of crypto-agility. ISARA have built a portfolio of relevant patents and have been actively partnering with digital certificate and IoT companies to embed PQC-ready solutions.

Quantum safe security specialist evolutionQ is particularly notable as one of the few able to offer expertise in both PQC and QKD technologies.

Random numbers

All of the encryption approaches discussed so far implicitly consume random numbers in prodigious quantities. Every time we want a new encryption key we need a new random number. Before we encrypt real messages we pad them to a standard size with random nonce values. When we store passwords we protect their secrecy with random salt values. One-time pads are just big lists of random numbers.

Cryptographically secure pseudo-random number generators (CSPRNG) are one part of the conventional cryptographic arsenal. Cryptographers like them because their properties can be mathematically described (their security is ultimately based on that of an underlying one-way function). However even here, for secure operation they must still be initialised with a random seed value (and must not contain a hidden backdoor).
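In everyday software this arsenal is usually reached through the operating system. Python’s `secrets` module, for example, draws from the OS CSPRNG, which is itself seeded from hardware entropy sources; a QRNG is a candidate to strengthen exactly that seeding stage:

```python
import os
import secrets

session_key = secrets.token_bytes(32)  # 256-bit key from the OS CSPRNG
nonce = os.urandom(12)                 # per-message nonce
salt = secrets.token_bytes(16)         # salt for password hashing
token = secrets.token_urlsafe(32)      # URL-safe secret, e.g. a reset token
```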

High security applications can use hardware-based true random number generators (TRNG). Commercial devices typically rely on sampling noise or a free-running oscillator circuit. A range of standards have been established that have eased the acceptance of such devices within the cryptographic community. Some designs claim random number production rates of up to about 150 Mbps; simpler designs only a few hundred kbps.

More recently, many quantum technology startups have been extending this concept to offer devices based directly on quantum effects, creating a type of TRNG now referred to as a QRNG.

The NCSC whitepaper on quantum security technologies [ ] also discusses QRNG devices.

Quantum randomness

Under the laws of classical physics events might appear random but they are always ultimately deterministic. Quantum mechanics has randomness built into its foundations.

Einstein vs Bohr – It’s pertinent to recall that many physicists never liked this new-fangled quantum randomness, and spent 50 years trying to get away from it; “God doesn’t play dice” as Einstein famously remarked. Remarkably, physicist John Bell devised a statistical test that has allowed us to experimentally settle the question. Local realism lost [ ]. Quantum randomness is here to stay.

Bell test – A statistical inequality initially conceived as a test of quantum theory. Similar statistical techniques can now be used to verify that genuine quantum randomness is occurring in a practical device.

QRNG devices claim ‘quantumness’ as a marketing advantage over other TRNG devices. They can also point to how this is driving practical benefits.

By directly harnessing quantum processes, streamlined designs can generate random numbers at high rates. Early players have established products such as QuintessenceLabs’ qStream (1 Gbps), IDQ’s Quantis cards (240 Mbps) and chips (5 Mbps) and QuantumCTek’s QRNG-PHF100 (100 Mbps-10 Gbps).

A second generation of devices, such as CQC’s Ironbridge, emphasise the design features of their architecture that will allow the quantum nature of their output to be directly certified. CQC have recently demonstrated the integration of Ironbridge with IBM’s Key Protect cloud service for PQC key generation and management. Qrypt have built a crypto-friendly entropy-as-a-service offering on top of a variety of QRNG technologies, including those from Quside and ORNL.

KETS are launching a range of devices that aim to compete at the top-end on speed, and at different price points, while offering superior convenience of integration for PQC applications. Bra-Ket are also thought to be planning to launch a device.

Fact Based Insight believes the NCSC is right to point to the need for better understanding of QRNG devices in real-world usage and under attack from an adversary. Again standardisation and certification will be vital. Hardware vendors are already vying to differentiate their products with a blizzard of standards.

  • NIST SP800-22 – Statistical test suite for random and pseudorandom number generators for cryptographic applications
  • NIST SP800-90A/B/C (draft) – Standards for random bit generators
  • NIST FIPS 140-2/3 – Security requirements for cryptographic modules
  • METAS DIEHARD – Battery of tests for random number generators
  • BSI AIS-31 – Functionality classes and evaluation methodology for physical random number generators
  • AEC-Q100 – Failure mechanism based stress test qualification for integrated circuits
  • iTech Labs – RNG testing and certification for online gaming systems
  • GM/T 0005 – Randomness test specification
  • OASIS KMIP 1.0 – 2.0 – Key Management Interoperability Protocol Specification
  • CC EAL 1-7 – Information technology security evaluation
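As a flavour of what suites like NIST SP800-22 actually check, here is a minimal Python sketch of its first test, the frequency (monobit) test: a truly random stream should contain roughly equal numbers of zeros and ones, and a very small p-value flags a suspect source. (Real certification applies dozens of such tests, and statistical testing alone can never prove the quantum origin of the bits.)

```python
import math
import secrets

def monobit_p_value(bits) -> float:
    # NIST SP800-22 frequency test: map bits to +/-1, sum, and normalise.
    n = len(bits)
    s = abs(sum(1 if b else -1 for b in bits))
    return math.erfc(s / math.sqrt(2 * n))  # p < 0.01 => reject as non-random

stream = [secrets.randbits(1) for _ in range(100_000)]
print(monobit_p_value(stream))  # typically well above 0.01 for a healthy source
```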

New quantum-specific standards will be required to correctly capture the potential of this new technology. Fact Based Insight sees two in particular as important to watch:

  • IEEE PAR 7130 (project) – Standard for Quantum Technology Definitions
  • ITU-T X.1702 – Quantum noise random number generator architecture

Certified public randomness

A different form of randomness may also be on offer in the near future. Google have pointed to the provision of provably random numbers (a.k.a. certified random numbers) as a potential early application for near-term quantum devices.

Google Sycamore would be a ludicrously expensive QRNG, but that isn’t the point. Google have acquired (non-exclusive) rights to a protocol being developed by Scott Aaronson [ ]. This generates random numbers via an open challenge and response process, the results of which can be publicly verified. The claim is that this can be done remotely and openly without the possibility of cheating. At the moment, if I want to use a public randomness beacon I have to trust its operator, which significantly limits potential applications. If proven, this new method would provide a much stronger new cryptographic resource.

A much bigger landscape

Since the NCSC issued its initial advice on quantum safe cryptography in 2016, much has moved on. Google has demonstrated quantum supremacy and numerous groups around the world are racing to build large scale quantum computers. While this threat marches closer to reality, the response of the PQC community remains very much on track.

However, perhaps an even more significant change has occurred in the global visibility of quantum communications technology and its potential wider future role.

China grabbed the attention of the world in 2017 by demonstrating space-based QKD from its Micius satellite, and by building its 2000 km Beijing-Jinan-Hefei-Shanghai QKD backbone network. It has become the undisputed leader in practical QKD deployment. But its leading figures, such as Jian-Wei Pan (USTC), have always emphasised a much wider vision for fully entangled quantum networking [ ].

In 2018 the EU launched its Quantum Flagship. It has subsequently launched the complementary OPENQKD and QCI initiatives to study and promote the operational deployment of QKD in Europe. The Quantum Flagship recently published its Strategic Research Agenda [ ]. It now places ‘building the Quantum Internet in Europe’ as its long term 20-25 year unifying goal.

In late 2018 the US passed the NQIA to promote a full range of quantum technologies. In early 2020 the White House National Quantum Coordination Office published ‘A Strategic Vision for America’s Quantum Networks’ [ ]. This identifies the creation of the Quantum Internet as an economic opportunity of central importance. It points to the wider potential of quantum networking to enable and bring together secure communication, sensing and enhanced quantum computation opportunities. It specifically sets goals for the development of quantum networks over five and twenty year horizons.

It is important to realise that this emphasis on networks is not just hype inspired by the story of the Internet. It’s based on insights into the wider nature of quantum technology, which is not digital. When we network conventional computers their power scales linearly; when we coherently network quantum computers their power can scale exponentially [ ]. When we network conventional sensors their fidelity scales only as √n; when we coherently network quantum sensors their fidelity can scale as n [ ]. When we outsource sensitive business calculations to the cloud we must trust our service provider; we can instead use quantum links to perform ‘blind’ computing with almost zero overheads [ ]. Such networks would come with QRNG and QKD built-in.

Launched in 2014, the UK NQTP can point to these international developments as reflecting much of its own original conception [ ] (a template that others are now clearly seeking to copy around the world). As one of the four pillars of the programme, the UK Quantum Communications Hub adopted creating a UK Quantum Network as its own initial goal. Work on first and second generation QKD systems has been seen as a stepping stone on a journey to build commercial strength in quantum communication technologies in the UK.

The genius of the UK programme has been to see this as not just an investment in applied technology. It’s been a focus on creating a vibrant ecosystem of universities, facilities, startups and industry investors learning the habit of working together. Developing short term opportunities and synergies across the quantum sector has been a key tool. The UK government, via BEIS and agencies such as EPSRC and Innovate UK, has worked to make this happen. The fruits of this labour have been clearly on display at the annual quantum showcase and at BQIT: the UK Quantum Network, world leading work by Toshiba Europe based in Cambridge, a research presence from industry leader IDQ in Bristol, active participation from communication industry leaders such as BT, a vibrant CubeSat sector, and notable startups such as KETS Quantum Security.

Fact Based Insight would readily agree that the NCSC should never promote a security technology that it believes to be unsound. We also share the view that PQC is an amazing technology in its own right and has a leading role to play into the future. However we do think the tone of the NCSC’s views on QKD is misplaced on this occasion. They threaten to leave UK taxpayers wondering why different wings of their government seem to be at odds. Quantum sector investors will continue to see unique strengths in UK research institutes, but they may pause to consider the best medium-term locations for their quantum networking investments.

Actions for Business

Businesses must plan ahead on how they will respond to this coming major cybersecurity transition. In some cases immediate action is required.

  • If you have classes of data that must remain confidential beyond 2027, you need to consider measures that protect against data being intercepted and stored today for later decrypt when a sufficiently large quantum computer becomes available.
  • If you have investment programmes in assets or products with long expected life-cycles, you need to ensure that they have post-quantum robust upgrade paths. Remember that authentication protocols are normally at the heart of secure upgrades, but it’s those protocols that need to change.
  • For most common applications, businesses should be happy to observe the NIST PQC process and be ready to adopt solutions from the suite of protocols it will approve. Remember though that rolling out changes across a large organisation can take years and appropriately skilled resources will be in increasing demand. Be ready to move promptly to avoid being caught out. Make sure your internal security team and network of vendors are building preparatory skills now.
  • For businesses with stronger security requirements, you should be working with your security vendors to review the portfolio of options and transition paths appropriate to you. Leverage modest amounts of money spent with innovators and startups. Seek opportunities to turn an early response into a differentiating factor for your brand.
  • For businesses with the strongest security requirements, becoming early adopters of the techniques offered by quantum cryptography should be considered an option as an additional layer of security. In particular QKD has a unique security promise for long-term (10 yrs+) and very long-term (50-100 yrs+) data security. When properly implemented, this remains true even when it is combined, for flexibility, with real-time maths-based authentication.
  • Any early adopter of new security technology should pay particular attention to three factors: What certification against relevant standards has the hardware/algorithms I am installing actually achieved? What upgrade paths will realistically be available? What is my fall-back level of security and action plan if a vulnerability is suddenly uncovered?
  • The business case for early adoption should be carefully considered. What could be achieved by spending the same money on conventional security strengthening measures?

Investors should recognise the growing attractiveness of the quantum communications sector within the wider quantum technologies landscape.

  • Building the Quantum Internet promises to be an epic long-term journey, requiring the development of several generations of new technology. Crucially there are set to be many opportunities for commercialisation along the way.
  • Specialist photonic components, QRNG and QKD are all useful early opportunities for investors wanting to see a healthy mix of near-term revenue opportunities within their portfolios. Watch out for second and third generation technologies disrupting current leaders.
  • Quantum memory technologies are now also beginning to emerge. These are central to building the true quantum repeaters needed to extend the range and power of QKD and of true quantum networking. They promise to be a key crossover technology between quantum communications and quantum computers.
  • Photonics is a central discipline in quantum communications. Look for R&D locations that are close to academic centres of photonics expertise. Look for national quantum programmes able to provide real support in prototype development.
  • Proactive involvement with standards authorities is set to be crucial. This is a time consuming activity for staff in large organisations and an almost impossible one for key staff in startups. Look to locations that provide consistent governmental support for these activities.

References

[1] M. K. Bhaskar et al., “Experimental demonstration of memory-enhanced quantum communication,” Nature, vol. 580, no. 7801, pp. 60–64, 2020, doi: 10.1038/s41586-020-2103-5. [Online]. Available: http://arxiv.org/abs/1909.01323. [Accessed: 04-May-2020]
[2] B. Hensen et al., “Experimental loophole-free violation of a Bell inequality using entangled electron spins separated by 1.3 km,” arXiv:1508.05949 [quant-ph], 2015, doi: 10.4121/uuid:6e19e9b2-4a2d-40b5-8dd3-a660bf3c0a31. [Online]. Available: http://arxiv.org/abs/1508.05949. [Accessed: 11-May-2020]
[3] M. Minder et al., “Experimental quantum key distribution beyond the repeaterless secret key capacity,” Nat. Photonics, vol. 13, no. 5, pp. 334–338, 2019, doi: 10.1038/s41566-019-0377-7. [Online]. Available: http://arxiv.org/abs/1910.01951. [Accessed: 11-May-2020]
[4] Quantum Flagship SAB, “New Strategic Research Agenda on Quantum technologies,” Shaping Europe’s digital future – European Commission, 03-Mar-2020. [Online]. Available: https://ec.europa.eu/digital-single-market/en/news/new-strategic-research-agenda-quantum-technologies. [Accessed: 11-May-2020]
[5] White House NQCO, “A Strategic Vision for America’s Quantum Networks,” Feb-2020. [Online]. Available: https://www.whitehouse.gov/wp-content/uploads/2017/12/A-Strategic-Vision-for-Americas-Quantum-Networks-Feb-2020.pdf
[6] EU Commission, “UNIQORN | Affordable quantum communication for everyone,” Shaping Europe’s digital future – European Commission, 29-Oct-2018. [Online]. Available: https://ec.europa.eu/digital-single-market/en/content/uniqorn-affordable-quantum-communication-everyone. [Accessed: 11-May-2020]
[7] UK NCSC, “Quantum key distribution.” [Online]. Available: https://www.ncsc.gov.uk/whitepaper/quantum-key-distribution. [Accessed: 11-May-2020]
[8] UK NCSC, “Quantum security technologies.” [Online]. Available: https://www.ncsc.gov.uk/whitepaper/quantum-security-technologies. [Accessed: 11-May-2020]
[9] ETSI, “ETSI/IQC Quantum Safe Cryptography Workshop 2019,” ETSI. [Online]. Available: https://www.etsi.org/events/1607-etsi-iqc-quantum-safe-cryptography-workshop-2019. [Accessed: 11-May-2020]
[10] D. J. Bernstein, “Cr.yp.to – Fighting Patents.” [Online]. Available: https://cr.yp.to/patents.html. [Accessed: 11-May-2020]
[11] S. Aaronson, “Lecture 16, Private-Key Cryptography,” p. 4.
[12] S. Aaronson, “P=?NP,” 2017. [Online]. Available: https://www.scottaaronson.com/papers/pnp.pdf. [Accessed: 11-May-2020]
[13] S. Aaronson, “Opinion | Why Google’s Quantum Supremacy Milestone Matters,” The New York Times, 30-Oct-2019. [Online]. Available: https://www.nytimes.com/2019/10/30/opinion/google-quantum-computer-sycamore.html. [Accessed: 11-May-2020]
[14] NIST ITL Computer Security Division, “NIST Briefs Committee of Visitors – Crypto Standards Development Process,” CSRC | NIST, 24-May-2016. [Online]. Available: https://csrc.nist.gov/Projects/Crypto-Standards-Development-Process/NIST-Briefs-Committee-of-Visitors. [Accessed: 11-May-2020]
[15] L. Babai, “Graph Isomorphism in Quasipolynomial Time,” arXiv:1512.03547 [cs, math], Jan. 2016. [Online]. Available: http://arxiv.org/abs/1512.03547. [Accessed: 11-May-2020]
[16] L. Mazzarella et al., “QUARC: Quantum Research Cubesat—A Constellation for Quantum Communication,” Cryptography, vol. 4, no. 1, p. 7, Mar. 2020, doi: 10.3390/cryptography4010007. [Online]. Available: https://www.mdpi.com/2410-387X/4/1/7. [Accessed: 11-May-2020]
[17] N. Sullivan, “Measuring TLS key exchange with post-quantum KEM,” CSRC | NIST, 22-Aug-2019. [Online]. Available: https://csrc.nist.gov/Presentations/2019/measuring-tls-key-exchange-with-post-quantum-kem. [Accessed: 11-May-2020]
[18] UKRI, “Agile Quantum Safe Communications (AQuaSec).” [Online]. Available: https://gtr.ukri.org/projects?ref=104615. [Accessed: 11-May-2020]
[19] University of Bristol, “BQIT:20 | School of Physics | University of Bristol.” [Online]. Available: http://www.bristol.ac.uk/physics/research/quantum/conferences/bqit-workshop/. [Accessed: 11-May-2020]
[20] ITU, “ITU Workshop on Quantum Information Technology (QIT) for Networks.” [Online]. Available: https://www.itu.int/en/ITU-T/Workshops-and-Seminars/2019060507/Pages/default.aspx. [Accessed: 11-May-2020]
[21] D. Moody, “The 2nd Round of the NIST PQC Standardization Process,” CSRC | NIST, 22-Aug-2019. [Online]. Available: https://csrc.nist.gov/Presentations/2019/the-2nd-round-of-the-nist-pqc-standardization-proc. [Accessed: 11-May-2020]
[22] A. Broadbent, J. Fitzsimons, and E. Kashefi, “Universal blind quantum computation,” 2009 50th Annual IEEE Symposium on Foundations of Computer Science, pp. 517–526, 2009, doi: 10.1109/FOCS.2009.36. [Online]. Available: http://arxiv.org/abs/0807.4154. [Accessed: 11-May-2020]
[23] ISO, “ISO/IEC WD 23837-1,” ISO. [Online]. Available: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/70/77097.html. [Accessed: 11-May-2020]
[24] L. Babai, “Graph Isomorphism update, January 9, 2017.” [Online]. Available: http://people.cs.uchicago.edu/~laci/update.html. [Accessed: 11-May-2020]
[25] M. N. Wegman and J. L. Carter, “New hash functions and their use in authentication and set equality,” Journal of Computer and System Sciences, vol. 22, no. 3, pp. 265–279, Jun. 1981, doi: 10.1016/0022-0000(81)90033-7. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0022000081900337. [Accessed: 11-May-2020]
[26] NIST ITL Computer Security Division, “Post-Quantum Cryptography,” CSRC | NIST, 03-Jan-2017. [Online]. Available: https://csrc.nist.gov/projects/post-quantum-cryptography. [Accessed: 11-May-2020]
[27] Q. Zhang, F. Xu, L. Li, N.-L. Liu, and J.-W. Pan, “Quantum information research in China,” Quantum Sci. Technol., vol. 4, no. 4, p. 040503, Nov. 2019, doi: 10.1088/2058-9565/ab4bea. [Online]. Available: https://doi.org/10.1088%2F2058-9565%2Fab4bea. [Accessed: 11-May-2020]
[28] Q. Zhuang, J. Preskill, and L. Jiang, “Distributed quantum sensing enhanced by continuous-variable error correction,” New J. Phys., vol. 22, no. 2, p. 022001, Feb. 2020, doi: 10.1088/1367-2630/ab7257. [Online]. Available: http://arxiv.org/abs/1910.14156. [Accessed: 11-May-2020]
[29] P. Knight and I. Walmsley, “UK national quantum technology programme,” Quantum Sci. Technol., vol. 4, no. 4, p. 040502, Oct. 2019, doi: 10.1088/2058-9565/ab4346. [Online]. Available: https://doi.org/10.1088%2F2058-9565%2Fab4346. [Accessed: 11-May-2020]
[30] S. Sajeed et al., “An approach for security evaluation and certification of a complete quantum communication system,” arXiv:1909.07898 [quant-ph], Mar. 2020. [Online]. Available: http://arxiv.org/abs/1909.07898. [Accessed: 04-May-2020]
[31] S. Aaronson and A. Ambainis, “The Need for Structure in Quantum Speedups,” arXiv:0911.0996 [quant-ph], Feb. 2014. [Online]. Available: http://arxiv.org/abs/0911.0996. [Accessed: 11-May-2020]
[32] L. Lydersen, C. Wiechers, C. Wittmann, D. Elser, J. Skaar, and V. Makarov, “Hacking commercial quantum cryptography systems by tailored bright illumination,” Nature Photon., vol. 4, no. 10, pp. 686–689, 2010, doi: 10.1038/NPHOTON.2010.214. [Online]. Available: http://arxiv.org/abs/1008.4593. [Accessed: 04-May-2020]
[33] J. Rompel, “One-way functions are necessary and sufficient for secure signatures,” in Proceedings of the twenty-second annual ACM symposium on Theory of computing – STOC ’90, Baltimore, Maryland, United States, 1990, pp. 387–394, doi: 10.1145/100216.100269. [Online]. Available: http://portal.acm.org/citation.cfm?doid=100216.100269. [Accessed: 03-May-2020]
[34] D. Stebila, M. Mosca, and N. Lütkenhaus, “The Case for Quantum Key Distribution,” arXiv:0902.2839 [quant-ph], vol. 36, pp. 283–296, 2010, doi: 10.1007/978-3-642-11731-2_35. [Online]. Available: http://arxiv.org/abs/0902.2839. [Accessed: 30-Apr-2020]
[35] S. Goldwasser and S. Micali, “Probabilistic encryption,” Journal of Computer and System Sciences, vol. 28, no. 2, pp. 270–299, Apr. 1984, doi: 10.1016/0022-0000(84)90070-9. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0022000084900709. [Accessed: 04-May-2020]
[36] L. M. Ioannou and M. Mosca, “A new spin on quantum cryptography: Avoiding trapdoors and embracing public keys,” arXiv:1109.3235 [quant-ph], Sep. 2011. [Online]. Available: http://arxiv.org/abs/1109.3235. [Accessed: 30-Apr-2020]
[37] A. Montanaro, “Quantum algorithms: an overview,” npj Quantum Inf, vol. 2, no. 1, p. 15023, 2016, doi: 10.1038/npjqi.2015.23. [Online]. Available: http://arxiv.org/abs/1511.04206. [Accessed: 22-Apr-2020]
David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn
