Introduction to Quantum Safe Cryptography

Commercial transactions on the Internet and routine cyber security depend on the effectiveness of public key cryptography. Quantum computers will make existing standards obsolete. The impact of this starts now.

The central technical pillar of online security is the use of secure cryptographic algorithms to encrypt data, passwords and identities, and so keep them safe from third parties. In simple symmetric protocols a shared secret key, a long series of binary digits, is used to scramble the data. If I know the key, I can unscramble the message. Longer keys mean more security.
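To make the idea concrete, here is a minimal Python sketch of symmetric scrambling using a one-time-pad-style XOR. This is a toy for illustration only, not a production cipher such as AES; the message is invented for the example.

```python
import secrets

def xor_scramble(data: bytes, key: bytes) -> bytes:
    """Scramble (or unscramble) data by XOR-ing it with a key of equal length."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))     # the shared secret: random bits

ciphertext = xor_scramble(message, key)     # without the key this is just noise
recovered = xor_scramble(ciphertext, key)   # the same key unscrambles it
assert recovered == message
```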

Asymmetric or public key cryptography simplifies key distribution in practice. The key used to encrypt messages can be made freely available, but only someone holding the associated private key can decrypt them. The convenience of this approach has made it central to ecommerce on the Internet.
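As an illustration, the sketch below generates an RSA key pair and encrypts a message with the freely shareable public key. It assumes the widely used open-source Python `cryptography` package; the message is invented for the example.

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The private key stays secret; the public key can be published freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(b"order #4521 confirmed", oaep)  # anyone can encrypt
plaintext = private_key.decrypt(ciphertext, oaep)                # only the key holder decrypts
assert plaintext == b"order #4521 confirmed"
```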

It has become conventional cyber security wisdom that ‘people are the weakest link’. However this is only true because the technical security actually works; the key is capable of ‘locking the door’. The arrival of a threat capable of simply ‘picking the lock at will’ would be very disruptive. Even the fear that this was imminent could be very damaging to public and business confidence in ecommerce.

The public key schemes most commonly deployed today (RSA, EC, DH and DSA) have all been shown to be breakable by a quantum computer running Shor’s algorithm. Internet security protocols that rely on these approaches (TLS, IPsec/IKE, S/MIME, PKI) are therefore vulnerable, as are applications built on them (VPN, WPA) (11).
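The root of the problem is that an RSA private key is trivially recoverable once the public modulus is factored, and factoring is exactly what Shor’s algorithm does efficiently. A toy illustration with deliberately tiny primes (real keys use 2048-bit or larger moduli; Python 3.8+ for the modular inverse):

```python
# Toy RSA: once n is factored, the private key falls out immediately.
p, q = 61, 53
n = p * q                  # public modulus (what Shor's algorithm factors)
e = 17                     # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private exponent, derived trivially from the factors

m = 42
c = pow(m, e, n)           # encrypt with the public key
assert pow(c, d, n) == m   # decrypt with the recovered private key
```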

Other existing parts of the cryptographic landscape are less vulnerable. The most common symmetric key encryption used to protect the actual body of messages is quantifiably less vulnerable to quantum attack: Grover’s algorithm offers at most a quadratic speedup over classical brute-force key search, so long key sizes (AES-256) are probably all that is required to retain security. Secure hash functions used to validate message integrity are similarly less vulnerable, and current hash output lengths (SHA-256) are thought to be sufficient for the foreseeable future (12).
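A back-of-envelope sketch of why doubling the key length restores the margin: Grover’s quantum search inspects roughly the square root of the key space, halving the effective security in bits.

```python
def quantum_security_bits(key_bits: int) -> float:
    # Grover searches a space of 2**n keys in ~2**(n/2) steps,
    # so an n-bit key retains roughly n/2 bits of quantum security.
    return key_bits / 2

for n in (128, 256):
    print(f"AES-{n}: ~2^{n} classical ops, "
          f"~2^{quantum_security_bits(n):.0f} quantum ops to brute-force")
```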

The commercial sensitivity of these issues is increased by the extra responsibilities being placed on organisations by new legislation such as the EU’s General Data Protection Regulation (GDPR). Potential fines extend to 4% of global revenue.

A Current Threat

Estimates vary on when a quantum computer powerful enough to realise this threat will be built. Dates ranging from 8 to 20 years away have been suggested. More rapid scenarios are also possible: a major programme by a nation state could bring this down to perhaps 5-7 years, and alternative schemes based on special-purpose quantum devices might overturn these estimates altogether.

For Fact Based Insight’s most recent review of threat dates read Quantum safe cryptography – waiting to save the world.

A business might therefore reasonably assume that it should plan against a viable security threat from quantum computers by 2027 at the earliest. Surely that seems like a comfortable way off?

However there is a problem. If a hostile party intercepts and stores encrypted data today, they will be able to break its encryption in 2027. Additionally, the transition programme to put new quantum safe arrangements in place inside an organisation might itself take several years. Remote systems and embedded devices are a further complication: not only will these require updating, but the security protocols routinely used to validate such updates are themselves among those at risk.

The Mosca Inequality: D + T < Q

Data Life (D) – How long must data used today remain confidential, even if intercepted and stored for later decryption by a hostile party?

Transition (T) – How long will it take an organisation to transition its existing operational cyber security arrangements to quantum safe alternatives?

Quantum Computers (Q) – How many years before a device capable of breaking current encryption standards becomes available?

Michele Mosca of Canada’s IQC has suggested that companies will soon be differentiated by the adequacy of their risk management and transition plans against this threat. Assuming the threat arrives in 10 years, you start now and the transition takes 2 years, that still leaves a safe data life of only 8 years. Is the CEO happy to defend to customers and investors that sensitive data just 8 years old might be revealed? (4).
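A minimal sketch of the inequality as a planning check, using the illustrative figures above (a 10-year threat horizon and a 2-year transition):

```python
def mosca_safe(data_life: float, transition: float, quantum_threat: float) -> bool:
    """Safe only if D + T < Q: the data stops being sensitive, and the
    transition completes, before a capable quantum computer arrives."""
    return data_life + transition < quantum_threat

Q, T = 10, 2                  # illustrative figures from the text
print(mosca_safe(5, T, Q))    # True  -- short-lived data is fine
print(mosca_safe(25, T, Q))   # False -- data that must stay secret for decades
```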

Blockchain technology, popular for its digital currency applications, is not immune from these concerns. Blockchain ledgers typically deal with data where long-term protection is assumed and, due to their distributed nature, transitions to new arrangements may not be trivial. They also depend crucially on perceived market confidence.

Blockchain implementations differ, but typically they use multiple types of cryptography, with separate protocols to ensure the integrity of the ledger, authenticate transactions, ensure privacy, and identify participants. As we shall see, quantum safe solutions exist for all these elements, but particular implementations need to be ready to demonstrate their compliance or transition plans, and to do so clearly, perhaps in the demanding circumstances of a shock to public confidence.

Post Quantum Cryptography

Conventional mathematics-based cryptography is not defenceless in the face of quantum computing. The field of Post Quantum Cryptography (PQC) has developed a number of alternative protocols that are thought to be resistant to attack by quantum computers (13).

Quantum Resistant Algorithms – The Contenders

Lattice-based – A family of approaches based on the difficulty of solving the Shortest Vector Problem in lattices. Leading candidates go by colourful names such as New Hope and Frodo.

Multivariate-based – Linked to the difficulty of solving systems of multivariate equations. Many past proposals of this type have been broken; conversely, this means there is considerable experience in studying these problems. MQDSS is one new candidate. Other variants such as HFEBoost have been employed in field trials with the French Army.

Hash-based – Algorithms built on well-understood Merkle signatures include LMS and XMSS, which offer good performance but require ‘state tracking’ (to avoid key reuse); a newer alternative, SPHINCS, avoids this drawback but has higher overheads. A toy signature of this family is sketched after this list.

Code-based – Built on the theory of error correcting codes. A leading example is the long-standing but little used McEliece system.

Isogeny-based – Supersingular isogeny key exchange is a quantum-resistant analogue of the currently popular ECDH key agreement protocol. It offers small key sizes, but currently with high computational overheads.
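To give a flavour of the hash-based family, below is a minimal Python sketch of a Lamport one-time signature, the primitive underlying Merkle-tree schemes such as LMS and XMSS (this is the bare one-time building block, not LMS or XMSS themselves). A key pair must never sign two different messages, which is exactly the ‘state tracking’ burden mentioned above.

```python
import hashlib, secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets (one pair per message-digest bit);
    # the public key is simply their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret from each pair, selected by the digest bits.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(digest_bits(msg)))

sk, pk = keygen()
sig = sign(b"transfer 100", sk)          # this key pair is now spent
assert verify(b"transfer 100", sig, pk)
assert not verify(b"transfer 999", sig, pk)
```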

Since August 2015, the US National Security Agency (NSA) has been calling for an accelerated transition to quantum resistant algorithms. The US National Institute of Standards and Technology (NIST) has initiated a process to solicit, evaluate and standardise suitable replacements for current public-key cryptography.

The NIST process will evaluate submissions across planned workshops in April 2018 and August 2019, followed by draft standards in 2020/21, and then software industry development and client rollout in the years following (4).

The UK National Cyber Security Centre (NCSC, part of GCHQ) endorses the above approach via NIST and also ETSI (the European Telecommunications Standards Institute). The NCSC has also emphasised the risk of organisations moving too soon to untried and unproven alternatives, lest an immature scheme or a rushed implementation itself become the security hole.

To counter the risks of introducing a new protocol, most envisage a transition period where both existing and post-quantum algorithms are used in conjunction to produce hybrid keys offering joint security, at the expense of further overhead.
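A minimal sketch of the hybrid idea: derive the session key from both shared secrets, so the result stays safe as long as either key exchange remains unbroken. The inputs and the ‘hybrid-v1’ label are hypothetical; real protocols would use a standardised KDF such as HKDF over, say, an ECDH secret and a lattice-based KEM secret.

```python
import hashlib

def hybrid_key(classical_secret: bytes, pqc_secret: bytes) -> bytes:
    # Both secrets feed the derivation: an attacker must break BOTH the
    # classical and the post-quantum exchange to recover the session key.
    return hashlib.sha256(b"hybrid-v1" + classical_secret + pqc_secret).digest()
```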

The NIST process is expected to recommend not a single solution, but a palette of options across a whole suite of use-cases including: authentication, key exchange and digital signatures (integrity, non-repudiation).

Computational Complexity – A Millennium Problem

A sceptic might ask “Cryptographers promised previously that the current set of algorithms would take ‘longer than the age of the universe’ to break, why should we believe their new algorithms will be any better?”

However that is not really fair. Simplified promises made their way into headlines, but what the computational theorists were saying was always more nuanced. Among modern cryptography’s strengths are formal proofs (security reductions) showing that breaking an algorithm is at least as hard as solving a specified mathematical problem. This is particularly powerful where the problem is a long known and well-studied one.

P – Problems that can be solved ‘easily’ by a conventional computer (in polynomial time). Colloquially: not good for crypto, as the underlying problem can be solved directly.

NP – If you give it a known solution, a computer can ‘easily’ check it (in polynomial time). Colloquially: useful for crypto, because if you give me the key I can decode the message efficiently.

NP-complete – the hardest problems in NP. It has been shown that any problem in NP can be transformed into an NP-complete problem. Colloquially: the ideal gold standard for crypto, but difficult to achieve with practical algorithms.

NP-hard – a problem that has been mathematically proven to be at least as hard to solve as the hardest problems in NP, but which is not necessarily itself in NP.

P vs NP – it is a widely believed conjecture that P is not simply the same as NP, and thus that the hardest NP problems can’t be cracked efficiently by conventional computers. However there is no known mathematical proof of this. Resolving it is one of the famous open Millennium Prize Problems in mathematics.

BQP – Expands P to include problems that can be tackled efficiently by a quantum computer. BQP is conjectured not to include the NP-complete problems, but again this is not proven.

Colloquially: we think cryptography built on NP-hard problems is secure against quantum attack, but we can’t prove it. If not, it’s time to stock up on tinned food.
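In summary, the known containments and the load-bearing conjectures can be written compactly (NPC denotes the NP-complete problems; the placement of factoring is a widely held belief, not a theorem):

```latex
\mathrm{P} \subseteq \mathrm{NP}, \qquad \mathrm{P} \subseteq \mathrm{BQP}
  \qquad \text{(proven)}
\mathrm{P} \neq \mathrm{NP}, \qquad \mathrm{NPC} \not\subseteq \mathrm{BQP}
  \qquad \text{(conjectured, unproven)}
\text{Factoring} \in \mathrm{NP} \cap \mathrm{BQP}
  \qquad \text{(believed outside } \mathrm{P} \text{ and not NP-complete)}
```

The last line is why Shor’s algorithm can break RSA without settling P vs NP: factoring is thought to sit in the gap between the easy problems and the NP-complete ones.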

In assessing the promises of PQC, attention needs to be paid to the actual basis of each claimed security level and its trade-offs in terms of increased key lengths, computation times or other overheads. At the current state of the art the security promises are strong, but they remain relative, not absolute. There is no mathematical proof that even the strongest conventional cryptography algorithms are secure against attack from even a conventional computer.

Most businesses will decide to be led by the NIST process on this, but particularly if they operate in sensitive sectors such as Financial Services, Health or Defence, they should be ready to demonstrate the consistency of their plans.

Quantum Cryptography

Quantum technologies themselves offer a new and complementary approach to secure communications.

Quantum Key Distribution (QKD) uses the quantum properties of a system, typically photons, to allow two separate parties to agree a shared secret key. The quantum nature of the required measurements means that the presence of an eavesdropper can be detected statistically.

A variety of both conceptual and practical protocols have been developed to implement this approach. These offer security ‘guaranteed by the laws of physics’ rather than by a mathematical algorithm.

Quantum Key Distribution

Alice sends a message to Bob encoded in a series of quantum states. Eve wants to intercept the message, but she can’t do this without disturbing the quantum states and revealing her presence. Alice and Bob use the initial message to agree a secret key, which can then be used for the encryption of subsequent communications. A variety of protocols have been developed built on this concept:

BB84 – Conceived in 1984 by the founding fathers of the field, Bennett and Brassard. Alice takes a random number and sends its binary digits encoded as the polarisation of a series of individual photons, each with a randomly chosen axis of polarisation. Bob receives the photons, but has to guess which polarisation axis to measure them on. Alice and Bob then compare just those results where they happened to have used the same polarisation axis and use these to form a shared secret key. If Eve intercepts and measures a photon, she has to make her own guess for which axis to measure on, leaving statistical evidence of her presence. (A toy simulation of this protocol follows this list.)

Decoy BB84 – An extension of the basic BB84 scheme that allows it to be implemented securely with real-world multi-photon sources.

DI QKD – Device Independent QKD is a conceptual approach that leverages the statistical properties of quantum entanglement to guarantee secure communication even when the devices themselves cannot be trusted. If Eve has tampered with the photon source or the measurement devices, Alice and Bob can still detect her presence.

MDI QKD – Measurement Device Independent QKD protects against Eve tampering with the measurement device, but simplifies real world implementation by relaxing other constraints of full DI QKD. Twin-Field QKD (TF-QKD) is an emerging form of MDI QKD.

CV QKD – Continuous Variable QKD measures the phase shift of multi-photon states rather than relying on individual photon measurements. This potentially offers better performance in noisy channels.
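To make the mechanics concrete, here is a toy Monte Carlo sketch of BB84 sifting and eavesdropper detection in Python. It models only the logic of basis choices, not real photonics; an intercept-resend attack by Eve pushes the error rate on the sifted key to around 25%.

```python
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_qber(n=2000, eve_present=False):
    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits, alice_bases = random_bits(n), random_bits(n)
    photons = list(zip(alice_bits, alice_bases))

    if eve_present:
        # Intercept-resend attack: Eve measures in a random basis and re-sends.
        # A wrong basis guess randomises the bit she forwards.
        resent = []
        for bit, basis in photons:
            eve_basis = secrets.randbelow(2)
            eve_bit = bit if eve_basis == basis else secrets.randbelow(2)
            resent.append((eve_bit, eve_basis))
        photons = resent

    # Bob measures in his own random basis; a mismatched basis gives a random result.
    bob_bases = random_bits(n)
    bob_bits = [bit if basis == b_basis else secrets.randbelow(2)
                for (bit, basis), b_basis in zip(photons, bob_bases)]

    # Sifting: keep only positions where Alice's and Bob's bases agree,
    # then estimate the quantum bit error rate (QBER) on the sifted key.
    sifted = [(a, b) for a, b, x, y
              in zip(alice_bits, bob_bits, alice_bases, bob_bases) if x == y]
    return sum(a != b for a, b in sifted) / len(sifted)

print(f"no eavesdropper:  QBER ~ {bb84_qber():.1%}")                   # ~0%
print(f"intercept-resend: QBER ~ {bb84_qber(eve_present=True):.1%}")   # ~25%
```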

It is worth pausing to examine the strength of this physics based security promise. A sceptic might say “But scientists agree their laws are never final, and one day a new breakthrough may allow for QKD systems to be cracked”. Remarkably in this case the promise is stronger than simple reliance on our current understanding of nature.

Bell’s Theorem

The non-deterministic nature of quantum mechanics has from the start led to attempts to formulate alternatives. A popular class of alternative theories added ‘local hidden variables’ to the standard theory.

This terminology wasn’t invented with crypto in mind, but such parameters would be the only easy way to add something to standard quantum mechanics that would make QKD messages vulnerable to intercept.

John Bell proved, in work published in 1964 and extended in 1976, that no theory based on local hidden variables can reproduce the exact statistical predictions of quantum mechanics: the correlations such theories permit are capped by a mathematically well-defined limit (Bell’s inequality), which quantum mechanics exceeds. An extensive programme of experimental work demonstrating violations of Bell’s inequality has led to the existence of local hidden variables being ruled out.

Advanced forms of QKD incorporate a dynamic test of Bell’s inequality to verify in situ that un-hackable quantum entangled message exchange is indeed taking place.
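The flavour of such a test can be shown with a few lines of arithmetic. Using the quantum-mechanical prediction for a spin-singlet pair, the CHSH combination of correlations reaches 2√2, comfortably above the bound of 2 that any local hidden variable theory must respect:

```python
import math

# Quantum mechanics predicts correlation E(a, b) = -cos(a - b) for
# measurements on a spin-singlet pair at analyser angles a and b.
def E(a, b):
    return -math.cos(a - b)

# CHSH combination at the angles that maximise the quantum violation.
a1, a2 = 0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(f"S = {S:.3f}")  # ~2.828 = 2*sqrt(2), above the local-hidden-variable bound of 2
```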

Bell’s Theorem puts significant constraints even on future breakthroughs. The debate over the foundations of quantum mechanics continues to rage (10) and one day we may indeed find a new theory, but the most obvious future threats to QKD are already excluded.

QKD Today

Practical QKD has now been widely demonstrated around the world in both lab based and small multi-site installations. It has been shown to work over telecoms-grade fibre, and in combination with conventional data traffic. China has demonstrated a satellite to ground system and is commissioning a major backbone link between Beijing and Shanghai, to connect existing city-based installations. Others have demonstrated the technique with aerial drones, handheld devices & ATMs, chip-scale devices and even pairs of thermal photons (13)(14).

While completely secure communication via QKD is theoretically possible, all real world implementations at the current state of the art make trade-offs. In contrast to conventional cryptanalysis, where attacks are typically mathematical, here the potential vulnerabilities lie in attacks on the physical hardware, so-called side-channel attacks. Consequently, standards setting and certification processes are essential for wider commercial acceptance of these techniques. ETSI is actively progressing standards setting in this area (4).

Given the promising conventional quantum resistant algorithms of PQC, why might we still want to employ the new quantum approaches of QKD?

  1. For high security usage, security strategy rightly recommends a layered approach. The ‘physics’ basis of QKD means that it is a unique component that complements the ‘maths’ basis of conventional cryptography.
  2. If a vulnerability in a PQC algorithm were discovered, a hostile party or government might keep the discovery secret to maximise the time for which it could be exploited.
  3. Securing sensitive data via QKD could become a useful marketing feature. This could be particularly relevant after a shock to public confidence that might emerge during the evolution of practical quantum computing.

Currently the range of QKD is limited in practice to line of sight or about 200 km over fibre (double that for MDI QKD). This can be extended via the use of so-called Trusted Nodes to relay the key exchange. This has exposed a difference of views: some see Trusted Nodes as a positive, since they build in provision for ‘lawful intercept’ of communications; others view this as an unwelcome intrusion into the normal principles of privacy of internet communications. Small scale quantum devices known as quantum repeaters will eventually remove the requirement for Trusted Nodes, but these are not yet available.

Several companies already offer small scale QKD installations on a commercial basis. Some leading business communications service providers are seeking clients for larger proof of concept installations (13).

True Random Numbers

Many techniques in statistical analysis require random numbers as an input, as does almost all cryptography. Lottery and gaming applications are also natural users. What we often take to be random numbers are actually pseudo-random numbers generated by a software algorithm. The fundamental uncertainty of quantum systems can be captured to provide true random number generation on a chip. ETSI is developing certification procedures to provide transparency for users of these devices.
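A short Python sketch of the distinction. Pseudo-random generators are deterministic algorithms; cryptographic generators draw on the operating system’s entropy pool, which is where a hardware QRNG would slot in as the physical entropy source (that integration point is an assumption of the sketch, not a specific product):

```python
import random, secrets

# Pseudo-random: a deterministic algorithm. Reseeding with the same value
# reproduces the identical 'random' stream -- fine for simulation, unsafe for keys.
prng = random.Random(1234)
print(prng.getrandbits(32))

# Cryptographic randomness: drawn from the OS entropy pool (os.urandom).
# A certified hardware QRNG would feed or replace this pool.
print(secrets.token_bytes(16).hex())
```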

A true quantum random number generator (QRNG) of this type may find many applications, including in support of security solutions otherwise based on conventional post-quantum cryptography (13).

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn