Quantum hardware – into the quantum jungle

The quantum computing landscape has been transformed by a surge of commercial startups pursuing very different technical paths. Investors and pioneers must prepare themselves for a challenging journey ahead.

A growing list of major, midsized and startup companies has joined the quantum computing quest. However, today’s prototype devices must be radically scaled up before they can deliver the benefits promised by FTQC. Many different technology paths are being pursued and there is no consensus on which is best. Some NISQ devices will also seek to pay their way by finding practical commercial applications along the route. No one yet knows whether this will be possible.

Investors and pioneering business adopters can always seek advice from the growing band of physicist-entrepreneurs. A friendly ex-professor will explain the common creed of weird quantum superposition and entanglement, and praise all the great work going on around the world. In private they will explain that they happen to be working on a uniquely promising route through the problems ahead…

The development of quantum computers has been compared to a race or to an attempt to leap across a chasm. Fact Based Insight thinks it can increasingly feel like an expedition into a jungle.

This article seeks to summarise recent developments across the main contending quantum computing platform technologies. It doesn’t attempt to match the fine academic reviews available, but rather brings the main points together in a format accessible to interested investors and early business adopters.

What does it take to build a quantum computer?

A number of national quantum programmes have published analysis of what the expert community thinks it will take to build a quantum computer [ ]. This summer’s clutch of quantum conferences, including BQIT 2020, IQT New York and QT Digital Week, has been a good opportunity to review developments against these route maps.

High fidelity qubits

The usual first step is to demonstrate high precision control of one and then two qubits in the lab.

Bloch sphere

Qubits. Unlike a digital bit, a qubit can be in a superposition of 0 and 1 at the same time. When two qubits interact they become entangled. Mathematically the state of a qubit can be described by a position on the surface of the Bloch sphere.
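
For readers who want the underlying mathematics, the standard textbook description is as follows; the two angles below are simply the latitude and longitude of the state on the Bloch sphere:

```latex
% A single qubit state, and its position on the Bloch sphere
\[
  \lvert\psi\rangle \;=\; \alpha\lvert 0\rangle + \beta\lvert 1\rangle,
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1
\]
\[
  \lvert\psi\rangle \;=\; \cos\tfrac{\theta}{2}\,\lvert 0\rangle
    \;+\; e^{i\varphi}\sin\tfrac{\theta}{2}\,\lvert 1\rangle
\]
```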

In real world quantum computing we typically seek a two-level quantum system that conveniently embodies these properties. We need the ability to initialise and measure the state of the individual qubits (SPAM). We need qubits with a long conventional state lifetime (T1) and a long quantum coherent lifetime (T2). The system must not ‘leak’ into non-qubit states. We need the ability to perform a ‘universal’ set of one-qubit (1Q) and two-qubit (2Q) gates. To support quantum networking we also need to link stationary qubits to ‘flying’ qubits, and we need interconnects that can transmit flying qubits to remote locations [ ].

Strictly speaking, the above describes gate-model quantum computing (sometimes called digital quantum computing). Closely related is measurement based quantum computing (sometimes called one-way quantum computing). Other formalisms include continuous variable quantum computing and adiabatic quantum computing (to which quantum annealing is related). Quantum devices may also be used for direct ‘analogue’ quantum simulation of another system of interest (when a gate-model device simulates such a system algorithmically it’s ‘digital’ quantum simulation). The term analogue quantum computing is often used loosely and different authors can mean any of these latter approaches.

As is common in the sector, this article uses the language of gate-model quantum computing unless noted otherwise.

Demonstrating all qubit gate operations with fidelity above 99% has become a de facto standard for basic technology demonstrations. Beyond this level the most forgiving quantum error correcting code, the surface code, can at least in theory be used to correct errors. In practice much higher fidelities are desirable. For all approaches, the most difficult challenge in practice has proven to be the implementation of high fidelity 2Q gates.

In conventional computing the speed with which gate operations can be performed has often been a central concern. However, quantum computers get their advantage from the algorithms they run, not directly from their gate speed. Nor are raw gate speeds directly comparable across platforms. Higher fidelity may mean that quantum error correction is less onerous. Better connectivity or native gate options may greatly improve efficiency.

However, gate speed is still a real-world consideration when we compare different quantum platforms that are otherwise equally matched, or potentially if we seek to implement algorithms with more borderline quantum speedups. More subtly, short qubit lifetime can combine with slow gate speed to limit the achievable gate fidelity (the gate fails because the qubit failed). Fact Based Insight sees the ratio of coherent lifetime to gate speed as a useful indicator.
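
As a crude illustration of why this ratio matters, the toy calculation below (using indicative round numbers rather than the specs of any particular device) estimates how many gate operations fit inside one coherence time:

```python
# Illustrative only: how many 2Q gate times fit inside one coherence time (T2)?
# The figures below are indicative round numbers, not claims about specific devices.

def ops_per_coherence(t2_seconds: float, gate_seconds: float) -> float:
    """Approximate number of gate operations that fit within one coherence time."""
    return t2_seconds / gate_seconds

examples = {
    "superconducting (T2 ~ 50 us, 2Q gate ~ 100 ns)": ops_per_coherence(50e-6, 100e-9),
    "trapped ion (T2 ~ 1 s, 2Q gate ~ 100 us)": ops_per_coherence(1.0, 100e-6),
}

for label, ratio in examples.items():
    print(f"{label}: ~{ratio:,.0f} gate times per T2")
```

A slower platform can therefore still compare well on this measure if its qubits live proportionately longer.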

Multi-qubit devices – the quantum plane

The second step is to build a larger number of qubits into a single device. Careful engineering and fabrication choices are required.

The material being used to host the sensitive qubits is a key factor. The substrate within or on which the qubits reside often governs the types of disturbance to which they are inherently exposed. In some platforms extreme cryogenic cooling is required to suppress this noise, while for others ultra high vacuum suffices. Some technologies may require neither. Importantly, the sophistication of current micro/nanofabrication technology often differs widely across the platforms being developed.

Ideally a device would offer high connectivity between qubits, the ability to natively drive a wide variety of different quantum gate options, and flexibility over when to measure the state of individual qubits. Not surprisingly, doing all of this and maintaining the required high fidelity operations turns out to be very challenging.

Quantum Volume (QV) – IBM have championed this measure for those wanting a single figure-of-merit with which to compare different devices [ ]. It captures the challenge of implementing a random set of generic 2Q gate operations between randomly selected qubits. It naturally builds in the degree to which device control and low-level compilation is able to compensate for gate errors and compose gates in terms of the device’s native gate-set and connectivity. By design it gives equal weight to the number of qubits that can be managed and the depth of circuit iterations that can be performed. Crudely speaking it captures the largest ‘square-shaped’ circuit that the device can on average successfully run.
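
The sketch below illustrates only the ‘square-shaped circuit’ intuition described above. It is not the formal heavy-output sampling protocol defined by IBM, and it ignores 1Q errors, SPAM and compilation overhead, so it is optimistic:

```python
# Toy model of the 'largest square circuit' intuition behind Quantum Volume.
# Not the formal protocol of Cross et al.; it simply asks how large a width-n,
# depth-n random circuit can get before accumulated 2Q gate errors make success unlikely.

def largest_square_circuit(two_q_error: float, threshold: float = 2 / 3) -> int:
    """Largest n such that a circuit of n qubits and n layers (~n*n/2 two-qubit gates)
    still succeeds with probability above the threshold, assuming independent errors."""
    n = 0
    while True:
        gates = (n + 1) * (n + 1) / 2        # roughly n/2 two-qubit gates per layer, n layers
        if (1 - two_q_error) ** gates < threshold:
            return n
        n += 1

for p in (0.01, 0.005, 0.001):               # illustrative average 2Q error rates
    n = largest_square_circuit(p)
    print(f"2Q error {p:.1%}: square circuit ~{n} wide and deep, QV ~ 2^{n} = {2 ** n}")
```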

A popular working assumption is that individual devices will scale to about 1000 physical qubits each, and beyond that systems will be created in a modular fashion, though some emerging technologies assume very different module sizes.

Scaling-up – the control & measurement plane

The systems required to control the quantum system bring their own range of opportunities and challenges, some very different to conventional computing.

Qubits are an analogue superposition of 1 and 0 (and a complex one at that), so qubit operations are typically driven by analogue signals. Each is a potential source of noise, error and crosstalk. Assuming the quantum and control planes are realised separately, routing the myriad of required connections to and from each is a very significant challenge.

Imagine a typical chip that might be a 2D array of N×N qubits. The number of individual control and readout lines scales up as N², but the length of the edge of the array scales only as 4N. Conventional electronics often solves this problem by fan-in/fan-out and multiplexing techniques. These are not trivially available for qubit operations.
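
A minimal sketch of that scaling mismatch is shown below; the assumption of two lines per qubit is purely illustrative:

```python
# Toy model of the wiring bottleneck: lines needed grow as N*N, while the chip edge
# available to route them grows only as the perimeter, ~4N.

def lines_per_edge_site(n: int, lines_per_qubit: int = 2) -> float:
    """Control/readout lines per unit of edge length for an N x N qubit array."""
    return (lines_per_qubit * n * n) / (4 * n)

for n in (4, 10, 32, 100):
    print(f"{n}x{n} array: ~{lines_per_edge_site(n):.1f} lines per edge site")
```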

Given all of these analogue components, the ability to calibrate the system, and the stability of that calibration over time, is a very real challenge and constraint for practical quantum computers. Conversely, engineering optimal control techniques can be used to dynamically anticipate and compensate for errors.

Scaling the control systems is a particular challenge in cryogenic environments: cooling power is typically greatly restricted at the extreme mK temperatures required by some qubit technologies, and the heating effect of control signals must be compensated.

Control Processor Plane and Host Processor

Operating a quantum system also requires significant conventional computing power. This is needed to drive the optimal control routines. It is also central to how quantum error correction is expected to work. In most schemes, a blizzard of measurements is made on supporting ‘ancilla’ qubits. These results have to be decoded in real-time and the necessary corrections fed back into the ongoing quantum calculation. A network of specialist processors is likely to be required as the machine scales.
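
As a flavour of what that decode-and-correct loop involves, here is a deliberately simple classical sketch based on the 3-bit repetition code. Real quantum error correction uses far richer codes (and must also handle phase errors), but the measure, look up, feed back pattern is the same:

```python
# Toy decode-and-correct loop for the 3-bit repetition code (classical analogue only).
# Real schemes use quantum codes such as the surface code and must also handle phase errors.

def syndrome(bits):
    """Two ancilla-style parity checks: bit 0 vs 1, and bit 1 vs 2."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# Lookup-table decoder: each syndrome points to the single bit most likely to have flipped.
DECODER = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(bits):
    flip = DECODER[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1                     # feed the correction back into the register
    return bits

state = [1, 1, 1]                           # logical '1' spread across three physical bits
state[2] ^= 1                               # a single bit-flip error strikes
print(correct(state))                       # -> [1, 1, 1]
```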

Ultimately the quantum computing environment is supervised by a conventional host computer that runs the rest of the software stack required for application development and execution. Early players at this end of the business like to talk about hardware agnostic approaches and their ability to optimally utilise whatever quantum hardware is available. How well this vision can be realised remains to be seen.

Things to do along the way

Before looking in detail at different quantum pioneers, it’s important to look at the subtly different goals they emphasise.

The quest for NISQ Quantum Advantage

A natural question is whether an intermediate scale device can perform a task where it offers a practical advantage over a conventional computer. No one knows for sure if this is or is not possible.

The very highest levels of fidelity are likely to be important. It’s no good having lots of qubits if the calculation collapses after a few cycles because of errors. Specific qubit connectivity that matches the needs of the required calculation is likely to help. Error mitigation schemes are likely to be vital. Hybrid approaches that tightly couple conventional and quantum steps in an iterative algorithm are probably required. Analogue quantum simulators that mirror the physical system of interest could shine.
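
To make the ‘tightly coupled’ hybrid idea concrete, here is a minimal sketch of the variational loop used by many NISQ algorithms. The quantum call is stubbed out with a placeholder cost function, since the point is the division of labour rather than any specific algorithm:

```python
# Minimal sketch of a hybrid quantum/classical variational loop.
# run_quantum_circuit is a placeholder; a real implementation would submit a parametrised
# circuit to hardware (or a simulator) and return an estimated expectation value.
import random

def run_quantum_circuit(params):
    return (params[0] - 0.3) ** 2 + (params[1] + 0.1) ** 2   # stand-in cost landscape

params = [random.uniform(-1, 1), random.uniform(-1, 1)]
for _ in range(200):
    candidate = [p + random.uniform(-0.1, 0.1) for p in params]   # classical optimiser proposes
    if run_quantum_circuit(candidate) < run_quantum_circuit(params):
        params = candidate                                        # keep what the device scores best
print(f"optimised parameters: {params}")
```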

For more on the quest for NISQ algorithms read Quantum Software – beneath the quantum hype.

Few activities of such scale have been brought to successful fruition without the opportunity to make money along the way. Finding even modest applications is key to encouraging additional investment in the sector.

Moore’s Law was never driven by some property of silicon technology; it was all about the effect of compounded investment in R&D. Any technology platform that secures success in the NISQ era will be powerfully positioned to sustain its success to FTQC and beyond. Gallium arsenide is in many ways a superior platform for conventional electronics, but it has never managed to displace silicon due to the latter’s overwhelming advantage in commercial scale.

The wider quantum ecosystem

It’s not just opportunities in quantum computing that make sense; synergies with applications in quantum cryptography or quantum imaging, sensing and timing may be equally important. These can support quantum pioneers and the ecosystem of component suppliers and developers on which they depend.

For more on wider opportunities read Beating quantum winter – opportunities further up the quantum value chain.

Seeking NISQ quantum advantage could just be a natural evolutionary step on the road to future FTQC. It could be a pot of gold to kick-start a virtuous cycle. But there is an alternative point of view: aim directly for FTQC.

The direct road to FTQC

We know we can do amazing things with FTQC. In truth we have no such guarantee for what can be achieved with NISQ devices. In reality there is only one algorithm that quantum computers need to be optimised for and that is running quantum error correction. We might as well get on with that.

The journey to FTQC will be driven not just by hardware advances, but also by innovation in the quantum error correction protocols themselves. Increasingly these developments will go hand-in-hand.

Those pursuing the long term goal of FTQC must focus above all on a device that can efficiently run a suitably tuned quantum error correcting code. High fidelity remains important, but the specific noise model will matter, affecting what code is optimal. Different codes make varied demands on the geometry of qubit connectivity. Error syndromes must be decoded efficiently in real-time so that dynamic corrections can be applied.

Above all, this must be highly scalable. Most approaches assume that FTQC will require millions of physical qubits to create a much smaller number of ultra high quality logical qubits.
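
As a rough sense of scale, the sketch below applies a widely quoted rule of thumb for the surface code (logical error per round of roughly 0.1·(p/p_th)^((d+1)/2), and about 2d² physical qubits per logical qubit). It ignores magic state factories and routing overheads, which typically multiply the totals several-fold, and the input numbers are illustrative only:

```python
# Rough surface-code overhead estimate using a widely quoted rule of thumb.
# Ignores magic state factories and routing, which typically multiply the totals several-fold.

def distance_needed(p_phys: float, target_logical: float, p_threshold: float = 1e-2) -> int:
    """Smallest odd code distance d with logical error ~0.1*(p/p_th)**((d+1)/2) below target."""
    d = 3
    while 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) > target_logical:
        d += 2
    return d

p_phys, target = 1e-3, 1e-12                 # illustrative physical error rate and logical target
d = distance_needed(p_phys, target)
per_logical = 2 * d ** 2                     # approximate physical qubits per logical qubit
print(f"code distance ~{d}, ~{per_logical} physical qubits per logical qubit")
print(f"~{200 * per_logical:,} physical qubits (data only) for a 200 logical qubit machine")
```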

For more on quantum error correction read Quantum Error Correction – from geek to superhero.

Pursuing FTQC directly is not an easy choice. Focusing solely on the long term limits the opportunity to build bottom-up momentum in the ecosystem. The sector will surely struggle through a deep winter if there are no NISQ revenue sources to harvest.

Though focused, this is a very austere strategy, requiring investors with very deep pockets, or government backing, to see it through to completion.

Comparing qubit technologies

Quantum pioneers are equipping themselves with a wide variety of alternative qubit technologies. As an initial overview it’s useful to start with a summary of the typical performance stats achieved today.

Qubit Snapshot

Superconducting Qubits
    • Important Variants: Tunable, Fixed Freq., Parametric
    • Qubit T2 Lifetimes: Short (15-100μs)
    • 2Q Gate Fidelity: High (99%-99.7%)
    • Gate Speeds: Fast (12-200ns)
    • Environment: 20mK
    • Current Devices: 53Q
    • FTQC Footprint: Building

Trapped Ions
    • Important Variants: Hyperfine, Optical, NF Microwave, GF Microwave
    • Qubit T2 Lifetimes: Long (0.2-50s)
    • 2Q Gate Fidelity: High (99%-99.9%)
    • Gate Speeds: Mixed (1μs-3ms)
    • Environment: Vacuum
    • Current Devices: 20Q
    • FTQC Footprint: Building

Spins in Silicon
    • Important Variants: Si MOS Qdot, Si SiGe Qdot, Ge SiGe Qdot, Imp. Donor, STM Donor
    • Qubit T2 Lifetimes: Mixed (1μs-0.5s)
    • 2Q Gate Fidelity: Promising (98%)
    • Gate Speeds: Fast (0.8-80ns)
    • Environment: 1K
    • Current Devices: 2Q
    • FTQC Footprint: Chip

Photonic
    • Important Variants: MBQC, CVQC
    • Qubit T2 Lifetimes: Short (150μs)
    • 2Q Gate Fidelity: Promising (98%)
    • Gate Speeds: Very Fast (1ns)
    • Environment: 4K
    • Current Devices: 20Q
    • FTQC Footprint: Compact

Defect Centres in Diamond
    • Important Variants: Nitrogen Vacancy, Silicon Vacancy
    • Qubit T2 Lifetimes: Long (10s)
    • 2Q Gate Fidelity: Interesting (99% (88%))
    • Gate Speeds: Slow (100μs)
    • Environment: Ambient
    • Current Devices: 10Q
    • FTQC Footprint: Network

Neutral Atoms
    • Important Variants: Hyperfine, Optical
    • Qubit T2 Lifetimes: Long (0.2-10s)
    • 2Q Gate Fidelity: Promising (97%)
    • Gate Speeds: Intermediate (1μs)
    • Environment: Vacuum
    • Current Devices: 51Q
    • FTQC Footprint: Large

Topological Qubits
    • Important Variants: Majorana quasiparticles
    • Qubit T2 Lifetimes: Long
    • 2Q Gate Fidelity: Extremely High
    • Gate Speeds: ?
    • Environment: ?
    • Current Devices: 0Q
    • FTQC Footprint: ?

A fuller version of this data is available on the Fact Based Insight Qubit Dashboard.

In reviewing the data, a casual reader should perhaps focus on the different scales of the SI units being used. For example, 20mK is a lot colder than 1K.

Fact Based Insight believes that no simple summary is possible at this level of aggregation. Best case stats are typically not all demonstrated in the same real device or even the same variation of the technology. Casual observers should be cautious of assuming headline claims apply to all of a platform’s siblings. Not all trapped ion platforms have demonstrated 99.9% fidelity. Not all silicon spin qubits use standard CMOS fabrication.

To really try to understand what the leading commercial players are doing, we need to drill down beneath the conventional top level categories. Fact Based Insight has reviewed the strengths, weaknesses, opportunities and threats facing all the main technology variants being pursued by commercial players (for the full SWOT analysis in each area follow the links after each summary paragraph below).

Superconducting circuits

Superconducting qubits have seized an early lead in commercial quantum computing activity. Being first to demonstrate technical quantum supremacy means they are well placed to search for NISQ quantum advantage. However there are significant challenges to be faced in scaling up these systems in the extreme cryogenic environments they require. A key difference in the approaches being taken by early commercial players is how exactly they choose to drive 2Q gates and the different implications for fidelity and control wiring this brings. New entrants have plenty of reason to believe there is still everything to play for in this technology.

Read full SWOT and player analysis.

Trapped ions

Years of academic activity are now transforming into a wave of spin-outs and startups. Trapped ion proponents point to the very high fidelity gates that have been achieved and to additional opportunities for qubit connectivity. However, the best gates have historically required precision Raman laser setups that come with their own significant challenges. Commercial approaches vary widely on how they balance the goals of fidelity and scalability.

Read full SWOT and player analysis.

Silicon spin

Silicon enjoys an unrivalled lead in human fabrication know-how and investment. Importantly, in its isotopically purified form, it is an ideal neutral material to host sensitive spin qubits. Proponents emphasise the potential of this technology to deliver chip-scale solutions where some other approaches would require ‘building-size’ machines. But spin qubits are delicate. Nanofabricating the right environment to extend their coherence lifetimes, and so improve fidelity, has been a challenge. A growing range of promising approaches is now reaching liftoff. These vary widely in terms of the fabrication technology they seek to employ, from the CMOS mainstream to ultra-cutting edge atomic level precision manufacture.

Read full SWOT and player analysis.

Photonic

Light was the first system in which humans observed quantum phenomena, and it remains in many ways the best understood. Conventionally driven investment in silicon photonics has now transformed our ability to engineer light in compact devices. Photons are naturally immune to many conventional forms of noise and most operations can already be completed with very high fidelity. There is a catch. Photons don’t interact directly, so conventional 2Q gates cannot easily be constructed. Photonic startups have instead been driven to alternative, though formally equivalent, approaches to quantum computation. Photonics is also embedded as an enabling component in many other quantum technologies.

Read full SWOT and player analysis.

NV diamonds

Our ability to fabricate in diamond has undergone a transformation that is perhaps not fully appreciated outside of the specialist community. Diamond is another material well suited to hosting qubits. Pioneering work with NV diamonds has often focussed on their remarkable near-term applications in quantum sensing and quantum communications. However the ability to form an expanded register of high fidelity qubits based on nuclear spins around a single defect is widening potential applications. Intriguingly this technology even holds out a realistic promise of operating at room temperature and ambient pressure.

Read full SWOT and player analysis.

Neutral atoms

Neutral atoms can be confined in 2D arrays and 3D lattice structures by laser tweezers. Because of this natural ability to mimic systems of interest, they have long been considered a leading platform for analogue quantum simulation. However, recent progress in demonstrating long-range 2Q gates across such structures has drawn attention to their potential as a platform for gate-model quantum computation. While they share many of the advantages and challenges of trapped ions, they promise a significantly higher density of qubits in a single trap. Grassroots players can also expect to benefit from opportunities in quantum sensing and timing.

Read full SWOT and player analysis.

Topological qubits

We don’t worry about errors in conventional computing chips because the digital transistor has hardware error correction built in. That’s why transistors replaced valves, unlocking the digital revolution. The idea behind topological qubits is to repeat this trick. We’ve long had an abstract model of how this could work. It doesn’t require the discovery of new physics or new particles per se, but we do have to fabricate special nanostructures that we expect to behave as peculiar 1D or 2D quantum systems.

Read full SWOT and player analysis.

Control and measurement

dilution fridge

As one of the first quantum platforms to significantly scale up, superconducting qubits have been the first to face a series of control plane challenges, complicated in this case by the need to work down to extreme cryogenic temperatures. However, many qubit types will face their own version of these issues. Hardware majors have been actively seeking to develop advanced cryo-CMOS solutions able to operate in the 4K or even 20mK regime. One newcomer offers a radically different and potentially disruptive approach based on SFQ technology. As academic know-how is packaged up and brought out of the lab, an interesting niche has developed in quantum error characterisation, calibration and optimal control. How exactly this will interface with the quantum software stack is still an open question.

Read full review.

Planting the flag

In addition to commercial strategies for realising NISQ and FTQC benefits, in the real world there is a third consideration.

Many see this new sector of technology as just too important to allow it to be completely dominated by one world power. Everyone now sees how early US leadership in microchip technology has been transformed into great wealth. It’s no disrespect to want to copy that. Leading figures in the EU Quantum Flagship programme explicitly talk about the importance of building and protecting a sovereign capability in Europe. Similar voices are heard in Australia. To foster regional success, we can expect specific geographic hubs for quantum computing expertise to emerge.

Silicon Valley in the US has been a draw for many early quantum businesses, attracting those that seek not just access to silicon fabrication expertise but also deep investment pockets. Rigetti was a local startup. PsiQ has been a notable name moving its HQ west across the Atlantic; Q-CTRL a notable name coming east across the Pacific to open an office. Equally, the US has always had smart mechanisms for encouraging innovation, not just via IARPA and the DOE programmes: IBM Q, the current leading quantum software ecosystem, benefits from a $2.5m p.a. contract with the US Air Force Research Labs (themselves home to an impressive quantum programme). Such relatively small sums help seed activity that brings in others and give corporate players the confidence to invest. The NQI and QED-C will strengthen these processes.

Canada has been an early mover in the quantum sector. It already boasts increasingly established players such as D-Wave and Xanadu. It is home to Creative Destruction Lab, the well respected seed-stage programme for science and technology based startups. It’s been quick to focus on building up a ‘Quantum Valley’ at Waterloo near Toronto. This features the well-respected IQC, R&D facilities and start-up support such as the Quantum Valley Ideas Lab and Quantum Valley Investments.

QuTech, hosted at TUDelft in the Netherlands, is a notable example of a multi-stranded initiative making world class progress in a series of qubit technologies including superconducting qubits, silicon qubits, topological qubits and NV-centres (all sharing at least some common requirements in terms of control and cryogenics). Notable partners include Intel and Microsoft. It’s not surprising to find QuTech also at the centre of the much wider vision to build a Quantum Internet in Europe, nor in the Quantum Delft initiative to build out a quantum business ecosystem (Bluefors, the cryogenics specialist, is a notable participant). QuTech is a cooperation between TUDelft and TNO, the Dutch Organisation for Applied Scientific Research. QuTech recently launched Quantum Inspire, the first European online quantum computing hardware platform.

NQCC is a headline feature of phase II of the UK NQTP. This provides a centre for the quantum computing ecosystem in the UK with a flexible vision to support any technologies with the potential to scale. QT Digital Week saw Innovate UK announce a further wave of 38 projects that have helped attract others to participate in the UK quantum ecosystem, notably Rigetti, Hitachi, Toshiba and Teledyne e2V, as well as supporting startups such as OQC, Quantum Motion, Universal Quantum, Oxford Ionics, Orca Computing, Duality Quantum Photonics and SeeQC [ ]. The work of the QCS Hub (formerly NQIT) continues in providing a focus for leading edge work across the UK’s strong academic sector.

The NQCC interim leadership team notably includes commercial experience from Michael Cuthbert of Oxford Instruments (a leader in cryogenic technology), together with academic insight from Simon Benjamin of Univ. of Oxford. In a sector increasingly driven by deep specialism, Benjamin stands out for being published on both leading edge NISQ algorithmic advances and pragmatic routes to FTQC; equally his collaborators have spanned trapped ion, superconducting qubit and silicon qubit approaches. The only question must be whether the resources are available to fund such an ambitious vision.

Germany has just announced an impressive €2b investment in quantum technology. We can expect a quantum computing hub to develop, pulling in players such as IQM and others. Could Forschungszentrum Jülich, with a quantum programme headed by recently appointed Frank Wilhelm-Mauch (also co-ordinator of the EU Flagship OpenSuperQ project), be a location to watch?

France has a notable cluster of quantum silicon and cryo-CMOS expertise centred in Grenoble. Three laboratories there, CEA-Leti, CEA-INAC and CNRS-Néel, have joined forces around the QuCube project to develop a silicon-based quantum processor, with EU funding support. The project has won an initial €14m of funding over six years. Further investment in this area as part of a larger French quantum technology programme can be expected to follow.

China is the location focusing minds in many other countries. The NLQIS in Hefei will be the world’s largest quantum research facility. This HQ location will focus on photonic, NV diamond and silicon spin qubit technology, as well as quantum communications and quantum sensing. The NLQIS Beijing branch will focus on theory, trapped ion and topological qubits. The Shanghai branch will focus on superconducting qubits, neutral atoms and free-space quantum communication. China is already a global leader in quantum communication technology, and its position in quantum computing is set to advance.

Important to nurturing national advantage will be the promotion and protection of IP rights. North America has traditionally led the way on quantum computing patents, and China on quantum communication patents [ ]. However, evolving technology platforms provide an opportunity to challenge this status quo. The scope of what can be protected and the length and value of that protection varies by geography. The prospect of long timescales before full commercial exploitation further complicates strategy. Fact Based Insight thinks this remains an underdeveloped aspect of many national quantum programmes.

High technology translates into both economic and geopolitical power. Fact Based Insight believes it will become increasingly important to look not just at the individual quantum computing integrator, but also the ecosystem in which they are embedded.  In this race there may be a global winner, but with so many countries keen to develop their own capability there will also be significant rewards for fast followers.

Actions for Business

Here we list the key indicators, metrics and milestones that Fact Based Insight is looking for. It’s an imperfect mix and we trust that better measures will emerge as the industry grows. Investors and early business adopters should consider how our view compares to the roadmaps of their own portfolio companies and partners.

For those setting out

  1. Demonstrate high fidelity qubits in the lab:
    • Lead indicators: Qubit lifetimes, T1 and T2
    • Key metrics: 2Q gate fidelity, T2/gate speed
    • Other metrics: SPAM fidelity, 1Q gate fidelity
    • Milestone: 2Q gates at least 99%, but ideally 99.9%+
  2. Operate small 10-100Q multi-qubit devices:
    • Lead indicator: trend in 2Q gate fidelity
    • Metric: QV is a useful measure

For those seeking NISQ quantum advantage

  1. Demonstrate unique capabilities:
    • Indicators: qubit connectivity, native gate set, error mitigation
    • Intermediate milestone: quantum supremacy is a good start, but watch for focus on an artificial problem starting to drive device design decisions
    • Metric: QV probably remains a useful guide, but not an absolute bar
  2. Develop an active programme of partnering with researchers and pioneer business adopters:
    • Indicator: quality is probably more important than quantity
    • Milestone: arXiv papers identifying the scale of problem that can/needs to be tackled
  3. Develop device designs targeted to specific NISQ applications:
    • Lead indicator: explainable intuition about why quantum offers an exponential speedup
    • Indicators: specific algorithmic benchmarks will emerge (VQE, QAOA, GBS, QA, etc)
  4. Deliver value:
    • Lead indicator: showcase demonstrations
    • Milestone: Deployment in routine business use

For those following the road to FTQC

  1. Demonstrate logical qubits
    • Indicator: lifetime longer than any of its individual parts
    • Metric: ratio of physical/logical qubits
    • Milestone: universal set of fault tolerant operations
  2. Develop a blueprint for an optimised hardware/error correcting code architecture:
    • Indicators: threshold analysis for specific code, noise model and connectivity; scalable control solution; fast decoding, low module interconnect latency
    • Metric: footprint size of a 200 logical qubit machine
    • Milestone: firm module design resolving control and environmental issues
  3. Scale-up to target module:
    • Indicators: number of logical qubits
    • Competitive differentiator: logical gate speed (versus other quantum tech at similar scale)
  4. Advanced features:
    • QRAM
    • Quantum Internet interconnect

Fact Based Insight is already looking forward to the next set of conferences tracking the steps of the quantum pioneers: IQT Europe 26-30 Oct and Q2B 8-10 Dec.

References

[1] A. W. Cross, L. S. Bishop, S. Sheldon, P. D. Nation, and J. M. Gambetta, “Validating quantum computers using randomized model circuits,” Phys. Rev. A, vol. 100, no. 3, p. 032328, Sep. 2019, doi: 10.1103/PhysRevA.100.032328. Available: https://link.aps.org/doi/10.1103/PhysRevA.100.032328. [Accessed: Apr. 22, 2020]
[2] “£70 million funding to secure UK position as a world-leader in quantum technology,” GOV.UK. Available: https://www.gov.uk/government/news/70-million-funding-to-secure-uk-position-as-a-world-leader-in-quantum-technology. [Accessed: Jul. 15, 2020]
[3] M. Travagnin, European Commission, and Joint Research Centre, Patent analysis of selected quantum technologies. 2019. Available: http://publications.europa.eu/publication/manifestation_identifier/PUB_KJNA29614ENN. [Accessed: Jul. 15, 2020]
[4] National Academies of Sciences, Engineering, and Medicine, Quantum Computing: Progress and Prospects. 2018. doi: 10.17226/25196. Available: https://www.nap.edu/catalog/25196/quantum-computing-progress-and-prospects. [Accessed: Jun. 23, 2020]
[5] “Technical Roadmap for Fault-Tolerant Quantum Computing | NQIT.” Available: https://nqit.ox.ac.uk/content/technical-roadmap-fault-tolerant-quantum-computing. [Accessed: Jun. 23, 2020]
[6] D. P. DiVincenzo, “The Physical Implementation of Quantum Computation,” arXiv:quant-ph/0002077, Apr. 2000, doi: 10.1002/1521-3978(200009)48:9/11<771::AID-PROP771>3.0.CO;2-E. Available: http://arxiv.org/abs/quant-ph/0002077. [Accessed: Jun. 23, 2020]

Fact Based Insight would like to thank Olivier Ezratty for the use of the qubit images used in this series of posts and refer users who are interested in a French language quantum overview to his upcoming epublication Comprendre l’informatique quantique édition 2020.

Quick Navigation

Overview / Superconducting / Trapped Ions / Silicon Spin / Photonic / NV Diamonds / Neutral Atoms / Topological / Control / Dashboard

David Shaw

About the Author

David Shaw has worked extensively in consulting, market analysis & advisory businesses across a wide range of sectors including Technology, Healthcare, Energy and Financial Services. He has held a number of senior executive roles in public and private companies. David studied Physics at Balliol College, Oxford and has a PhD in Particle Physics from UCL. He is a member of the Institute of Physics. Follow David on Twitter and LinkedIn