Inside Quantum Technology

Quantum Hardware Outlook 2020: Part One

(Fact Based Insight) Welcome to IQT-News’ first article in the series “Helping Businesses Prepare for Tomorrow” from Fact Based Insight.

2019 saw Google finally demonstrate quantum supremacy. Competitors such as IBM reminded us that this was just an opening skirmish in what will be a long campaign. In 2020 we will see paths divide as competing companies and technologies face up to the quantum chasm blocking the way to large-scale quantum computers.

The Hard Road to Quantum Computation

In the end, Google was able to use 53 superconducting qubits on its Sycamore device to perform a random sampling calculation in 200 seconds that it projected would take the world’s most powerful supercomputer 10,000 years [71]. IBM pointed out that, with some ingenuity and plenty of brute force, this could perhaps be brought down to 2.5 days. However, the essential scientific and engineering point of Google’s success remains: computation with insanely complex quantum amplitudes actually works. The extended Church–Turing thesis has been challenged, and quantum algorithms will inevitably outperform their conventional counterparts whenever appropriate computational complexity is present.
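
To make the nature of the benchmark concrete, the sketch below (a toy illustration, not Google’s actual Sycamore circuit or code) samples bitstrings from the output of a small random quantum circuit simulated with NumPy. The state vector holds 2^n complex amplitudes, which is precisely why brute-force classical simulation becomes impractical around 50 qubits.

# Toy random-circuit sampling, the task behind the supremacy demonstration.
# Sycamore used 53 qubits; 6 keeps this runnable on a laptop.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 6
dim = 2 ** n_qubits
state = np.zeros(dim, dtype=complex)
state[0] = 1.0                     # start in |00...0>

def random_two_qubit_gate():
    # Haar-random 4x4 unitary via QR decomposition of a complex Gaussian matrix
    z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_gate(state, gate, q0, q1, n):
    # Apply a two-qubit gate to qubits q0 and q1 of an n-qubit state vector
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, (q0, q1), (0, 1)).reshape(4, -1)
    psi = (gate @ psi).reshape([2, 2] + [2] * (n - 2))
    psi = np.moveaxis(psi, (0, 1), (q0, q1))
    return psi.reshape(-1)

# A few layers of random two-qubit gates on neighbouring qubits
for layer in range(8):
    for q in range(layer % 2, n_qubits - 1, 2):
        state = apply_gate(state, random_two_qubit_gate(), q, q + 1, n_qubits)

# Sample output bitstrings with probability |amplitude|^2
probs = np.abs(state) ** 2
probs /= probs.sum()
samples = rng.choice(dim, size=10, p=probs)
print([format(int(s), f"0{n_qubits}b") for s in samples])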

Overall, however, 2019 reminded us of the considerable engineering challenges that quantum hardware developers still face. 2018 saw an enthusiastic flurry of chip announcements with ever-climbing qubit counts, from Intel’s 49-qubit Tangle Lake to Google’s 72-qubit Bristlecone and Rigetti’s 128-qubit Aspen. Notably, we still haven’t seen performance results from any of those designs. Indeed, the fact that Google made a slightly compromised quantum supremacy announcement based on the smaller Sycamore device suggests that it has been difficult to achieve the required level of performance with Bristlecone. As IBM explains, controlling crosstalk is a key challenge for these circuits, currently requiring trial and error in individual unit fabrication [60].

IBM has been notable amongst competitors in providing detailed measured performance specs for many of the devices it has developed. It also promotes the measurement of ‘quantum volume’ (QV) as a more insightful indication of device capability, one that accounts for gate and measurement errors, crosstalk, connectivity and circuit compiler efficiency. The IBM Q roadmap seeks to double QV each year, a milestone it again achieved this year: the fourth-generation 20-qubit processor in the IBM Q System One delivers a QV of 16, compared to the QV of 8 demonstrated by the IBM Q 20 Tokyo processor.
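
For readers who want to pin the metric down: in IBM’s published definition, quantum volume is set by the largest ‘square’ random model circuit (equal width and depth m) that a device can run while still passing a heavy-output test with probability above two-thirds. Roughly,

\[
  \log_2 \mathrm{QV} \;=\; \max_{m}\,\min\bigl(m,\, d(m)\bigr)
\]

where d(m) is the largest depth at which width-m model circuits still pass. On that reading, QV 8 = 2^3 corresponds to passing width-and-depth-3 circuits and QV 16 = 2^4 to width-and-depth-4 circuits, which is exactly the generation-on-generation doubling IBM’s roadmap targets.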

Crossing the Quantum Chasm

In 1943 Colossus became the world’s first programmable electronic computer. But the modern digital revolution only gathered pace when transistors replaced valves to give us reliable digital bits and integrated circuits replaced wires to allow efficient scaling-up.

2019 saw a wide range of alternative qubit technologies and architectures continue to be developed. Each offers different trade-offs in terms of qubit lifetime, fidelity, connectivity, gate speed and ease of scalability. Each is at a different stage of development. None are mature.

Most experts expect the field to pass through an era of experimentation with a wide variety of noisy intermediate-scale quantum (NISQ) devices and then on to a longer-term future where increased scale and the application of quantum error correcting codes give us fault-tolerant quantum computation (FTQC).
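
To give a flavour of what an error correcting code buys, the toy Monte Carlo sketch below applies the simplest possible scheme, a three-qubit repetition (bit-flip) code with majority voting, under an assumed physical error rate. Real FTQC codes such as the surface code must also handle phase errors and cost far more qubits, but the basic trade of redundancy for reliability is the same.

# Toy illustration: many noisy physical qubits encode one more reliable
# logical qubit. The physical error rate below is an assumption for
# illustration, not a measured figure from any device.
import numpy as np

rng = np.random.default_rng(1)
p_physical = 0.05        # assumed bit-flip probability per physical qubit
trials = 200_000

# Each trial: three copies of a logical 0; each copy flips with probability p.
flips = rng.random((trials, 3)) < p_physical
# Majority voting fails only when two or three of the copies flipped.
logical_errors = flips.sum(axis=1) >= 2

print(f"physical error rate : {p_physical:.4f}")
print(f"logical error rate  : {logical_errors.mean():.4f} "
      f"(theory ~ {3 * p_physical**2 - 2 * p_physical**3:.4f})")
# Below a threshold error rate, adding redundancy suppresses errors --
# the principle FTQC relies on, at the cost of many extra physical qubits.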

To try to achieve practical results with NISQ devices, groups can target specific characteristics: for example, the very best fidelity at the expense of scalability (potentially an advantage for laser-driven trapped ions), or optimisation for specific problems at the expense of full programmability (a niche for special-purpose quantum computers such as digital and analogue quantum simulators). The quantum annealing approach pursued by D-Wave is a parallel example: it does not use the gate-model architecture adopted by other competitors, instead focusing on a particular class of problems, as sketched below.
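
As a concrete illustration of the problem class annealers target, the sketch below writes a tiny optimisation problem in QUBO form (quadratic unconstrained binary optimisation), the native input format for quantum annealers, and solves it by classical brute force; the matrix values are purely illustrative and not taken from any real D-Wave workload.

# QUBO sketch: the annealer searches for the bit assignment minimising
# the energy x^T Q x. Here a tiny instance is brute-forced classically
# just to show the formulation.
import itertools
import numpy as np

# Q encodes the objective: diagonal = linear terms, off-diagonal = couplings.
Q = np.array([
    [-1.0,  2.0,  0.0],
    [ 0.0, -1.0,  2.0],
    [ 0.0,  0.0, -1.0],
])

best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits)
    energy = x @ Q @ x          # QUBO energy x^T Q x
    if energy < best_e:
        best_x, best_e = bits, energy

print(f"lowest-energy assignment: {best_x}, energy {best_e}")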

To pursue the quickest path to FTQC, groups must target more balanced goals. Qubits of sufficient fidelity and sufficient connectivity need to be paired with a suitable error correcting protocol. The protocol cycles required to operate multiple physical qubits as one logical qubit need to be optimised. A practical device architecture is required that allows both the quantum and conventional components to be scaled-up in harmony. A microarchitecture that supports specialised functions such as magic state production and QRAM needs to be developed. Breakthroughs in error correction, still a rapidly developing field [59], may be just as important as progress on hardware.
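
To see where “millions of physical qubits” comes from, the rough sketch below estimates the surface-code overhead needed to reach a demanding logical error rate, using the commonly quoted scaling for the logical error rate and an approximate count of about 2d² physical qubits per logical qubit; the error rates, prefactor and qubit counts are assumptions for illustration, not measured figures.

# Back-of-the-envelope estimate of FTQC overhead. Assumes the commonly
# quoted surface-code scaling p_logical ~ A * (p / p_th)**((d + 1) / 2)
# with illustrative constants, and roughly 2*d**2 physical qubits per
# logical qubit. Real overheads depend on the code, decoder and algorithm.
p_physical = 1e-3        # assumed physical error rate per operation
p_threshold = 1e-2       # assumed surface-code threshold
A = 0.1                  # illustrative prefactor
target_logical = 1e-12   # per-operation logical error budget for a long algorithm

d = 3
while A * (p_physical / p_threshold) ** ((d + 1) / 2) > target_logical:
    d += 2               # surface-code distance is conventionally odd

physical_per_logical = 2 * d ** 2
logical_qubits = 1000    # e.g. a modest fault-tolerant algorithm
print(f"code distance d       : {d}")
print(f"physical per logical  : ~{physical_per_logical}")
print(f"total physical qubits : ~{physical_per_logical * logical_qubits:,}")
# With these assumptions the total lands around a million physical qubits --
# the scale of the chasm between today's devices and FTQC.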

Fact Based Insight believes differing strategies are likely to emerge and that this will not be a linear competition. Only those groups with the very deepest investment pockets will be able to pursue all NISQ and FTQC goals optimally at the same time. A diversity of approaches will continue.

In his annual State of the Quantum Nation address at Q2B 2019, John Preskill (Caltech) continued to urge a focus on the long-term prize: “We must cross the ‘quantum chasm’ from hundreds to millions of physical qubits. Mainstream users may need to be patient … Progress toward FTQC must continue to be a high priority for quantum technologists.”

Next Week in the Series: What to Watch in Quantum Hardware in 2020
