**(FactBasedInsights.QuantumAlgorithmsOutlook2020:PartTwo)**

Welcome back to this IQT-News series from FactBasedInsights. Today’s edition is the second half of Dr. David Shaw’s discussion of Quantum Algorithms.

**Quantum Simulation – The Killer App**

The ability to perform calculations in quantum chemistry, physics and materials design is the original, and for many still the slam-dunk killer application for quantum computers. This may sound niche to the layperson, but it shouldn’t. Chemistry can be dull, but it’s vitally important for almost every aspect of the modern world, from pharmaceuticals to industrial processes to renewables technologies. The only reason we don’t routinely use such quantum chemistry calculations is that their native quantum complexity is beyond conventional computers.

In 2019 Google presented detailed work on what it would take to perform calculations on a large molecule of significant interest. FeMoco (Fe7MoS9C) is the active site within the nitrogenase enzyme responsible in nature for the catalytic conversion of nitrogen gas into ammonia (fertilizer). If we could better understand FeMoco, then potentially we could improve the energy-intensive Haber process used for the industrial production of fertilizer. This would not just be a commercial breakthrough, but also one with very positive implications for feeding the planet and limiting climate change. Google’s current work shows how an error-corrected device of about 1 million qubits could be used to perform the relevant calculations. Unfortunately, even if further progress brings that down to 100,000 qubits, this approach still requires FTQC.

It’s no surprise that much work is going into what might be achieved in the NISQ era. Promising techniques pursued in 2019 included the hybrid quantum-classical VQE; error mitigation (e.g. measuring and extrapolating out the effect of errors); and stabiliser techniques (e.g. spotting errors that violate physics in the system being simulated).

Variational Quantum Eigensolver (VQE) is a technique that can be used to estimate molecular ground state energies. Various groups are working to extend this to excited states, opening up applications in predicting optical spectra and reaction rates [49]. Riverlane have extended VQE into a Variational Quantum Deflation (VQD) algorithm for excited state calculations. CQC have shown how symmetries can be exploited to allow calculations for the CH2 radical. IBM has leveraged simplifications inspired by conventional computational chemistry techniques to execute trial calculations for the LiO2 dimer.
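The core VQE loop is easy to emulate classically. The sketch below, a toy with a hypothetical 2x2 Hamiltonian and a one-parameter ansatz (not any published molecular calculation), tunes a trial state so that its energy expectation approaches the ground-state energy; on real hardware the energy would instead be estimated from repeated circuit measurements, with a classical optimiser in the outer loop.

```python
# Toy classical emulation of the VQE loop. The ansatz is a single-parameter
# real state |psi(t)> = (cos(t/2), sin(t/2)); a grid search stands in for
# the classical optimiser. All numbers are illustrative only.
import numpy as np

H = np.array([[-1.05, 0.39],
              [ 0.39, -0.40]])        # hypothetical 2x2 "molecular" Hamiltonian

def energy(theta):
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi              # <psi|H|psi> for a real state

thetas = np.linspace(0, 2 * np.pi, 10001)
vqe = min(energy(t) for t in thetas)          # variational minimum
exact = np.linalg.eigvalsh(H)[0]              # true ground-state energy
print(f"VQE estimate {vqe:.5f} vs exact {exact:.5f}")
```

By the variational principle the estimate approaches the exact ground-state energy from above; the same structure carries over when the ansatz is a parametrised quantum circuit.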

As things stand, commercially useful calculations in quantum chemistry and materials design applications are a possibility but not a certainty in the NISQ era.

**Scenario Simulation – a flexible analytic tool**

Monte Carlo simulation methods are used frequently in scientific and analytic applications and in particular in the Financial Services industry.

Quantum Amplitude Estimation (QAE) can be used to provide a (quadratic) speedup over traditional Monte Carlo simulation techniques. In 2019 IBM demonstrated how QAE can be used to execute common financial portfolio tasks, including pricing a security and calculating the common risk metrics VaR and CVaR [83]. Impressively, the pricing demonstration was run on a real 5-qubit Yorktown processor. While not yet out-performing conventional processing, this experiment does point to specific applications that businesses will readily understand.
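The speedup QAE targets is concrete: classical Monte Carlo pricing converges at roughly 1/√N in the number of samples, whereas QAE promises roughly 1/N in the number of quantum queries. The sketch below shows only the classical side, pricing a toy option payoff with entirely hypothetical numbers (it is not IBM's experiment); the quantum part would replace the sampling loop with an amplitude-estimation circuit.

```python
# Classical Monte Carlo pricing of a toy European call payoff
# E[max(S - K, 0)] under a simple lognormal model. Accuracy improves as
# ~1/sqrt(N); QAE aims for ~1/N. All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
S0, K, sigma = 2.0, 1.9, 0.4          # spot, strike, volatility (toy values)

def mc_price(n):
    # draw n lognormal terminal prices with mean S0, then average the payoff
    s = S0 * np.exp(sigma * rng.standard_normal(n) - sigma ** 2 / 2)
    return float(np.mean(np.maximum(s - K, 0.0)))

for n in (10 ** 2, 10 ** 4, 10 ** 6):
    print(f"N = {n:>7}: price estimate {mc_price(n):.4f}")
```

Quadrupling the sample count only halves the statistical error, which is exactly the bottleneck QAE attacks.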

Again realisable benefits are not yet proven, but they may be accessible with improved NISQ devices.

Bob Sutor (IBM) comments “This is a great time for businesses to get quantum ready. Work with your team to identify challenges in your current business that depend on known bottlenecks like Monte Carlo simulation. Let your team learn how quantum helps and build the know-how to seize quantum advantage as improved hardware becomes available.”

**Optimisation – a wide area of opportunity**

Optimisation problems are a very general area where quantum algorithms promise significant speedups (though typically quadratic rather than exponential ones). Interest in this area is strong because of the obvious fit with a very wide range of business problems, from logistics to portfolio management, from feedstock planning to managing telecoms networks.

Several well-understood algorithms promise wide applicability (e.g. Amplitude Amplification, including Grover’s algorithm). Unfortunately, these require that the input data be stored in QRAM, which threatens to introduce non-trivial input/output bottlenecks. Little commercial activity has yet addressed the challenge of engineering QRAM, and most place it on a timeline similar to FTQC.
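The amplitude amplification mechanism itself (as opposed to the QRAM data-loading problem) is easy to sketch with a plain statevector. The toy below runs Grover search over 8 items for an arbitrarily chosen marked index; two iterations boost the marked amplitude from 1/√8 to near 1.

```python
# Grover search over N = 8 items, simulated directly on the statevector.
# Each iteration applies the oracle (sign flip on the marked item) followed
# by inversion about the mean. The marked index is an arbitrary choice.
import numpy as np

N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))     # uniform superposition

for _ in range(2):                     # ~ (pi/4) * sqrt(N) iterations
    state[marked] *= -1                # oracle: flip the marked amplitude
    state = 2 * state.mean() - state   # diffusion: inversion about the mean

print(f"P(marked) = {state[marked] ** 2:.3f}")
```

A classical search needs ~N/2 lookups on average; the quadratic saving here is the same order of advantage promised for the broader optimisation applications, provided the data can be loaded.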

Again hybrid quantum-classical approaches are being considered for the NISQ era. Promising techniques pursued in 2019 included Quantum Annealing (QA) and the Quantum Approximate Optimisation Algorithm (QAOA).

Quantum Annealing (QA) seeks to solve a particular class of (binary) optimisation problems. The D-Wave 2000Q system uses this approach for optimisation problems and also for many other problems that can be mapped onto this model. Some pilot studies, such as Volkswagen’s recent successful field trial routing buses through Lisbon traffic, look close to routine operational deployment [73]. The success of this approach has spurred competition from quantum-inspired conventional systems such as Fujitsu’s Digital Annealer and Toshiba’s SBM.
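The "mapping onto this model" step means expressing a problem as a QUBO (quadratic unconstrained binary optimisation). The sketch below is a made-up miniature of a routing-style choice, not Volkswagen's formulation: pick exactly one of three routes, with the one-hot constraint folded in as a quadratic penalty. Brute force stands in for the annealer here.

```python
# Toy QUBO formulation: choose exactly one of three routes to minimise
# travel time. The "exactly one" constraint is encoded as the penalty
# P * (x0 + x1 + x2 - 1)^2, written out directly rather than as an
# explicit Q matrix. All costs are hypothetical.
from itertools import product

P = 10.0                               # penalty weight (must dominate costs)
cost = [3.0, 2.0, 4.0]                 # per-route travel times (toy values)
n = len(cost)

def qubo_energy(x):
    travel = sum(c * xi for c, xi in zip(cost, x))
    return travel + P * (sum(x) - 1) ** 2

best = min(product((0, 1), repeat=n), key=qubo_energy)
print(best, qubo_energy(best))         # cheapest single route wins
```

An annealer (or a quantum-inspired system like the Digital Annealer) would take the same energy function and search the binary assignments physically rather than exhaustively.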

Quantum Approximate Optimisation Algorithm (QAOA) is a technique designed to tackle (binary) optimisation on gate-model NISQ devices. Recently IBM and Barclays have collaborated to show how QAOA can be generalised to problems that include binary variables, continuous variables and constraints such as inequalities [84]. This potentially opens up a wider range of business problems. The fascinating example identified by Barclays is the back office financial services task of optimising financial settlements processing.
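A depth-1 QAOA circuit can be simulated in a few lines of numpy, without any quantum SDK. The sketch below, a generic textbook-style example rather than the IBM/Barclays work, tackles MaxCut on a triangle: a cost layer phases each bitstring by its cut value, a mixer layer applies RX rotations, and a grid search over the two angles stands in for the classical outer loop.

```python
# Depth-1 QAOA for MaxCut on a 3-node triangle, simulated on a statevector.
import numpy as np

n, edges = 3, [(0, 1), (1, 2), (0, 2)]

# cut value of each computational basis state (integer bitstring z)
C = np.array([sum((z >> i & 1) != (z >> j & 1) for i, j in edges)
              for z in range(2 ** n)], dtype=float)

def expected_cut(gamma, beta):
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
    psi *= np.exp(-1j * gamma * C)                     # cost layer (diagonal)
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    mixer = rx
    for _ in range(n - 1):
        mixer = np.kron(mixer, rx)                     # RX on every qubit
    psi = mixer @ psi
    return float(np.sum(np.abs(psi) ** 2 * C))         # <cut> under sampling

grid = np.linspace(0, np.pi, 60)
g, b = max(((g, b) for g in grid for b in grid),
           key=lambda p: expected_cut(*p))
print(f"best <cut> = {expected_cut(g, b):.3f} (random guessing gives 1.5)")
```

Even at depth 1 the optimised angles beat random guessing, and the same variational structure extends to the constrained, mixed-variable formulations discussed above.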

Clients already seem to be benefiting just from the challenge of thinking about their optimisation problems in new ways. Whether we will see provable quantum hardware advantage in the NISQ era is again not a certainty.

**Machine learning – hot but unproven**

Quantum machine learning is considered by many to be a major opportunity area. However again, the most often cited algorithms (e.g. HHL) require the availability of QRAM and so are probably out of immediate reach.

The opportunity currently most discussed for NISQ-era applications is hybrid quantum-classical training, particularly where it might be beneficial to recognise uniquely quantum correlations in datasets. Where might such datasets be found? Intriguingly, this may be just the challenge we face when working with sampled output from variational algorithms such as VQE, or in controlling and interpreting the output of quantum sensors.

In 2019 Rigetti emphasised their strategy of providing an execution environment specially tailored for the iterative execution of hybrid quantum-classical variational algorithms on near-term devices.

Variational Quantum Linear Solver (VQLS) is a new algorithm for solving linear systems (a common task in machine learning) developed by members of the Rigetti community.
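The details of VQLS are not given here, but the variational idea behind it can be sketched with a toy classical analogue (not the actual VQLS circuits or cost function): parametrise a trial vector and minimise the residual of the linear system, with a grid search standing in for the optimiser. All numbers below are hypothetical.

```python
# Toy classical analogue of a variational linear solver: find x minimising
# ||A x - b|| where x = scale * (cos t, sin t) is a two-parameter "ansatz".
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 0.0])

def trial(theta, scale):               # normalised direction times a scale
    return scale * np.array([np.cos(theta), np.sin(theta)])

def residual(theta, scale):
    return np.linalg.norm(A @ trial(theta, scale) - b)

thetas = np.linspace(0, 2 * np.pi, 400)
scales = np.linspace(0, 1, 101)
t, s = min(((t, s) for t in thetas for s in scales),
           key=lambda p: residual(*p))
x = trial(t, s)
print(x, np.linalg.solve(A, b))        # variational answer vs direct solve
```

In the quantum setting the ansatz is a parametrised circuit and the residual is estimated from measurements, which is what makes the approach a candidate for NISQ devices.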

Rigetti have also been working with reinforcement learning to train an AI to create quantum programmes for other applications such as combinatorial optimisation. Trials with an Aspen quantum processor have demonstrated variational quantum programmes that are shorter and more robust to noise.

There are many exciting ideas in quantum machine learning. However, the extent to which value can be delivered by NISQ devices is not proven.

**Quantum winter**

FTQC certainly promises remarkable business and societal value. Some quantum-inspired value is already being seen by early adopters. At least some early quantum applications seem set to follow. However, there remains a danger that, amid the hype, the actual bottom-line value delivered in the NISQ era will disappoint.

This could be a long wait. Some have speculated that the sector will have to endure a quantum winter while early investors wait to see returns. In the absence of progress such a chill could suddenly make it difficult for emerging players to secure seed and follow-on investment.

**To watch in 2020**

*Provably Random Numbers* – Google has bought the IP rights for Aaronson’s (still unpublished) protocol on a non-exclusive basis. Will Google use one of its new quantum devices to launch a remotely certifiable public randomness service? Unlike the NIST randomness beacon, we wouldn’t have to trust the operator.

*Proof-of-quantum blockchain* – Will someone work-up Provable Random Numbers into a fully-fledged proof-of-stake blockchain proposal? Cryptocurrencies are a potential rival to sovereign fiat currencies and people of good conscience differ markedly on whether that is a good or a bad thing. Expect controversy.

*Hybrid quantum-classical heuristics* – Variational techniques such as VQE, QAE and QAOA show promise. Expect to see algorithms being demonstrated on real devices for increasingly non-trivial cases. Watch out for specific indications of the scale of calculation and required NISQ device specs to address commercially relevant problems.

*Quantum annealing* – D-Wave’s strategy has a characteristic focus on user-led trials, now standing at 200+, with several near production-ready: traffic routing optimisation (Volkswagen), factory automation routing (Denso), hotel recommendations (Recruit Group). D-Wave Leap is in pole position to be the first quantum cloud service to host a quantum algorithm in routine operational business use.

*Quantum machine learning* – Like conventional machine learning this field will likely proceed very much through trial and error. Watch out for experimentation with increasingly powerful early processors and environments making a difference.

*Quantum memory* – This key part of the quantum hardware architecture has so far seen little open commercial activity. Startups Bra-Ket and ColdQuanta have been associated with aspirations in this area. Could we see an announcement that puts this, and the algorithms that depend on it, on a firm roadmap?

*Computational complexity frontier* – This remains a notoriously difficult area. Recent claims of a proof of the long-standing Aaronson-Ambainis conjecture, that [exponential] quantum speedups require structure, have been withdrawn. Research into computational complexity remains very active. Any big breakthrough on a foundational question such as BQP vs NP-complete, BQP vs NP∩co-NP, or even P vs NP itself might baffle non-experts but have big implications for the quantum sector.

**Overview – Next**

**Stay Tuned for Next Week in the Series: Quantum Software Outlook**