Quantum Annealers: A First Step on the Road to Quantum Computer Commercialization
Annealing, the process of improving the quality of metals and alloys by heating them and then slowly cooling them, is possibly one of the oldest optimization techniques known to mankind. With the world of big data growing each day, algorithms inspired by this ancient technique have emerged. Simulated annealing algorithms are an important tool for solving complex optimization problems where a quick heuristic solution is preferable to an exact one that simply takes too long to find. One example is the travelling salesman problem: finding the shortest closed tour that visits every city in a given set.
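To make the idea concrete, here is a minimal sketch (not any vendor's implementation) of simulated annealing applied to a tiny travelling salesman instance. All function names, the cooling schedule, and the sample points are illustrative choices: worsening moves are accepted with probability exp(-delta/T), and the "temperature" T is lowered slowly, mirroring the metallurgical process.

```python
import math
import random

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    n = len(order)
    return sum(math.dist(points[order[i]], points[order[(i + 1) % n]])
               for i in range(n))

def simulated_annealing_tsp(points, t_start=10.0, t_end=1e-3, cooling=0.999, seed=0):
    """Heuristic TSP solver using 2-opt moves under a geometric cooling schedule."""
    rng = random.Random(seed)
    order = list(range(len(points)))
    rng.shuffle(order)
    best = order[:]
    t = t_start
    while t > t_end:
        i, j = sorted(rng.sample(range(len(points)), 2))
        # 2-opt move: reverse the segment between two random positions.
        candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        delta = tour_length(points, candidate) - tour_length(points, order)
        # Always accept improvements; accept worsening moves with prob. exp(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / t):
            order = candidate
            if tour_length(points, order) < tour_length(points, best):
                best = order[:]
        t *= cooling  # slow "cooling" of the acceptance temperature
    return best

points = [(0, 0), (0, 1), (1, 1), (1, 0), (2, 0), (2, 1)]
best = simulated_annealing_tsp(points)
print(best, tour_length(points, best))
```

High temperatures let the search escape local minima early on; as the temperature drops, the search settles into a good (though not guaranteed optimal) tour.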
Quantum annealers make use of quantum mechanics, allowing an array of qubits to evolve naturally according to the Schrödinger equation under an external field. The qubits end up, with high probability, in a configuration corresponding to the solution of the problem in question. Quantum annealers have already been shown to outperform their classical counterparts on certain problems.
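What "a configuration corresponding to the solution" means in practice is that the problem is encoded as an energy function over qubit states, commonly an Ising objective of the form E = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j, and the annealer seeks its lowest-energy configuration. The sketch below is a purely classical, brute-force illustration of such an objective on a toy instance (it does not simulate quantum dynamics); the variable names and the example couplings are ours.

```python
import itertools

def ising_energy(h, J, spins):
    """Ising energy: E = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j, with s_i in {-1, +1}."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Toy instance: three spins with all-positive (antiferromagnetic) couplings,
# a "frustrated triangle" in which not all pairs can disagree at once.
h = [0.0, 0.0, 0.0]
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}

# Brute force over all 2^n configurations to find a ground state.
ground = min(itertools.product([-1, 1], repeat=3),
             key=lambda s: ising_energy(h, J, s))
print(ground, ising_energy(h, J, ground))  # energy -1.0
```

A quantum annealer performs this minimization physically rather than by enumeration, which is what makes it attractive once the number of variables grows far beyond what brute force can handle.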
Put another way, quantum annealers are a station on the way to the real quantum computer. The real thing would be able to implement Shor's Algorithm, for example, and would therefore be able to unleash a torrent of fears about quantum decryption across the dataverse. Nonetheless, many data-driven companies are invested in the technology, including Google, NASA, 1QBit, the USRA and DNA-SEQ. Quantum annealers are becoming involved in solving real-world problems, in domains such as finance and cancer research, as well as, most likely, in the military and intelligence communities.
The first commercial quantum annealer appeared on the market in 2011. D-Wave One, produced by D-Wave Systems, was built around a 128-qubit processor. Despite the seemingly small number of qubits, an analytical study showed that the annealing process performed as well as a highly optimized classical annealing algorithm running on a high-end Intel CPU.
Continuing to improve its devices, the company released the D-Wave 2X in 2015, this time utilizing 1,000 qubits. This model has already shown significant quantum speedup on certain problems. Google, operating the 2X model jointly with NASA, reported an improvement in performance by a factor of 10^8 when compared to classical simulated annealing and Quantum Monte Carlo, a family of algorithms that simulate quantum annealing on classical hardware. Other firms are also beginning to develop quantum annealers, notably Fujitsu.
Even if quantum annealers are not quite quantum computers, there are many applications for which they can already prove useful, and more will be found. In addition, Inside Quantum Technology believes that quantum annealers serve the function of preparing the market for more advanced quantum machines in the future. And we suspect that, depending on price points, quantum annealers will be around for a while.
However, there may be a technological limit on the scalability of quantum annealing systems. A recent study by the University of Southern California has shown a link between the size of a problem's input and the maximum temperature at which an annealer can operate and still solve it. D-Wave devices already operate at incredibly low temperatures, around 0.015 K, and lowering the temperature further may prove technically difficult. That being said, low-temperature solutions are important for all domains of quantum technology, meaning that progress is likely only a question of time.
To learn more about quantum annealers and how quantum technology fits into big data problems, attend the Inside Quantum Technology Conference, which will be held at the Hynes Convention Center, Boston, March 19-21. Also note that later in 2019 Inside Quantum Technology will be publishing a report on the hardware and software strategies of leading quantum computer firms.