Cutting Through the Noise: AI Enables High-Fidelity Quantum Computing
(Phys.org) Researchers led by the Institute of Scientific and Industrial Research (SANKEN) at Osaka University have trained a deep neural network to correctly determine the output state of quantum bits, despite environmental noise. The team’s novel approach may allow quantum computers to become much more widely used.
To make quantum computing reliable enough for widespread use, new readout systems will be needed that can accurately record the output of each qubit even when the signal is buried in noise.
Now, the team of scientists led by SANKEN has used a machine-learning method, a deep neural network, to discern the signal created by the spin orientation of electrons on quantum dots. “We developed a classifier based on a deep neural network to precisely measure a qubit state even with noisy signals,” co-author Takafumi Fujita explains.
In the experimental system, only electrons with a particular spin orientation can leave a quantum dot. When this happens, a temporary “blip” of increased voltage is created. The team trained the machine learning algorithm to pick out these signals from among the noise.
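The idea can be illustrated with synthetic data: traces of sensor voltage either do or do not contain a short "blip," and a small neural network learns to tell them apart despite the noise. The sketch below is a minimal pure-NumPy stand-in, not the authors' actual model; the trace length, noise level, blip shape, and network size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 64  # samples per readout trace (assumed, for illustration)

def make_trace(has_blip):
    """Synthetic charge-sensor trace: white noise, plus a temporary
    voltage 'blip' when a spin-selected electron leaves the dot."""
    trace = rng.normal(0.0, 0.3, T)
    if has_blip:
        start = rng.integers(5, T - 15)
        trace[start:start + 10] += 1.0   # the blip: a 10-sample voltage step
    return trace

# Training set: alternating blip / no-blip traces
X = np.stack([make_trace(i % 2 == 0) for i in range(2000)])
y = (np.arange(2000) % 2 == 0).astype(float)

# Tiny one-hidden-layer network, trained by full-batch gradient
# descent on the binary cross-entropy loss.
W1 = rng.normal(0, 0.1, (T, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, 16);      b2 = 0.0

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)          # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # probability of a blip
    return h, p

lr = 0.5
for _ in range(500):
    h, p = forward(X)
    g = (p - y) / len(y)             # gradient of loss w.r.t. the logit
    gh = np.outer(g, W2) * (h > 0)   # backpropagate through the ReLU
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum()
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

# Evaluate on fresh, unseen traces
X_test = np.stack([make_trace(i % 2 == 0) for i in range(400)])
y_test = (np.arange(400) % 2 == 0).astype(float)
acc = float(((forward(X_test)[1] > 0.5) == y_test).mean())
print(f"held-out accuracy: {acc:.1%}")
```

Even this toy classifier separates blips from background noise reliably once the blip amplitude exceeds the noise floor; the real system faces far messier, drifting interference.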
“Our approach simplified the learning process for adapting to strong interference that could vary based on the situation,” senior author Akira Oiwa says. The team first tested the robustness of the classifier by adding simulated noise and drift. Then, they trained the algorithm to work with actual data from an array of quantum dots, and achieved accuracy rates over 95%. The results of this research may allow for the high-fidelity measurement of large-scale arrays of qubits in future quantum computers.
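A common way to test robustness of this kind is to corrupt clean traces with extra simulated interference before they are fed to the classifier. The snippet below sketches one plausible form of such augmentation, adding white noise and a slow random-phase baseline drift; the specific noise model and parameters are assumptions for illustration, not the team's published procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def augment(trace, noise_sigma=0.2, drift_amp=0.3):
    """Corrupt a readout trace with extra white noise and a slow,
    random-phase sinusoidal baseline drift (one period per trace)."""
    t = np.arange(len(trace))
    phase = rng.uniform(0.0, 2.0 * np.pi)
    drift = drift_amp * np.sin(2.0 * np.pi * t / len(trace) + phase)
    return trace + drift + rng.normal(0.0, noise_sigma, len(trace))

# Example: corrupt a flat (blip-free) trace
clean = np.zeros(64)
noisy = augment(clean)
print(f"std of corruption: {noisy.std():.3f}")
```

Evaluating the classifier on augmented traces like these shows how much interference it tolerates before the qubit-state assignment degrades.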