
Applying Particle Physics Methods to Quantum Computing at Berkeley Lab

By IQT News posted 06 Nov 2020

(Phys.org) A team of physicists and computer scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error-reduction technique to the field of quantum computing.
Ben Nachman, a Berkeley Lab physicist who is involved with particle physics experiments at CERN as a member of Berkeley Lab’s ATLAS group, saw the quantum-computing connection while working on a particle physics calculation with Christian Bauer, a Berkeley Lab theoretical physicist who is a co-author of the study. ATLAS is one of the four giant particle detectors at CERN’s Large Hadron Collider, the largest and most powerful particle collider in the world.
“At ATLAS, we often have to ‘unfold,’ or correct for detector effects,” said Nachman, the study’s lead author. “People have been developing this technique for years.”
“We realized that current quantum computers are very noisy, too,” Nachman said, so finding a way to reduce this noise and minimize errors, known as error mitigation, is key to advancing quantum computing. “One kind of error is related to the actual operations you do, and one relates to reading out the state of the quantum computer,” he noted. The first kind is known as a gate error, and the second as a readout error.
The latest study focuses on a technique to reduce readout errors, called “iterative Bayesian unfolding” (IBU), which is familiar to the high-energy physics community. The study compares the effectiveness of this approach to other error-correction and mitigation techniques.
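For readers curious about the mechanics, below is a minimal sketch of the iterative Bayesian unfolding update applied to readout counts, written in plain Python with NumPy. The function name, response matrix, and toy numbers are illustrative assumptions for this article, not code from the Berkeley Lab study; in practice the response matrix would be estimated by preparing each computational basis state on the quantum computer and recording how often each outcome is actually read out.

import numpy as np

def iterative_bayesian_unfolding(measured, response, n_iters=10, prior=None):
    """Estimate the true counts behind noisy readout results.

    measured : length-N vector of observed counts over the basis states.
    response : N x N matrix with response[j, i] = P(measure j | prepared i),
               estimated from calibration circuits (hypothetical input here).
    """
    n = len(measured)
    # Start from a flat prior unless the caller supplies one.
    t = np.full(n, measured.sum() / n) if prior is None else np.asarray(prior, float)
    for _ in range(n_iters):
        folded = response @ t                                   # expected measured spectrum
        posterior = (response * t[None, :]) / folded[:, None]   # P(true i | measured j)
        t = posterior.T @ measured                               # Bayes-updated truth estimate
    return t

# Toy single-qubit example with asymmetric readout error (illustrative numbers).
# Columns correspond to prepared states |0>, |1>; rows to measured outcomes 0, 1.
R = np.array([[0.97, 0.08],
              [0.03, 0.92]])
true_counts = np.array([200.0, 800.0])
measured = R @ true_counts                                       # what noisy readout reports on average
print(iterative_bayesian_unfolding(measured, R, n_iters=20))     # recovers roughly [200, 800]

The update is the standard iterative Bayesian (D'Agostini-style) unfolding rule used in high-energy physics: fold the current truth estimate through the response matrix, apply Bayes' theorem to get the probability of each true state given each measured outcome, and redistribute the measured counts accordingly.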
