Inside Quantum Technology

New Research from Alice & Bob Shows Abilities of Cat Qubits for Error Correction and Error Prevention

New research from Alice & Bob uses cat qubits to boost error correction within a quantum computing system.


Because quantum computing systems are so fragile, errors are frequent. Quantum companies worldwide are developing methods to avoid errors or to make their hardware more error resistant. While many of these businesses focus on reducing noise, Alice & Bob, a Paris-based quantum company, is instead looking at the qubits themselves. The company is developing what are known as cat qubits, primary components that may be the key to combining error prevention with error correction; a recent paper by the Alice & Bob team shows that cat qubits offer a potential solution to both problems. “These qubits belong to the bosonic class of qubits,” explained Jérémie Guillaud, the company’s Chief of Theory. “This class of qubits is unique because they have a specific mathematical structure that provides room to protect any quantum information that is encoded.” Because of these unique properties, the researchers at Alice & Bob are hopeful that their system can make significant advances in both error correction and error prevention.

Using Cat Qubits for Error Prevention and Error Correction

Guillaud believes that cat qubits are not only beneficial for error correction but can also help in designing a system with error prevention in mind. “There are two paradigms in the current community,” Guillaud explained. “The first is to improve the hardware without error correction. This means no error prevention either, and usually refers to the NISQ era. The other paradigm focuses on error correction and error prevention of the system itself. Error correction, once operated below its threshold, gives you a potential suppression of the logical error rate. So, you can very quickly get access to much lower error rates.” Because experts have predicted that quantum computing will gain billions of dollars in value once error correction is achieved, the industry has a strong monetary incentive to develop these systems. To that end, Guillaud is moving to the next step of this research: testing the theory in Alice & Bob’s laboratories. “For our cat qubits, the next big milestone is demonstrating the protection of quantum information over macroscopic timescales (seconds/minutes/hours), where it is currently limited to at most milliseconds in the best systems, microseconds being the standard,” he added.
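The "suppression below threshold" Guillaud describes can be illustrated with a rough scaling law. The sketch below is not Alice & Bob's model; it uses the generic surface-code-style relation, where the logical error rate falls as a power of the code distance once the physical error rate is below threshold (the function name and sample numbers are illustrative assumptions).

```python
# Illustrative sketch only: below-threshold error correction suppresses
# the logical error rate exponentially in the code distance d, roughly
# p_L ~ (p / p_th) ** ((d + 1) // 2) for surface-code-like schemes.

def logical_error_rate(p_phys, p_threshold, distance):
    """Rough below-threshold scaling of the logical error rate."""
    return (p_phys / p_threshold) ** ((distance + 1) // 2)

# A physical error rate 10x below an assumed 1% threshold:
p, p_th = 0.001, 0.01
for d in (3, 5, 7):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(p, p_th, d):.0e}")
```

Each step up in distance multiplies the suppression by another factor of ten here, which is why, as Guillaud puts it, "you can very quickly get access to much lower error rates."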

Reaching for Error Correction

In their new paper, the Alice & Bob researchers found that, in theory, using cat qubits reduces the number of qubits needed for error correction. As Guillaud stated: “On a full-size quantum computer, you would have something like 90% of the qubits that would just be to control the spread of errors, and only 10% of qubits somehow doing the work of carrying through the computation.” This means that only a fraction of a quantum computer’s resources go toward the actual computation, making the machine rather inefficient. Current estimates put the number of qubits needed for full error correction at around 20 million, which, Guillaud explained, was the number Google was working toward. But with his team, Guillaud was able to scale that number down to merely 350,000 qubits, making the entire quantum system more manageable and affordable. This has huge implications for those looking to develop error correction protocols: with fewer qubits in the system, there is less interference between qubits, making it easier for a company to scale up the hardware.
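The two figures quoted above imply a sizeable reduction factor. A quick back-of-the-envelope check (the variable names and the 90/10 split below are taken from the article; this is illustrative arithmetic, not a resource estimate):

```python
# Compare the qubit counts cited in the article.
conventional_estimate = 20_000_000  # qubits cited for the conventional approach
cat_qubit_estimate = 350_000        # qubits cited for Alice & Bob's cat-qubit scheme

reduction = conventional_estimate / cat_qubit_estimate
print(f"Roughly a {reduction:.0f}x reduction in physical qubits")

# The article's 90/10 split: ~90% of qubits manage errors,
# ~10% carry the actual computation.
compute_fraction = 0.10
compute_qubits = int(conventional_estimate * compute_fraction)
print(f"Only {compute_qubits:,} of {conventional_estimate:,} qubits do the computation")
```

At roughly a 57x reduction, the overhead that dominates a conventional machine shrinks to something far closer to laboratory scale.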

Kenna Hughes-Castleberry is a staff writer at Inside Quantum Technology and the Science Communicator at JILA (a partnership between the University of Colorado Boulder and NIST). Her writing beats include deep tech, the metaverse, and quantum technology.
