IBM in recent months has talked a lot about circuit knitting for quantum computing, the practice of using classical computing techniques to make quantum circuits run better and more efficiently on NISQ-era quantum hardware.

The company earlier this week announced the publication of a paper that outlines one of its more recent innovations in this vein – entanglement forging. The technique, which IBM officials briefly mentioned during an information-dense November presentation announcing the company’s Quantum Serverless programming model, uses classical computing in post-processing to help simulate a quantum system with half as many qubits as would otherwise be required. This could allow quantum processors to tackle larger problems than they typically would be able to handle.

For example, IBM used entanglement forging “to represent ten spin orbitals of the water molecule on five qubits of an IBM Quantum processor in the most accurate variational simulation of the ground-state energy using quantum hardware to date,” IBM stated in a blog post, which further explained that the approach works “by dividing systems into two weakly entangled halves, modeling those halves separately on a quantum computer (i.e. first one half, then the other), and then using classical resources to calculate the entanglement between them.”
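The identity behind that description can be illustrated with a small classical linear-algebra sketch (this is not IBM's code – the toy state and observables below are illustrative): a Schmidt decomposition splits a 2n-qubit state into n-qubit halves, and an expectation value on the full system can be reassembled from half-system matrix elements weighted classically by the Schmidt coefficients – the quantities the actual technique estimates on hardware and then knits together in post-processing.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 2          # qubits per half, so the full state lives on 2n qubits
dim = 2 ** n

# Random pure state on 2n qubits (in practice this would be, e.g.,
# a weakly entangled molecular wavefunction)
psi = rng.normal(size=dim * dim) + 1j * rng.normal(size=dim * dim)
psi /= np.linalg.norm(psi)

# Schmidt decomposition via SVD of the coefficient matrix:
# psi = sum_k s_k |a_k> (x) |b_k>, with |a_k> = U[:, k], |b_k> = Vh[k, :]
C = psi.reshape(dim, dim)
U, s, Vh = np.linalg.svd(C)

def rand_herm(d):
    """Random Hermitian observable on one half."""
    M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (M + M.conj().T) / 2

O_A, O_B = rand_herm(dim), rand_herm(dim)

# Direct 2n-qubit expectation value <psi| O_A (x) O_B |psi>
direct = np.real(psi.conj() @ np.kron(O_A, O_B) @ psi)

# "Forged" value: only half-system matrix elements <a_k|O_A|a_l> and
# <b_k|O_B|b_l> are needed, combined classically with Schmidt weights
A = U.conj().T @ O_A @ U          # A[k, l] = <a_k| O_A |a_l>
B = Vh.conj() @ O_B @ Vh.T        # B[k, l] = <b_k| O_B |b_l>
forged = np.real(np.einsum('k,l,kl,kl->', s, s, A, B))

print(direct, forged)  # the two values agree
```

In this classical toy, all the matrix elements are computed exactly; on real hardware each half-system quantity would come from an n-qubit circuit, which is where the qubit savings arise, and the weighted sum is the classical post-processing step.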

Entanglement forging isn’t just for chemistry problems either. It can be applied to a wide variety of problems to be tackled by quantum processors, according to Sarah Sheldon, an IBM Quantum research staff member and co-author on the paper.

Sheldon told IQT News, “What it does is probably opening up more problems that people could look at on near-term hardware. There’s more research to be done on entanglement forging – how to apply it to other problems. It is a tool that we think there’s more research that we can build onto it.” Sheldon said entanglement forging likely would become a component of IBM’s Qiskit Runtime programming architecture.

The origins of entanglement forging lie in IBM’s ongoing efforts to get its systems to do more and tackle harder problems without changing much on a given system, she said, adding that even as IBM continues to raise the bar with 1,000-qubit systems and beyond, techniques like entanglement forging and error mitigation can continue to strengthen the relevancy and value proposition of systems with smaller numbers of qubits. “This is something that’s going to be relevant going forward,” she said.

Entanglement forging also represents a prime example of how quantum computing for the foreseeable future will continue to leverage a hybrid of classical and quantum processing techniques to maximize its potential.

“I think there’s always going to be some classical computing involved,” Sheldon said. “And you can imagine, there’s this trade off, right? How much are you doing on the quantum computer versus how much are you doing on the classical computer? And, you know, if you’re really pushing both, you may need more and more classical resources. When we talk about this kind of hybrid model, that’s really all of quantum computing, right? Because we have to interact with these quantum systems using our classical controls. Even when we get to error correction we’re going to have a step of decoding, and we’ll have classical controls that implement the corrections. So understanding and really optimizing this interface between quantum and classical is going to be important now and into the future.”