(Phys.org) “Think what we can do if we teach a quantum computer to do statistical mechanics,” posed Michael McGuigan, a computational scientist with the Computational Science Initiative at the U.S. Department of Energy’s Brookhaven National Laboratory.
Ludwig Boltzmann, a renowned physicist, had to vigorously defend his theories of statistical mechanics in the late 19th century, when he proffered his ideas about how atomic properties determine the physical properties of matter. Boltzmann’s factor, which calculates the probability that a system of particles can be found in a specific energy state relative to zero energy, is widely used in physics.
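The Boltzmann factor described above is the standard expression exp(-E / (k_B T)). As a minimal sketch (the constant value and sample energies are illustrative, not from the article):

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA exact value).
K_B = 1.380649e-23  # J/K

def boltzmann_factor(energy_j: float, temperature_k: float) -> float:
    """Relative probability of a state with energy E (J) at temperature T (K),
    compared with a zero-energy reference state: exp(-E / (k_B * T))."""
    return math.exp(-energy_j / (K_B * temperature_k))

# A zero-energy state always has factor 1.
print(boltzmann_factor(0.0, 300.0))    # 1.0
# Higher-energy states are exponentially suppressed at fixed temperature.
print(boltzmann_factor(1e-20, 300.0))
```

The exponential suppression of higher-energy states is what makes this factor so ubiquitous in statistical mechanics.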
It took a sea change to show Boltzmann was right. Computer scientists are now on the cusp of a new computing wave, making the leap from supercomputers and bytes to quantum systems and quantum bits (or “qubits”).
“Our experiment shows quantum systems have an advantage of representing real-time calculations exactly rather than rotating from imaginary time to real time to find a result,” McGuigan explained. “It offers a truer picture of how a system evolves. We can map the problem to a quantum simulation that lets it evolve.” McGuigan and his student and coauthor Raffaele Miceli used a quantum computing testbed provided through Brookhaven Lab’s access agreement with IBM’s universal quantum computing systems, via the IBM Q Hub at Oak Ridge National Laboratory.
Miceli and McGuigan demonstrated how to implement the quantum algorithm for thermo field dynamics at finite temperature on a simple system involving a few particles and found perfect agreement with the classical computation.
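Thermo field dynamics represents a finite-temperature system as a “doubled” pure state whose expectation values reproduce thermal averages. The article does not give the authors’ circuit, so as a hedged classical sketch (a hypothetical two-level system with illustrative energies and inverse temperature), here is the thermo field double construction checked against the direct Gibbs average:

```python
import numpy as np

# Hedged sketch: thermo field double (TFD) state for a single two-level
# system, built classically with NumPy. The TFD state purifies the
# thermal (Gibbs) state: expectation values in the doubled Hilbert space
# reproduce finite-temperature averages of the original system.

energies = np.array([0.0, 1.0])   # two-level system (arbitrary units, assumed)
beta = 2.0                        # inverse temperature 1/(k_B T), assumed

# Partition function and TFD amplitudes e^{-beta*E_n/2} / sqrt(Z)
Z = np.sum(np.exp(-beta * energies))
amps = np.exp(-beta * energies / 2.0) / np.sqrt(Z)

# TFD state in the doubled Hilbert space: sum_n a_n |n>|n>
dim = len(energies)
tfd = np.zeros(dim * dim)
for n in range(dim):
    tfd[n * dim + n] = amps[n]

# Thermal energy from the TFD state: <TFD| H (x) I |TFD>
H = np.diag(energies)
H_doubled = np.kron(H, np.eye(dim))
tfd_energy = tfd @ H_doubled @ tfd

# Direct Gibbs average Tr(rho H) with rho = e^{-beta H}/Z, for comparison
gibbs_energy = np.sum(energies * np.exp(-beta * energies)) / Z

print(tfd_energy, gibbs_energy)  # the two values agree
```

The agreement between the two numbers is the kind of classical cross-check the researchers describe: for small systems, the quantum-computed thermal averages can be verified exactly against a conventional calculation.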