Inside Quantum Technology

Applying Ideas From Gaming Hacks to Quantum Computing

(SciTechDaily) Pacific Northwest National Laboratory's quantum algorithm theorist and developer Nathan Wiebe is applying ideas from data science and gaming hacks to quantum computing. Wiebe is forging ahead, writing code that he is confident will run on quantum computers when they are ready. In his joint appointment as a professor of physics at the University of Washington, Wiebe is also training the next generation of quantum computing theorists and programmers. Even so, Wiebe laments that “there’s such a huge gulf between where we are right now versus where we need to be.”
Coding for quantum computers requires leaps of imagination that can be daunting on one level, but Wiebe points out that any 15-year-old Minecraft enthusiast would have no trouble understanding the basics of how quantum computing works. The wildly popular building-block video game has spawned a community of enthusiastic coders who create virtual computers inside the game environment. Minecraft coders have simulated real-world physics and created virtual calculators, among other feats. The Minecraft universe has its own internal rules, and some of them don’t quite make sense – much as some of the rules of the quantum universe remain unclear, even to physicists.
Players may not understand why Minecraft’s rules work the way they do, but they learn how the game’s physics behave and how to exploit that knowledge to perform tasks the game’s creators may never have intended. Quantum computer programmers face a similar challenge: confronted with the strange rules of quantum mechanics, they look for creative ways to “hack” them.
Quantum sensors – devices that use quantum signals to measure quantities such as temperature and magnetic fields – are likely to be among the first useful quantum technologies. Wiebe worked with an international team of colleagues to apply machine learning techniques to a tricky problem in quantum sensing. Wiebe and his colleagues solved the problem by running the experiments at room temperature and then applying an algorithm that drew on techniques from data analytics and machine learning to correct for the error-prone, noisy signal.
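The article does not spell out the team's algorithm, but the general idea – recovering a physical quantity from many noisy sensor readings with classical statistical inference – can be sketched in a few lines. The code below is a hypothetical illustration, not the team's actual method: the sensor model, the unknown frequency, the noise level, and the simple Bayesian grid update are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor: its signal oscillates at an unknown frequency
# (e.g., set by a magnetic field). Each shot returns a single noisy 0/1
# outcome, and room-temperature operation adds extra randomness.
TRUE_FREQ = 7.3   # the quantity we want to estimate (arbitrary units)
NOISE = 0.15      # probability that an outcome is scrambled by noise

def measure(t):
    """Simulate one noisy sensor shot after evolution time t."""
    p1 = 0.5 * (1 - np.cos(TRUE_FREQ * t))   # ideal outcome probability
    p1 = (1 - NOISE) * p1 + NOISE * 0.5      # mix in white noise
    return rng.random() < p1

# Bayesian inference on a grid of candidate frequencies: start from a flat
# prior and update it with every noisy shot. Pooling evidence across many
# shots is what lets the estimate converge despite the noisy signal.
freqs = np.linspace(0.0, 10.0, 2001)
posterior = np.ones_like(freqs) / freqs.size

for _ in range(300):
    t = rng.uniform(0.0, 2.0)                # random evolution time for this shot
    outcome = measure(t)
    p1 = (1 - NOISE) * 0.5 * (1 - np.cos(freqs * t)) + NOISE * 0.5
    likelihood = p1 if outcome else (1 - p1)
    posterior *= likelihood
    posterior /= posterior.sum()             # renormalize after each update

estimate = freqs[np.argmax(posterior)]
print(f"true frequency: {TRUE_FREQ:.3f}, estimated: {estimate:.3f}")
```

Even with a substantial noise rate, the classical post-processing step narrows in on the underlying parameter, which is the flavor of trick the article describes: letting data-science machinery compensate for imperfect quantum hardware.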
