Quantum News Brief November 4: ParityQC awarded contract by the German Aerospace Center (DLR); D-Wave extends business value of industry-first quantum hybrid solver with new features supporting weighted constraints and presolve techniques; CU Boulder research group advances quantum sensing with a new model in optical fibers & MORE.
ParityQC awarded contract by the German Aerospace Center (DLR)
The appointment for this initiative comes at a time of impressive growth for ParityQC. In the two and a half years since its founding, the company has evolved from a small spin-off of the University of Innsbruck into one of the leading players in the quantum computing industry, while remaining fully Austrian-owned. At the core of ParityQC’s technology is the patented ParityQC Architecture, whose potential was recognized early on by the renowned microprocessor pioneer Hermann Hauser, an investor in ParityQC. “ParityQC’s unique architecture for quantum computers will set new standards for how highly scalable quantum computers will be built within the next decade,” state Magdalena Hauser and Wolfgang Lechner, co-founders and CEOs of ParityQC.
The project will develop through several phases. ParityQC, NXP Semiconductors and eleQtron will first work on a preliminary project, which involves building a 10-qubit demonstration model for users to gain experience with ion-trap systems and advance their development.
D-Wave extends business value of industry-first quantum hybrid solver with new features supporting weighted constraints and presolve techniques
The updated constrained quadratic model (CQM) hybrid solver from D-Wave enables quantum developers to more accurately model problems where it is not possible to satisfy all constraints. It expands the addressable use cases across various industries, e.g. logistics (employee scheduling), manufacturing (bin packing), and financial services (portfolio optimization).
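The idea behind weighted constraints is that a violated "soft" constraint adds a penalty to the objective rather than disqualifying the solution outright. The following is a minimal plain-Python sketch of that scoring idea, not D-Wave's Ocean SDK API; the scheduling variables and weights are invented for illustration.

```python
# Sketch: scoring a candidate solution when some constraints are weighted
# ("soft") rather than hard. A violated soft constraint contributes its
# weight as a penalty instead of making the solution infeasible.

def weighted_penalty(solution, constraints):
    """Sum the weights of all violated soft constraints.

    Each constraint is a (check_fn, weight) pair; check_fn returns True
    when the constraint is satisfied by the given solution.
    """
    return sum(w for check, w in constraints if not check(solution))

# Toy employee-scheduling instance: value 1 means the employee works the shift.
solution = {"alice": 1, "bob": 0, "carol": 1}

constraints = [
    (lambda s: sum(s.values()) >= 2, 10.0),  # staffing minimum (high weight)
    (lambda s: s["bob"] == 1, 1.0),          # shift preference (low weight)
]

print(weighted_penalty(solution, constraints))  # staffing holds; only the preference is violated -> 1.0
```

A hybrid solver minimizing objective-plus-penalty will then sacrifice low-weight preferences before high-weight requirements when not all constraints can be satisfied at once.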
In addition to supporting weighted constraints, the updated CQM solver introduces a new set of fast classical algorithms that reduce the size of a problem, allowing larger models to be submitted to the hybrid solver. These presolve techniques remove unnecessary variables and constraints to produce a cleaner dataset, yielding better-quality solutions by narrowing the problem size and streamlining problem formulation. They are now applied automatically to all CQM problems in the CQM solver in Leap and are also available in the Ocean SDK.
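One classic presolve technique is eliminating variables that a constraint fixes to a single value, folding them into a constant offset so the solver sees a smaller model. The sketch below is a generic illustration of that step under an invented tuple-based constraint encoding, not D-Wave's actual presolve implementation.

```python
# Sketch of one presolve step: substitute variables fixed by constraints of
# the form x == value, removing both the variable and the constraint.

def presolve_fixed_vars(objective, constraints):
    """Eliminate fixed variables from a linear objective.

    objective:   dict mapping variable name -> linear coefficient
    constraints: list of ("fix", var, value) tuples (stand-in for x == value)
    Returns (reduced objective, remaining constraints, constant offset).
    """
    fixed = {var: val for kind, var, val in constraints if kind == "fix"}
    offset = 0.0
    for var, val in fixed.items():
        # Fold the fixed variable's contribution into a constant term.
        offset += objective.pop(var, 0.0) * val
    remaining = [c for c in constraints if c[1] not in fixed]
    return objective, remaining, offset

obj = {"x": 2.0, "y": 3.0, "z": 1.0}
cons = [("fix", "y", 1.0)]
obj, cons, offset = presolve_fixed_vars(obj, cons)
print(obj, cons, offset)  # {'x': 2.0, 'z': 1.0} [] 3.0
```

Real presolvers chain many such reductions (bound tightening, redundant-constraint removal, variable substitution), but each follows this same pattern of shrinking the model without changing its optimal solutions.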
Click here to see full news release.
CU Boulder research group advances quantum sensing with a new model in optical fibers
The group, under the leadership of Alfred and Betty T. Look Endowed Professor Juliet Gopinath of the Department of Electrical, Computer and Energy Engineering, modeled the internal loss, external phase noise and inefficiency of a Mach-Zehnder interferometer while using a practical fiber source that created Holland-Burnett entangled states from a two-mode squeezed vacuum. This significantly reduced the limitations imposed by internal loss and phase noise and demonstrated the potential sensitivity gains of a quantum-based approach.
While the effects of phase noise and optical losses in classical and quantum versions of the sensor were previously modeled, the Gopinath group’s work was unique in that it integrated them into a single model.
“Our findings highlight some subtle points on making a practical sensor using the general technique of entangled photon interferometry,” Krueper said. “We also drew attention to the open and largely unexplored idea of using these sensing methods with optical fiber sensors, which would greatly expand the range of applications for the technique.” Click here to read the complete Phys.Org article.
Marie Baca of Semiconductor Engineering wrote about post-quantum and pre-quantum security issues on November 3. Quantum News Briefs summarizes.
Security experts say governments and businesses are starting to prepare for encryption in a post-quantum world. The task is made all the more challenging because no one knows exactly how future quantum machines will work, or even which materials will be used.
The mainstreaming of quantum cryptography is expected to usher in a new age of data security as experts explore quantum key distribution (QKD) and other methods of cryptography based on quantum mechanics.
The flip side is that certain encryption methods based on classical computing principles will be obsolete in a post-quantum world. That, in turn, will leave countless systems vulnerable to attack.
But there are more immediate concerns as well. Experts are preparing for “harvest now, decrypt later” attacks. As the name suggests, HNDL threats involve hackers collecting encrypted data now on the assumption that further developments in quantum computing will allow them to decrypt that information in the future. A recent Deloitte poll found that half of professionals at organizations considering quantum computing benefits believe their organizations are at risk of such attacks.
Many experts agree the solution is to develop quantum-safe encryption methods, but that can be a slow and painful process. The failure of SIKE, one of the post-quantum encryption standards under consideration by NIST, proved both the difficulty of creating such standards and the necessity of doing so through a rigorous process. There are activities organizations can complete now to begin quantum-proofing their data, such as using large keys on symmetric cryptographic algorithms and larger output sizes on hash algorithms. Cryptographic agility in protocols and implementation also will be useful, and hardware acceleration and hardware implementation will be crucial. There are non-cryptographic steps to take, as well, such as encrypting unencrypted data and applying zero trust methods to quantum.
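The advice about larger hash outputs follows from Grover's algorithm, which roughly halves the effective security of symmetric primitives and hash functions against a quantum adversary, so a 256-bit digest is conservatively treated as offering about 128 bits of post-quantum strength. A minimal illustration using Python's standard `hashlib`:

```python
import hashlib

# Compare hash output sizes: under the rough "Grover halves the security
# level" heuristic, the conservative post-quantum strength estimate is
# about half the output size in bits, which is why guidance favors
# SHA-384/SHA-512 (and AES-256 for symmetric encryption) over smaller sizes.

msg = b"harvest now, decrypt later"

for name in ("sha256", "sha384", "sha512"):
    bits = hashlib.new(name, msg).digest_size * 8
    print(f"{name}: {bits}-bit output, ~{bits // 2}-bit post-quantum estimate")
```

This prints 256-, 384- and 512-bit output sizes with post-quantum estimates of roughly 128, 192 and 256 bits respectively; moving up a size class is one of the low-friction steps organizations can take today.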
Click here to read Baca’s original, extensive article.
Sandra K. Helsel, Ph.D. has been researching and reporting on frontier technologies since 1990. She has her Ph.D. from the University of Arizona.