A study by more than a dozen scientists at the Department of Energy’s Oak Ridge National Laboratory examines potential strategies for integrating quantum computing with the world’s most powerful supercomputing systems in the pursuit of science.
The study, published in Future Generation Computer Systems, takes a comprehensive look at the state of quantum computing and classical high-performance computing, or HPC, and describes a potential framework for boosting traditional scientific HPC by leveraging the quantum approach.
“It’s kind of a manifesto for how we propose to dive as a lab into this new era of computing,” said co-author Rafael Ferreira da Silva, a principal investigator at ORNL’s National Center for Computing Sciences, or NCCS. “Our approach won’t be the only right one, but we think it will be useful and build on ORNL’s legacy as a leader in supercomputing, and we can adapt it as the technology evolves and the next generation of computing takes shape.”
ORNL is home to the Oak Ridge Leadership Computing Facility, or OLCF, which houses Frontier, the world’s fastest supercomputer, and the OLCF Quantum Computing User Program, which provides user time on private quantum processors across the country to support independent quantum studies. The lab also leads DOE’s Quantum Science Center, a national research center for quantum information science, which combines the resources and expertise of national laboratories, universities, and industry partners to study quantum computing, quantum sensing, and quantum materials.
“We have a wealth of experience here at ORNL building classical supercomputers, going back more than 20 years,” said Tom Beck, the study’s lead author, who oversees the NCCS’s science engagement section. “How can we apply that experience and maintain that momentum as we explore this new quantum field?”
Classical computers store information as bits that are either 0 or 1. In other words, a classical bit, like a switch, exists in one of two states: on or off. This binary representation does not scale well for certain complex scientific problems.
“We have some scientific problems where electrons, for example, are coupled between atoms in an exponential way when we try to model them on a classical computer,” Beck said. “We can adjust the formulas and try to solve these problems in an abbreviated way, but we can’t even hope to solve them on a classical computer. The equations and calculations required are just too complex.”
Quantum computing uses the laws of quantum mechanics to store information in qubits, the quantum equivalent of bits. Qubits can exist in multiple states simultaneously through quantum superposition, allowing them to carry more information than classical bits.
Quantum superposition allows a qubit to exist in a blend of its two possible states at the same time, like a spinning coin: neither heads nor tails for the coin, neither one value nor the other for the qubit. Measuring the qubit forces it into one of the two values, with probabilities set by its state, just as catching the coin fixes it on heads or tails. This dynamic allows a far wider range of possible states, more like a dial with fine adjustments than a binary on-off switch.
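In the standard notation of quantum mechanics (a sketch added here for illustration, not drawn from the study), a qubit's state is a weighted combination of the two classical values:

|ψ⟩ = α|0⟩ + β|1⟩,   with |α|² + |β|² = 1,

where a measurement returns 0 with probability |α|² and 1 with probability |β|². The spinning coin corresponds to α = β = 1/√2, a 50/50 split between the two outcomes.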
“The quantum aspect allows us to represent the problem more efficiently and potentially opens up a new way to solve problems that we couldn’t solve before,” Beck said.
Scientists have yet to find the most efficient technology for encoding qubits, and high error rates remain an obstacle to exploiting the potential of quantum computing. The study proposes developing quantum testbeds to explore the different technologies and coupling these testbeds with classical machines.
“We don’t want to tie ourselves to any particular technology right now because we don’t know which approach is going to be best,” Beck said. “But while we’re at this early stage, we need to start integrating quantum elements into our computing infrastructure with an eye toward potential breakthroughs.”
“Eventually, we want to connect these two very different types of computers seamlessly to run the machines together, similar to the hybrid architecture of graphics processing units, or GPUs, and central processing units, or CPUs, that accelerates today’s leading-edge supercomputers.”
This hybrid architecture, used by supercomputers like Frontier, integrates both types of processors on each node for the fastest possible computation: GPUs for the repetitive computations that are the backbone of most simulations, and CPUs for higher-level tasks like retrieving information and executing other instructions. The technology needed for classical and quantum processors to share space on a node does not yet exist.
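As a rough illustration of that division of labor (a minimal sketch using the CuPy GPU library, not code from the study), a classical program prepares data on the CPU, offloads the heavy numerical kernel to a GPU, then pulls the result back:

import numpy as np            # CPU-side arrays
try:
    import cupy as cp         # GPU-side arrays; assumes a CUDA-capable GPU
    xp = cp
except ImportError:
    xp = np                   # fall back to the CPU if no GPU is present

# The CPU handles the higher-level task: setting up the problem
a = np.random.rand(4096, 4096)
b = np.random.rand(4096, 4096)

# Offload the repetitive computation (a dense matrix multiply) to the GPU
c_dev = xp.asarray(a) @ xp.asarray(b)

# Bring the result back to the CPU for further work
c = xp.asnumpy(c_dev) if xp is not np else c_dev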
The study recommends high-speed networking as the best way to connect classical HPC resources to quantum computers for now.
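In that loosely coupled picture (sketched below with a hypothetical service endpoint and job format, since the study does not prescribe an API), the classical side serializes a quantum task, ships it over the network, and waits for the measurement results:

import json
from urllib import request

# Hypothetical endpoint; real quantum cloud services each define their own APIs
QPU_URL = "https://qpu.example.org/jobs"

# The classical side describes the quantum task: a 2-qubit Bell circuit, 1,000 shots
job = {"circuit": [["h", 0], ["cx", 0, 1]], "shots": 1000}

# Send the job across the network and block until the service replies
req = request.Request(
    QPU_URL,
    data=json.dumps(job).encode(),
    headers={"Content-Type": "application/json"},
)
with request.urlopen(req) as resp:
    counts = json.load(resp)   # e.g. {"00": 498, "11": 502}

Tighter levels of integration would shrink this network round trip, which is one reason the study treats networking as a starting point rather than the end goal.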
“There are degrees of integration, and we’re not going to get to the ideal right away,” said ORNL’s Sarp Oral, who oversees the NCCS’s Advanced Technologies Section. “To get to that ideal, we need to identify algorithms and applications that can benefit from quantum computing. Our job is to provide better ways to do science, and quantum computing can be a tool that meets that goal.”
More information:
Thomas Beck et al., Integrating Quantum Computing Resources into Scientific HPC Ecosystems, Future Generation Computer Systems (2024). DOI: 10.1016/j.future.2024.06.058
Provided by Oak Ridge National Laboratory