
DARPA Selects Riverlane for Quantum Benchmarking

Program aims to determine metrics for how quantum computers can solve real problems

Berenice Baker, Editor, Enter Quantum

April 23, 2024

2 Min Read

The Defense Advanced Research Projects Agency (DARPA) has selected Riverlane for Phase 2 of its Quantum Benchmarking program, which aims to develop key metrics for quantum computers solving practical problems.

Riverlane is working with researchers at the University of Southern California, the University of Sydney and national laboratories including Los Alamos National Laboratory to identify these potential benchmarks. The initial focus will be on the fields of plasma physics, fluid dynamics, condensed matter and high-energy physics.

The team is also building tools to estimate the quantum and classical resources needed to implement quantum algorithms to solve the benchmark problems at scale.

“Riverlane’s mission is to make quantum computing useful sooner, starting an era of human progress as significant as the industrial and digital revolutions,” said Riverlane CEO and founder Steve Brierley.

“The DARPA quantum benchmarking program aligns with this goal, helping the quantum community measure progress and maintain momentum as we unlock quantum error correction and enable fault tolerance.”

As quantum computing shifts from pure research to genuine industry use cases that classical computers alone cannot handle, fault tolerance becomes more important. In practical terms, this means putting in place error correction techniques that account for the errors qubits are prone to, which are caused by factors including environmental “noise.”


Solving error correction reliably is considered vital to scaling computers to the point at which they have enough logical, or error-corrected, qubits to reach quantum advantage. This means quantum computers can solve problems that classical computers alone cannot.

“Fault tolerance will result in significant overheads, both in terms of qubit count and calculation time, and it is important to take this into consideration when comparing to classical techniques,” explained Riverlane principal quantum scientist Hari Krovi.

“It has been known for some time that mild speed-ups such as a quadratic speed-up can disappear when the fault tolerance overhead is considered. There are many different approaches to fault tolerance to consider and each one leads to overheads that can vary by many orders of magnitude. The choice of the quantum code to help identify and correct errors in the system can lead to different overheads.” 
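Krovi’s point about quadratic speed-ups can be illustrated with a toy back-of-the-envelope model (this sketch is not from Riverlane or DARPA; the overhead figures are purely illustrative): if a classical method takes roughly N steps and the quantum algorithm takes roughly the square root of N, but fault tolerance multiplies every quantum step by a constant overhead factor, the quantum approach only wins beyond a break-even problem size that grows with the square of that overhead.

```python
import math

# Toy model: classical cost ~ N steps; quantum cost ~ overhead * sqrt(N),
# where "overhead" is the per-step slowdown imposed by fault tolerance
# (error-correction cycles, decoding, magic-state distillation, etc.).

def crossover_size(overhead: float) -> float:
    """Smallest problem size N at which overhead * sqrt(N) < N,
    i.e. where the quadratic speed-up survives the overhead.
    Solving overhead * sqrt(N) = N gives N = overhead**2."""
    return overhead ** 2

# Each extra order of magnitude of fault-tolerance overhead pushes
# the break-even problem size out by two orders of magnitude.
for overhead in (1e3, 1e6, 1e9):
    n = crossover_size(overhead)
    print(f"overhead {overhead:.0e}: quantum wins only for N > {n:.0e}")
```

This is why, as Krovi notes, overheads that vary by orders of magnitude between error-correcting codes can make or break a mild speed-up: a thousandfold overhead already demands problems a millionfold larger before the quantum method pulls ahead.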

About the Author(s)

Berenice Baker

Editor, Enter Quantum

Berenice is the editor of Enter Quantum, the companion website and exclusive content outlet for The Quantum Computing Summit. Enter Quantum informs quantum computing decision-makers and solutions creators with timely information, business applications and best practice to enable them to adopt the most effective quantum computing solution for their businesses. Berenice has a background in IT and 16 years’ experience as a technology journalist.

