September 11, 2024

New classical algorithm enhances understanding of quantum computing's future

Decomposition of lossy GBS circuit. Credit: Nature Physics (2024). DOI: 10.1038/s41567-024-02535-8

In an exciting development for quantum computing, researchers from the University of Chicago's Department of Computer Science, Pritzker School of Molecular Engineering, and Argonne National Laboratory have introduced a classical algorithm that simulates Gaussian boson sampling (GBS) experiments.

This achievement not only helps clarify the complexities of current quantum experiments but also represents a significant step forward in our understanding of how quantum and classical computing can work together. The research appeared in Nature Physics.

The challenge of Gaussian boson sampling

Gaussian boson sampling has gained attention as a promising approach to demonstrating quantum advantage, meaning the ability of quantum computers to perform tasks that classical computers cannot do efficiently. The journey leading up to this breakthrough has been marked by a series of innovative experiments that tested the limits of quantum systems.

Previous studies indicated that GBS is challenging for classical computers to simulate under ideal conditions. However, Assistant Professor Bill Fefferman, an author of the study, pointed out that the noise and photon loss present in actual experiments create additional challenges that require careful analysis.

Notably, experiments conducted by teams at the University of Science and Technology of China and at Xanadu, a Canadian quantum computing company, have shown that while quantum devices can produce outputs consistent with GBS predictions, the presence of noise often obscures these results, leading to questions about the claimed quantum advantage. These experiments served as a foundation for the current research, driving scientists to refine their approaches to GBS and better understand its limitations.

Understanding noise in quantum experiments

"While the theoretical groundwork has established that quantum systems can outperform classical ones, the noise present in actual experiments introduces complexities that require rigorous analysis," explained Fefferman. "Understanding how noise affects performance is crucial as we strive for practical applications of quantum computing."

The new algorithm addresses these complexities by leveraging the high photon loss rates common in current GBS experiments to provide a more efficient and accurate simulation. The researchers employed a classical tensor-network approach that capitalizes on the behavior of quantum states in these noisy environments, making the simulation more efficient and manageable with available computational resources.
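The paper's method is a tensor-network simulation tailored to lossy circuits; the short Python sketch below is not that algorithm, only a back-of-the-envelope illustration of why photon loss helps classical methods. It applies the standard pure-loss channel V → ηV + (1−η)I to the covariance matrix of a single squeezed mode (with the vacuum covariance taken as the identity); the squeezing and transmissivity values are made up. As the transmissivity drops, the smallest quadrature variance climbs back toward the vacuum level, meaning less of the nonclassical squeezing that makes GBS hard to simulate survives.

```python
import numpy as np

def squeezed_vacuum_cov(r):
    """Covariance matrix of a single-mode squeezed vacuum (vacuum covariance = identity)."""
    return np.diag([np.exp(-2 * r), np.exp(2 * r)])

def apply_loss(cov, eta):
    """Pure-loss channel with transmissivity eta: V -> eta * V + (1 - eta) * I."""
    return eta * cov + (1.0 - eta) * np.eye(cov.shape[0])

r = 1.0  # hypothetical squeezing parameter
for eta in (1.0, 0.7, 0.3):
    V = apply_loss(squeezed_vacuum_cov(r), eta)
    min_var = float(np.min(np.linalg.eigvalsh(V)))
    print(f"transmissivity {eta:.1f}: smallest quadrature variance = {min_var:.3f}")
```

With these made-up numbers the minimum variance rises from about 0.14 (strongly squeezed) toward 1.0 (vacuum level) as loss increases, which is the regime the classical simulation exploits.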

Breakthrough results

Remarkably, the researchers found that their classical simulation performed better than some state-of-the-art GBS experiments in various benchmarks.

"What we're seeing is not a failure of quantum computing, but rather an opportunity to refine our understanding of its capabilities," Fefferman emphasized. "It allows us to improve our algorithms and push the boundaries of what we can achieve."

The algorithm outperformed experiments by accurately capturing the ideal distribution of GBS output states, raising questions about the claimed quantum advantage of existing experiments. This insight opens doors for improving the design of future quantum experiments, suggesting that enhancing photon transmission rates and increasing the number of squeezed states could significantly boost their effectiveness.
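The benchmarks reported in the paper are more sophisticated, but as a generic illustration of how one sampler can be judged "closer to the ideal distribution" than another, the sketch below computes the total variation distance between the empirical distributions of two sets of hypothetical photon-count patterns (smaller means closer). The sample data here is invented purely for demonstration.

```python
from collections import Counter

def total_variation_distance(samples_a, samples_b):
    """Total variation distance between the empirical distributions of two sample sets."""
    counts_a, counts_b = Counter(samples_a), Counter(samples_b)
    n_a, n_b = len(samples_a), len(samples_b)
    support = set(counts_a) | set(counts_b)
    return 0.5 * sum(abs(counts_a[x] / n_a - counts_b[x] / n_b) for x in support)

# Hypothetical photon-count patterns (photons detected in each of three output modes)
ideal_samples = [(1, 0, 2), (0, 1, 1), (1, 0, 2), (2, 0, 0)]
device_samples = [(1, 0, 2), (0, 0, 1), (1, 1, 1), (2, 0, 0)]
print(total_variation_distance(ideal_samples, device_samples))  # 0.5 for this toy data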

Implications for future technologies

The implications of these findings extend beyond the realm of quantum computing. As quantum technologies continue to evolve, they hold the potential to revolutionize fields such as cryptography, materials science, and drug discovery. For instance, quantum computing could lead to breakthroughs in secure communication methods, enabling more robust protection of sensitive data.

In materials science, quantum simulations can help discover new materials with unique properties, paving the way for advancements in technology, energy storage, and manufacturing. By advancing our understanding of these systems, researchers are laying the groundwork for practical applications that could change the way we approach complex problems in various sectors.

The pursuit of quantum advantage is not just an academic endeavor; it has tangible implications for industries that rely on complex computations. As quantum technologies mature, they have the potential to play a crucial role in optimizing supply chains, enhancing artificial intelligence algorithms, and improving climate modeling.

The collaboration between quantum and classical computing is crucial for realizing these advancements, as it allows researchers to harness the strengths of both paradigms.

A cumulative research effort

Fefferman worked closely with Professor Liang Jiang from the Pritzker School of Molecular Engineering and former postdoc Changhun Oh, currently an Assistant Professor at the Korea Advanced Institute of Science and Technology, on previous work that culminated in this piece of research.

In 2021, they examined the computational power of noisy intermediate-scale quantum (NISQ) devices through lossy boson sampling. The paper revealed that photon loss affects classical simulation costs depending on the number of input photons, which could lead to exponential savings in classical time complexity.

Following this, their second paper focused on the impact of noise in experiments designed to demonstrate quantum supremacy, showing that even with significant noise, quantum devices can still produce results that are difficult for classical computers to match. In their third article, they explored Gaussian boson sampling (GBS) by proposing a new architecture that improves programmability and resilience against photon loss, making large-scale experiments more feasible.

They then introduced a classical algorithm in their fourth paper that generates outcomes closely aligned with ideal boson sampling, enhancing benchmarking techniques and emphasizing the importance of carefully selecting experiment sizes to preserve the quantum signal amidst noise.

Finally, in their latest study, they developed quantum-inspired classical algorithms to tackle graph-theoretical problems such as finding the densest k-subgraph and the maximum weight clique, as well as a quantum chemistry problem: generating molecular vibronic spectra. Their findings suggested that the claimed advantages of quantum methods may not be as significant as previously thought, with their classical sampler performing similarly to the Gaussian boson sampler.
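For context, the densest k-subgraph problem asks for the k vertices whose induced subgraph contains the most edges. The sketch below is not the group's quantum-inspired sampler; it is only a plain greedy baseline (repeatedly drop the lowest-degree vertex) run on a made-up toy graph, to show what the problem is asking.

```python
def densest_k_subgraph_greedy(adj, k):
    """Greedy baseline: repeatedly remove the lowest-degree vertex until k remain.

    adj: dict mapping vertex -> set of neighbours (undirected graph).
    Returns the surviving vertex set; not optimal in general, just a baseline.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # defensive copy
    while len(adj) > k:
        v = min(adj, key=lambda u: len(adj[u]))      # lowest-degree vertex
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
    return set(adj)

# Tiny hypothetical graph: a triangle {0, 1, 2} plus a pendant vertex 3
graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(densest_k_subgraph_greedy(graph, 3))  # expected: {0, 1, 2}
```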

The development of the classical simulation algorithm not only enhances our understanding of Gaussian boson sampling experiments but also highlights the importance of continued research in both quantum and classical computing. The ability to simulate GBS more effectively serves as a bridge toward more powerful quantum technologies, ultimately helping navigate the complexities of modern challenges.

More information: Changhun Oh et al, Classical algorithm for simulating experimental Gaussian boson sampling, Nature Physics (2024). DOI: 10.1038/s41567-024-02535-8

Journal information: Nature Physics
