
Classical modelling of a lossy Gaussian bosonic sampler

M. V. Umanskii, A. N. Rubtsov · April 1, 2024
Physics


Abstract

Gaussian boson sampling (GBS) is considered a candidate problem for demonstrating quantum advantage. We propose an algorithm for approximate classical simulation of a lossy GBS instance. The algorithm relies on a Taylor series expansion, and increasing the number of expansion terms used in the calculation yields greater accuracy. The complexity of the algorithm is polynomial in the number of modes, provided the number of terms is fixed. We describe the conditions on the input-state squeezing parameter and the loss level under which the algorithm is most efficient (i.e., under which the Taylor series converges quickly). These conditions are satisfied in recent experiments that claim to have demonstrated quantum advantage; thus, this algorithm can be used to classically simulate those experiments.
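The abstract only states the high-level trade-off: keeping more Taylor terms improves accuracy, while a fixed number of terms keeps the cost polynomial in the number of modes. As a minimal illustration of that trade-off (not the paper's actual algorithm, whose expansion and convergence conditions are specific to lossy GBS), here is a truncated Taylor series of exp as a stand-in, showing how the truncation error shrinks with the number of retained terms:

```python
import math

def taylor_exp(x, n_terms):
    """Truncated Taylor series of exp(x) about 0, keeping n_terms terms."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

# A small expansion argument is the analogue of the favorable regime the
# authors describe (low squeezing, high loss): the series converges quickly.
x = 0.5
errors = [abs(taylor_exp(x, n) - math.exp(x)) for n in (2, 4, 8)]

# Each doubling of the number of terms sharply reduces the truncation error.
assert errors[0] > errors[1] > errors[2]
```

In the regime the authors identify, the analogous expansion for the lossy GBS state converges fast enough that a small, fixed number of terms suffices, which is what makes the overall simulation polynomial-time.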
