Input phase noise in Gaussian boson sampling
Abstract
Gaussian boson sampling is an important protocol for testing the performance of photonic quantum simulators. As such, various noise sources have been investigated that degrade the operation of such devices. In this paper, we examine a situation with phase noise between different modes of the input state leading to dephasing of the system. This models the phase fluctuations which remain even when the mean phase is controlled. We aim to determine whether these phase-noisy input states still form a computationally difficult problem. To do this, we use Matrix Product Operators to model the system, a technique recently used to model boson sampling scenarios. Our investigation finds that the Entanglement entropy grows linearly with the number of input states even for noisy input states. This implies that, unlike boson loss, this form of experimentally relevant noise remains difficult to simulate with tensor networks and may allow for the demonstration of quantum advantage without the need for implementing the challenging task of input-state phase stabilisation.