Quantum Brain

Generative modeling with Gaussian Boson Sampling: classically trainable Bosonic Born Machines

Zoltán Kolarovszki, Bence Bakó, Michał Oszmaniec, Changhun Oh, Zoltán Zimborás · March 11, 2026
Quantum Physics

Abstract

Quantum generative modeling has emerged as a promising application of quantum computers, aiming to model complex probability distributions beyond the reach of classical methods. In practice, however, training such models often requires costly gradient estimation performed directly on the quantum hardware. Crucially, for certain structured quantum circuits, expectation values of local observables can be efficiently evaluated on a classical computer, enabling classical training without calls to the quantum hardware in the optimization loop. In these models, sampling from the resulting circuits can still be classically hard, so inference must be performed on a quantum device, yielding a potential computational advantage. In this work, we introduce a photonic quantum generative model built on parametrized Gaussian Boson Sampling circuits. The training is based on the efficient classical evaluation of expectation values enabled by the Gaussian structure of the state, allowing scalable optimization of the model parameters through the maximum mean discrepancy loss function. We demonstrate the effectiveness of the approach through numerical experiments on photonic systems with up to 805 modes and over a million trainable parameters, highlighting its scalability and suitability for near-term photonic quantum devices.
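The abstract describes training the model by minimizing a maximum mean discrepancy (MMD) loss between the model and data distributions. As an illustration only, the sketch below shows a standard sample-based squared-MMD estimator with a Gaussian (RBF) kernel; the paper's actual kernel choice and estimator, and the fact that its expectation values are computed classically from the Gaussian state rather than from samples, are not reproduced here.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets of shape (n, d) and (m, d)
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(model_samples, data_samples, sigma=1.0):
    # Biased (V-statistic) estimate of squared MMD between two empirical distributions
    kxx = rbf_kernel(model_samples, model_samples, sigma).mean()
    kyy = rbf_kernel(data_samples, data_samples, sigma).mean()
    kxy = rbf_kernel(model_samples, data_samples, sigma).mean()
    return kxx + kyy - 2.0 * kxy
```

The estimate is zero when the two sample sets coincide and grows as the distributions separate, which is what makes it usable as a differentiable training loss.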
