Variational Quantum Generative Modeling by Sampling Expectation Values of Tunable Observables
Abstract
Expectation Value Samplers (EVSs) are quantum generative models that can learn high-dimensional continuous distributions by measuring the expectation values of parameterized quantum circuits. However, these models can demand impractical quantum resources for good performance. We investigate how observable choices affect EVS performance and propose an Observable-Tunable Expectation Value Sampler (OT-EVS), which achieves greater expressivity than the standard EVS. By restricting the selectable observables, it is possible to use the classical shadows measurement scheme to reduce the sample complexity of our algorithm. In addition, we propose an adversarial training method adapted to the needs of OT-EVS. This training prioritizes classical updates of observables, minimizing the more costly updates of quantum circuit parameters. Numerical experiments, using an original simulation technique for correlated shot noise, confirm our model's expressivity and sample-efficiency advantages over previous designs. We envision that our proposal will encourage the exploration of continuous generative models that run with few quantum resources.
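To make the core EVS mechanism concrete, the following minimal sketch shows how a continuous sample can be produced as the expectation value of an observable on a parameterized quantum state. This is an illustrative toy, not the paper's implementation: it uses a single qubit, an RY rotation whose angle mixes a latent input `z` with a trainable parameter `theta`, and a fixed Pauli-Z observable, all simulated with plain NumPy statevectors (the actual OT-EVS uses multi-qubit circuits and classically tunable observables).

```python
import numpy as np

def evs_sample(z, theta):
    """Toy single-qubit Expectation Value Sampler (EVS).

    A latent input z and a trainable parameter theta set the angle of an
    RY rotation applied to |0>; the generated sample is the expectation
    value <Z> of the resulting state. Real EVS models use multi-qubit
    parameterized circuits, and OT-EVS additionally tunes the observable.
    """
    angle = theta * z
    # |psi> = RY(angle)|0> = cos(angle/2)|0> + sin(angle/2)|1>
    psi = np.array([np.cos(angle / 2.0), np.sin(angle / 2.0)])
    # <psi|Z|psi> = |amp_0|^2 - |amp_1|^2, a continuous value in [-1, 1]
    return psi[0] ** 2 - psi[1] ** 2

# Feeding random latents through the circuit yields a continuous distribution
# of samples; training would adjust theta (and, in OT-EVS, the observable).
rng = np.random.default_rng(0)
samples = [evs_sample(z, theta=0.7) for z in rng.normal(size=5)]
print(samples)
```

Because each sample is an expectation value, it is bounded by the observable's spectrum (here [-1, 1]); on hardware each expectation value would itself be estimated from repeated shots, which is the sample-complexity cost the classical shadows scheme is meant to reduce.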