Quantum Brain

Avoiding barren plateaus via Gaussian mixture model

Xiaoxi Shi, Y. Shang · February 21, 2024 · DOI: 10.1088/1367-2630/ae0823
Physics


Abstract

Variational quantum algorithms are among the most prominent methods in quantum computing, with applications in quantum machine learning, quantum simulation, and related fields. However, as the number of qubits grows, these algorithms often encounter the barren-plateau phenomenon, which severely limits their scalability. In this work, we introduce a novel parameter-initialization strategy based on Gaussian mixture models. We rigorously prove that for a hardware-efficient ansatz initialized in the |0⟩⊗N state, our scheme avoids barren plateaus regardless of circuit depth, qubit count, or choice of cost function. Specifically, the lower bound on the initial gradient norm provided by our method remains independent of the number of qubits. Building on this foundation, we validate our theoretical results through numerical experiments, including variational ground-state searches for Hamiltonians, to demonstrate the practical effectiveness of our approach. Our findings highlight the critical role of Gaussian mixture model-based initialization in enhancing the trainability of quantum circuits and offer valuable guidance for future theoretical and experimental advances in quantum machine learning.
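As an illustration of the kind of initialization the abstract describes, the sketch below samples ansatz rotation angles from a one-dimensional Gaussian mixture rather than uniformly. The mixture components (means, variances, weights) used here are hypothetical placeholders, not the values derived in the paper:

```python
import numpy as np

def gmm_init(n_params, means, sigmas, weights, rng=None):
    """Sample circuit parameters from a 1-D Gaussian mixture.

    Each parameter independently picks a mixture component with
    probability `weights[k]`, then draws from N(means[k], sigmas[k]^2).
    """
    rng = np.random.default_rng() if rng is None else rng
    comps = rng.choice(len(weights), size=n_params, p=weights)
    return rng.normal(np.asarray(means)[comps], np.asarray(sigmas)[comps])

# Example: 24 angles drawn from a mixture concentrated near 0 and pi
# (component values are illustrative assumptions, not the paper's).
theta = gmm_init(24, means=[0.0, np.pi], sigmas=[0.1, 0.1], weights=[0.5, 0.5])
print(theta.shape)  # (24,)
```

The intuition, per the abstract, is that concentrating the initial parameters keeps the gradient norm lower-bounded independently of qubit count, whereas uniform random initialization of a deep hardware-efficient ansatz typically produces exponentially vanishing gradients.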
