Information-Theoretic Constraints on Variational Quantum Optimization: Efficiency Transitions and the Dynamical Lie Algebra
Abstract
Variational quantum algorithms are leading candidates for near-term quantum advantage, yet their scalability is fundamentally limited by the "Barren Plateau" phenomenon. While this is traditionally attributed to geometric concentration of measure, I propose an information-theoretic origin: a bandwidth bottleneck in the optimization feedback loop. By modeling the optimizer as a coherent Maxwell's Demon, I derive a thermodynamic constitutive relation, $\Delta E \leq \eta\, I(S:A)$, in which work extraction is strictly bounded by the mutual information $I(S:A)$ established between the system $S$ and the ancilla $A$ via entanglement. I demonstrate that systems with polynomial Dynamical Lie Algebra (DLA) dimension exhibit "Information Superconductivity" (sustained $\eta > 0$), whereas systems with exponential DLA dimension undergo an efficiency collapse once the rate of information scrambling exceeds the ancilla's channel capacity. These results reframe quantum trainability as a thermodynamic phase transition governed by the stability of information flow.
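The central quantity in the bound $\Delta E \leq \eta\, I(S:A)$ is the quantum mutual information between the system and the ancilla, $I(S:A) = S(\rho_S) + S(\rho_A) - S(\rho_{SA})$. As a minimal numerical sketch (not the paper's actual model), the snippet below computes $I(S:A)$ for a partially entangled two-qubit state and evaluates the corresponding cap on extractable work; the entangling angle `t` and conversion factor `eta` are illustrative values chosen here, not quantities derived in the paper.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho ln rho], in nats; zero eigenvalues are dropped."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Partially entangled two-qubit pure state |psi> = cos(t)|00> + sin(t)|11>,
# with the first qubit playing the role of the system S and the second the ancilla A.
t = 0.6  # illustrative entangling angle
psi = np.zeros(4)
psi[0], psi[3] = np.cos(t), np.sin(t)
rho_SA = np.outer(psi, psi)

# Reduced states via partial trace (indices: rho4[s1, a1, s2, a2])
rho4 = rho_SA.reshape(2, 2, 2, 2)
rho_S = np.einsum('ikjk->ij', rho4)  # trace out the ancilla A
rho_A = np.einsum('kikj->ij', rho4)  # trace out the system S

# Mutual information I(S:A) = S(rho_S) + S(rho_A) - S(rho_SA)
I_SA = (von_neumann_entropy(rho_S)
        + von_neumann_entropy(rho_A)
        - von_neumann_entropy(rho_SA))

# Bound on work extraction: Delta E <= eta * I(S:A)
eta = 1.0  # illustrative conversion factor (e.g. k_B * T in natural units)
delta_E_max = eta * I_SA
print(f"I(S:A) = {I_SA:.4f} nats, work bound eta*I(S:A) = {delta_E_max:.4f}")
```

For a pure joint state, $S(\rho_{SA}) = 0$, so $I(S:A) = 2\,S(\rho_S)$ and the bound is maximized (at $2\ln 2$ nats for two qubits) when the state is maximally entangled, matching the abstract's claim that work extraction is limited by the mutual information the ancilla can establish.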