
Adiabatic training for Variational Quantum Algorithms

Ernesto Acosta, C. C. Gutierrez, Guillermo Botella Juan, Roberto Campos · October 24, 2024 · DOI: 10.48550/arXiv.2410.18618
Physics · Computer Science


Abstract

This paper presents a new hybrid Quantum Machine Learning (QML) model composed of three elements: a classical computer in charge of data preparation and interpretation; a gate-based quantum computer running the Variational Quantum Algorithm (VQA) that represents the Quantum Neural Network (QNN); and an adiabatic quantum computer on which the optimization function is executed to find the best parameters for the VQA. At the time of writing, most QNNs are trained with gradient-based classical optimizers, which must contend with the barren-plateau effect. Gradient-free classical approaches, such as Evolutionary Algorithms, have also been proposed to overcome this effect, but to the authors' knowledge adiabatic quantum models have not been used to train VQAs. The paper compares the results of gradient-based classical algorithms against adiabatic optimizers, demonstrating the feasibility of integrating gate-based and adiabatic quantum computing models and opening the door to modern hybrid QML approaches for High Performance Computing.
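The contrast the abstract draws — gradient-based training versus a gradient-free search that an adiabatic optimizer could perform — can be sketched with a toy example. The snippet below is an illustration, not the paper's implementation: it uses a one-qubit "VQA" whose cost landscape is simply cos(θ), trains it once with the parameter-shift gradient rule, and once by exhaustively searching a discretized parameter grid, which stands in classically for the annealer selecting the best bitstring of a QUBO encoding (the encoding itself is assumed, not taken from the paper).

```python
import numpy as np

# Toy 1-qubit "VQA": cost(theta) = <0| Ry(theta)^dag Z Ry(theta) |0> = cos(theta).
# A minimal stand-in for the QNN that would run on a gate-based quantum computer.
def cost(theta):
    return np.cos(theta)

# Gradient-based training via the parameter-shift rule: the classical
# baseline the paper compares against.
def train_gradient(theta=0.1, lr=0.4, steps=100):
    for _ in range(steps):
        grad = 0.5 * (cost(theta + np.pi / 2) - cost(theta - np.pi / 2))
        theta -= lr * grad
    return theta

# Gradient-free search over a discretized parameter grid: a classical
# stand-in for encoding the parameter choice as a binary optimization
# problem and letting an adiabatic machine return the best grid point.
def train_discrete(bits=8):
    grid = np.linspace(0, 2 * np.pi, 2 ** bits)
    return grid[np.argmin(cost(grid))]

print(train_gradient())  # converges near pi, the minimum of cos
print(train_discrete())  # nearest grid point to pi
```

Both routes land near θ = π; the point of the discrete version is that it needs no gradient at all, which is what makes an annealer-based optimizer attractive when barren plateaus flatten the gradient signal.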
