
A quantum algorithm to train neural networks using low-depth circuits

Guillaume Verdon, M. Broughton, J. Biamonte · December 14, 2017
Computer Science · Physics


Abstract

It has remained an open question whether near-term gate-model quantum computers will offer a quantum advantage for practical applications in the pre-fault-tolerance noise regime. A class of algorithms that has shown some promise in this regard is the so-called hybrid classical-quantum variational algorithms. Here we develop a low-depth quantum algorithm to train quantum Boltzmann machine neural networks using such variational methods. We introduce a method that employs the quantum approximate optimization algorithm (QAOA) as a subroutine in order to approximately sample from Gibbs states of Ising Hamiltonians. We use this approximate Gibbs sampling to train neural networks, and we demonstrate training convergence for numerically simulated noisy circuits with depolarizing error rates of up to 4%.
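As a rough illustration of the pipeline the abstract describes — a low-depth QAOA-style circuit whose output distribution would be tuned variationally toward a Gibbs distribution over Ising energies — here is a minimal statevector sketch. Everything in it (the function names, the toy couplings, the fixed angles) is an assumption for illustration only; it does not reproduce the paper's actual circuits, cost function, or training loop.

```python
import numpy as np

def ising_energies(J, h):
    """Energies of all 2^n spin configurations for
    H = sum_{i<j} J_ij z_i z_j + sum_i h_i z_i, with z_i in {+1, -1}."""
    n = len(h)
    E = np.zeros(2 ** n)
    for idx in range(2 ** n):
        z = np.array([1 - 2 * ((idx >> k) & 1) for k in range(n)])
        E[idx] = z @ J @ z + h @ z
    return E

def apply_1q(psi, U, q, n):
    """Apply a single-qubit gate U to qubit q of an n-qubit statevector."""
    psi = psi.reshape(2 ** (n - q - 1), 2, 2 ** q)
    return np.einsum('ab,ibj->iaj', U, psi).reshape(-1)

def qaoa_state(E, gammas, betas):
    """Alternate diagonal cost-phase layers exp(-i*gamma*H) with
    transverse-field mixer layers exp(-i*beta*X) on every qubit,
    starting from the uniform superposition |+...+>."""
    n = int(np.log2(len(E)))
    psi = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)
    for gamma, beta in zip(gammas, betas):
        psi = np.exp(-1j * gamma * E) * psi        # cost layer (diagonal)
        Ux = np.array([[np.cos(beta), -1j * np.sin(beta)],
                       [-1j * np.sin(beta), np.cos(beta)]])
        for q in range(n):                          # mixer on each qubit
            psi = apply_1q(psi, Ux, q, n)
    return psi

# Toy 3-spin Ising chain with illustrative, untrained angles. In the
# variational setting, gamma/beta would be optimized so that |psi|^2
# approximates the Gibbs weights exp(-E/T) for some temperature T.
J = np.zeros((3, 3)); J[0, 1] = J[1, 2] = -1.0
h = np.array([0.1, -0.2, 0.3])
E = ising_energies(J, h)
probs = np.abs(qaoa_state(E, gammas=[0.4, 0.7], betas=[0.6, 0.3])) ** 2
```

Because every layer is unitary, `probs` is a normalized distribution over the 2^3 spin configurations; an outer classical loop (not shown) would adjust the angles to bring it close to the target Gibbs state before using the samples for Boltzmann machine training.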
