
Gradient Estimation with Constant Scaling for Hybrid Quantum Machine Learning

Thomas Hoffmann, Douglas Brown · November 25, 2022

Physics


Abstract

We present a novel method for determining gradients of parameterised quantum circuits (PQCs) in hybrid quantum-classical machine learning models by applying the multivariate version of the simultaneous perturbation stochastic approximation (SPSA) algorithm. The gradients of PQC layers can be calculated with an overhead of two evaluations per circuit per forward-pass independent of the number of circuit parameters, compared to the linear scaling of the parameter shift rule. These are then used in the backpropagation algorithm by applying the chain rule. We compare our method to the parameter shift rule for different circuit widths and batch sizes, and for a range of learning rates. We find that, as the number of qubits increases, our method converges significantly faster than the parameter shift rule and to a comparable accuracy, even when considering the optimal learning rate for each method.
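The key property described in the abstract — a gradient estimate from only two circuit evaluations, regardless of the number of parameters — is the defining feature of SPSA: all parameters are perturbed simultaneously along a random direction. The sketch below illustrates the idea under stated assumptions; `f` is a hypothetical stand-in for a PQC expectation value, not the authors' implementation, and the hyperparameter `eps` and the averaging loop are illustrative choices.

```python
import numpy as np

def spsa_gradient(f, theta, eps=0.01, rng=None):
    """Estimate the gradient of f at theta with two evaluations (SPSA).

    The cost is two calls to f regardless of len(theta), in contrast to
    the parameter shift rule, which needs two calls per parameter.
    f is assumed to return a scalar (e.g. a circuit expectation value).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Random perturbation direction with +/-1 entries (Rademacher distribution)
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    # Exactly two evaluations, independent of the number of parameters
    f_plus = f(theta + eps * delta)
    f_minus = f(theta - eps * delta)
    # Simultaneous perturbation estimate of every partial derivative at once;
    # since delta_i is +/-1, dividing by delta_i equals multiplying by it
    return (f_plus - f_minus) / (2 * eps) * delta

# Toy usage: a classical stand-in for a circuit, f(t) = sum(sin(t)),
# whose true gradient at theta = 0 is a vector of ones.
f = lambda t: float(np.sum(np.sin(t)))
theta = np.zeros(4)

# A single estimate is unbiased but noisy; averaging reduces the variance.
rng = np.random.default_rng(0)
avg_grad = np.mean([spsa_gradient(f, theta, rng=rng) for _ in range(2000)], axis=0)
```

A single SPSA sample is a noisy but unbiased estimate of the gradient, which is why the paper's comparison against the parameter shift rule hinges on convergence speed per circuit evaluation rather than per-step accuracy.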
