
Stochastic Shadow Descent: Training Parametrized Quantum Circuits with Shadows of Gradients

Sayantan Pramanik, M Girish Chandra · November 15, 2025
Quantum Physics


Abstract

In this paper, we focus on the task of optimizing the parameters in Parametrized Quantum Circuits (PQCs). Popular algorithms such as Simultaneous Perturbation Stochastic Approximation (SPSA) limit the number of circuit executions to two per iteration, irrespective of the number of parameters in the circuit, but they come with their own challenges: they use central differences to calculate biased estimates of directional derivatives. We show, both theoretically and numerically, that this can lead to instabilities in training the PQCs. To remedy this, we propose Stochastic Shadow Descent (SSD), which uses random projections (or shadows) of the gradient to update the parameters iteratively. We eliminate the bias in directional derivatives by employing the Parameter-Shift Rule, along with techniques from Quantum Signal Processing, to construct a quantum circuit that parsimoniously computes unbiased estimates of directional derivatives. Finally, we prove the convergence of the SSD algorithm, provide worst-case bounds on the number of iterations, and numerically demonstrate its efficacy.
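The core update described in the abstract can be illustrated with a small classical sketch. The toy cost function, the learning rate, and all function names below are hypothetical stand-ins: in the paper the directional derivative is computed parsimoniously on a quantum circuit, whereas here we evaluate the parameter-shift gradient coordinate-by-coordinate and project it onto a random direction (the "shadow") purely to show the shape of the iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(theta):
    # Hypothetical PQC-style cost: each parameter enters through a single
    # rotation, so the parameter-shift rule is exact for this toy model.
    return np.cos(theta).sum()

def shift_grad(theta):
    # Parameter-shift rule: for rotation-generated parameters the derivative
    # is (f(theta + pi/2 e_i) - f(theta - pi/2 e_i)) / 2, with no bias.
    d = len(theta)
    g = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = np.pi / 2
        g[i] = (f(theta + e) - f(theta - e)) / 2.0
    return g

def ssd_step(theta, lr=0.1):
    # Shadow-descent sketch: draw a random unit direction v, form the
    # unbiased directional derivative g.v, and step along v only.
    v = rng.standard_normal(len(theta))
    v /= np.linalg.norm(v)
    dderiv = shift_grad(theta) @ v
    return theta - lr * dderiv * v

theta0 = rng.uniform(0.1, np.pi - 0.1, size=8)
theta = theta0.copy()
for _ in range(500):
    theta = ssd_step(theta)
```

On this toy cost the shift-rule gradient equals the analytic derivative -sin(theta) exactly, which is the contrast the abstract draws against the biased central-difference estimates used by SPSA.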
