
Symmetry-guided gradient descent for quantum neural networks

Ka Bian, Shitao Zhang, Fei Meng, Wen Zhang, Oscar Dahlsten · April 9, 2024 · DOI: 10.1103/PhysRevA.110.022406
Physics

Abstract

Many supervised learning tasks have intrinsic symmetries, such as translational and rotational symmetry in image classification. These symmetries can be exploited to enhance performance. We formulate the symmetry constraints in a concise mathematical form. We design two ways to incorporate the constraints into the cost function, thereby shaping the cost landscape in favour of parameter choices that respect the given symmetry. Unlike methods that alter the neural network circuit ansatz to impose symmetry, our method only changes the classical post-processing of gradient descent, which is simpler to implement. We call the method symmetry-guided gradient descent (SGGD). We illustrate SGGD in entanglement classification of Werner states and in a binary classification task in a 2-D feature space. In both cases, the results show that SGGD can accelerate the training, improve the generalization ability, and remove vanishing gradients, especially when the training data is biased.
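The core idea — adding a symmetry penalty to the cost so that ordinary gradient descent is steered toward symmetry-respecting parameters — can be sketched classically. The toy model, the penalty form, and all names below are illustrative assumptions, not the paper's actual quantum-circuit construction: here a simple model is trained on 2-D data whose labels are invariant under a reflection, and the cost is augmented with a term that penalizes outputs that change under that reflection.

```python
import numpy as np

# Toy classical model f(x; w) = tanh(w . x).  Illustrative only:
# the paper trains parameterized quantum circuits, but the
# symmetry-penalty idea acts purely on the classical cost/gradient.

def model(w, x):
    return np.tanh(w @ x)

def task_cost(w, X, y):
    # Mean squared error on the training data.
    preds = np.array([model(w, x) for x in X])
    return np.mean((preds - y) ** 2)

def symmetry_penalty(w, X, T):
    # Penalize outputs that change under the symmetry x -> T x.
    return np.mean([(model(w, x) - model(w, T @ x)) ** 2 for x in X])

def sggd_step(w, X, y, T, lam=1.0, lr=0.1, eps=1e-6):
    # One descent step on task_cost + lam * symmetry_penalty,
    # using a simple central finite-difference gradient.
    def total(wv):
        return task_cost(wv, X, y) + lam * symmetry_penalty(wv, X, T)
    grad = np.zeros_like(w)
    for i in range(len(w)):
        dw = np.zeros_like(w)
        dw[i] = eps
        grad[i] = (total(w + dw) - total(w - dw)) / (2 * eps)
    return w - lr * grad

# 2-D example: reflection symmetry about the second feature axis.
T = np.array([[-1.0, 0.0], [0.0, 1.0]])
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.sign(X[:, 1])            # labels depend only on x2, so they respect T
w = np.array([1.0, 0.5])        # deliberately asymmetric starting point
pen_before = symmetry_penalty(w, X, T)
for _ in range(200):
    w = sggd_step(w, X, y, T)
# The penalty term drives w[0] toward 0, i.e. toward the
# symmetry-respecting region of parameter space.
```

The design choice mirrors the abstract: nothing about the model or its ansatz changes; only the classical cost seen by gradient descent is reshaped, which is why the method is simple to retrofit onto an existing training loop.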
