A hardware efficient quantum residual neural network without post-selection
Abstract
We propose a hardware-efficient quantum residual neural network that implements residual connections through a deterministic linear combination of identity and variational unitaries, enabling fully differentiable training. In contrast to previous implementations of residual connections, our architecture avoids post-selection while preserving residual learning. We further establish the trainability of our model, mitigating barren plateaus, which are considered a major limitation of variational quantum learning models. To demonstrate the model, we apply it to image classification by training on the MNIST, CIFAR, and SARFish datasets, achieving accuracies of 99% and 80% for binary and multi-class classification, respectively. These accuracies are comparable to those previously achieved with standard variational models; however, our model requires 10x fewer gates, making it better suited to resource-constrained near-term quantum processors. Beyond high accuracy, the proposed architecture also demonstrates adversarial robustness, another desirable property for quantum machine learning models. Overall, our architecture offers a new pathway toward accurate, robust, trainable, and hardware-efficient quantum machine learning models.
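The abstract does not spell out the combination weights or the form of the variational unitary; the following is a minimal NumPy sketch of the residual idea, assuming equal weights and a single-qubit RY rotation as the variational unitary `U(theta)`. It illustrates only the state-vector arithmetic of a linear combination of identity and a unitary, not the paper's hardware implementation.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation, used here as a stand-in variational unitary U(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def residual_block(state, theta, alpha=0.5):
    """Apply (alpha*I + (1-alpha)*U(theta)) to `state`, then renormalize.

    A linear combination of unitaries is not itself unitary, so the
    simulated output must be renormalized; the paper's contribution is
    realizing such a combination deterministically on hardware, without
    post-selection.
    """
    op = alpha * np.eye(2) + (1 - alpha) * ry(theta)
    out = op @ state
    return out / np.linalg.norm(out)

psi = np.array([1.0, 0.0])              # input state |0>
psi_out = residual_block(psi, np.pi / 3)
print(np.round(psi_out, 4))
```

Because the map reduces to the identity at `theta = 0` and interpolates smoothly in `theta`, gradients with respect to `theta` are well defined, which is the differentiability property the abstract highlights.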