
SA-DQAS: Self-attention Enhanced Differentiable Quantum Architecture Search

Yize Sun, Jiarui Liu, Zixin Wu, Volker Tresp, Yunpu Ma · June 13, 2024

Physics


Abstract

We introduce SA-DQAS, a novel framework that enhances Differentiable Quantum Architecture Search (DQAS) by integrating a self-attention mechanism, enabling more effective quantum circuit design for variational quantum algorithms. Unlike DQAS, which treats placeholders independently, SA-DQAS captures inter-placeholder dependencies to improve architecture learning. We evaluate SA-DQAS across multiple tasks, including MaxCut, the Job-Shop Scheduling Problem (JSSP), quantum chemistry simulation, and error mitigation. Experimental results show that SA-DQAS outperforms baselines and prior QAS methods in most cases, producing architectures with better stability, convergence, and noise resilience. To assess scalability and hardware readiness, we further test SA-DQAS-generated circuits on IBM's quantum device using the MaxCut problem. Circuits trained on small graphs are stacked to solve larger instances without retraining, demonstrating generalization to real hardware and larger problem sizes. Our results suggest that SA-DQAS not only improves circuit quality during training but also enables practical deployment on near-term quantum devices. This research represents the first successful integration of a self-attention mechanism with DQAS.
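The abstract's core idea — self-attention letting the architecture parameters of different circuit placeholders influence one another, rather than being optimized independently as in DQAS — can be illustrated with a minimal sketch. The shapes, weight names, and single-head formulation below are assumptions for illustration, not the paper's actual implementation: each of the K placeholders carries a learnable embedding, self-attention mixes information across placeholders, and a final projection yields per-placeholder logits over candidate gates, which a softmax turns into the differentiable operation distribution DQAS-style methods optimize.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_arch_logits(alpha, Wq, Wk, Wv, Wo):
    """Single-head self-attention over placeholder embeddings (illustrative).

    alpha: (K, d) learnable embeddings, one row per circuit placeholder.
    Returns (K, n_ops) logits; softmax over the last axis gives each
    placeholder's differentiable distribution over candidate gates.
    """
    Q, Kmat, V = alpha @ Wq, alpha @ Wk, alpha @ Wv
    d = Q.shape[-1]
    # (K, K) attention: how strongly each placeholder attends to the others
    attn = softmax(Q @ Kmat.T / np.sqrt(d), axis=-1)
    ctx = attn @ V        # context-aware placeholder representations
    return ctx @ Wo       # project to logits over the gate candidate pool

# Hypothetical sizes: 4 placeholders, embedding dim 8, 5 candidate gates.
rng = np.random.default_rng(0)
K, d, n_ops = 4, 8, 5
alpha = rng.normal(size=(K, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Wo = rng.normal(size=(d, n_ops))
probs = softmax(self_attention_arch_logits(alpha, Wq, Wk, Wv, Wo))
print(probs.shape)  # (4, 5): one gate distribution per placeholder
```

In plain DQAS each placeholder's distribution would come from its own row of `alpha` alone; the attention step is what couples the rows, which is the dependency-capturing behavior the abstract attributes to SA-DQAS.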
