
RydbergGPT

David Fitzek, Y. H. Teoh, Hin Pok Fung, Gebremedhin A. Dagnew, Ejaaz Merali, M. Moss, Benjamin MacLellan, R. Melko · May 31, 2024 · DOI: 10.1088/2632-2153/ae1d0b
Physics · Computer Science


Abstract

We introduce a generative pretrained transformer (GPT) designed to learn the measurement outcomes of a neutral atom array quantum computer. Based on a vanilla transformer, our encoder–decoder architecture takes as input the interacting Hamiltonian, and outputs an autoregressive sequence of qubit measurement probabilities. Its performance is studied in the vicinity of a quantum phase transition in Rydberg atoms in a square lattice array. We explore the model's generalization capabilities by demonstrating that it can accurately predict ground-state measurement outcomes for Hamiltonian parameter values that were not included in the training data. We evaluate three model variants, each trained for a fixed duration on a single NVIDIA A100 GPU, by examining their predictions of key physical observables. These results establish performance benchmarks for scaling to larger RydbergGPT models in the future. Finally, we release RydbergGPT as open-source software to facilitate the development of foundation models for diverse quantum computing platforms and datasets.
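The abstract describes an encoder–decoder transformer that conditions on Hamiltonian parameters and autoregressively models per-qubit measurement outcomes. The following is a minimal sketch of that idea in PyTorch, not the paper's actual implementation: the model sizes, the choice of a four-component Hamiltonian parameter vector, and all class and method names here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class RydbergGPTSketch(nn.Module):
    """Hypothetical minimal encoder-decoder sketch: the encoder ingests a
    Hamiltonian parameter vector; the decoder autoregressively samples one
    binary measurement outcome per atom (qubit)."""

    def __init__(self, n_qubits, d_model=32, nhead=4, n_layers=2):
        super().__init__()
        self.n_qubits = n_qubits
        # Assumed 4 Hamiltonian parameters (e.g. Rabi frequency, detuning,
        # blockade radius, inverse temperature) -- an illustrative choice.
        self.ham_proj = nn.Linear(4, d_model)
        self.tok_emb = nn.Embedding(3, d_model)  # tokens: 0, 1, start (=2)
        self.pos_emb = nn.Embedding(n_qubits + 1, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            dim_feedforward=4 * d_model, batch_first=True)
        self.head = nn.Linear(d_model, 2)  # logits for outcomes 0/1

    @torch.no_grad()
    def sample(self, ham_params):
        """ham_params: (batch, 4) tensor of Hamiltonian parameters.
        Returns (batch, n_qubits) tensor of sampled 0/1 outcomes."""
        b = ham_params.shape[0]
        memory_in = self.ham_proj(ham_params).unsqueeze(1)  # (b, 1, d)
        seq = torch.full((b, 1), 2, dtype=torch.long)       # start token
        outcomes = []
        for _ in range(self.n_qubits):
            pos = torch.arange(seq.shape[1]).unsqueeze(0)
            tgt = self.tok_emb(seq) + self.pos_emb(pos)
            mask = nn.Transformer.generate_square_subsequent_mask(seq.shape[1])
            h = self.transformer(memory_in, tgt, tgt_mask=mask)
            probs = torch.softmax(self.head(h[:, -1]), dim=-1)
            nxt = torch.multinomial(probs, 1)  # sample next qubit outcome
            outcomes.append(nxt)
            seq = torch.cat([seq, nxt], dim=1)
        return torch.cat(outcomes, dim=1)

model = RydbergGPTSketch(n_qubits=4)
shots = model.sample(torch.zeros(2, 4))  # (2, 4) tensor of 0/1 outcomes
```

Conditioning the encoder on the Hamiltonian is what enables the generalization the abstract reports: at inference time, previously unseen parameter values can be fed in, and the decoder produces measurement samples for them.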
