
Attention in Krylov Space

Zihao Qi, Christopher Earls · January 12, 2026
Quantum Physics · cond-mat.stat-mech

Abstract

The Universal Operator Growth Hypothesis formulates the time evolution of operators in terms of their Lanczos coefficients. In practice, however, numerical instability and memory cost limit the number of coefficients that can be computed exactly. The standard response to these limitations is to fit the early coefficients to asymptotic forms, but such fits can miss subleading, history-dependent structure in the coefficients that subsequently distorts reconstructed observables. In this work, we treat the Lanczos coefficients as a causal time sequence and introduce a transformer-based model that autoregressively predicts future Lanczos coefficients from short prefixes. For both classical and quantum systems, our machine-learning model outperforms asymptotic fits in both coefficient extrapolation and physical-observable reconstruction, achieving an order-of-magnitude reduction in error. Our model also transfers across system sizes: it can be trained on smaller systems and then used to extrapolate coefficients for a larger system without retraining. By probing the learned attention patterns and performing targeted attention ablations, we identify which portions of the coefficient history are most influential for accurate forecasts.
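For context, the Lanczos coefficients in question come from the recursion method: orthonormalizing the Krylov basis generated by repeatedly applying the Liouvillian L = [H, ·] to a seed operator. The following is a minimal dense-matrix sketch in Python (NumPy assumed; the function name and the infinite-temperature inner product are illustrative choices, not the authors' code). It also makes the instability concrete: every step divides by a norm that finite precision gradually corrupts, which is why only a short prefix can be computed exactly.

import numpy as np

def lanczos_coefficients(H, O, n_max):
    # Lanczos coefficients b_1..b_n for operator O evolving under Hamiltonian H,
    # using the infinite-temperature inner product (A, B) = Tr(A^dag B) / dim.
    # Sketch only: production calculations need re-orthogonalization and
    # extended precision to control the instability noted in the abstract.
    dim = H.shape[0]
    inner = lambda A, B: np.trace(A.conj().T @ B) / dim
    norm = lambda A: np.sqrt(inner(A, A).real)
    liouville = lambda X: H @ X - X @ H    # L X = [H, X]

    O_prev = O / norm(O)                   # O_0, normalized seed operator
    A = liouville(O_prev)
    bs = [norm(A)]                         # b_1
    O_curr = A / bs[-1]
    for _ in range(n_max - 1):
        A = liouville(O_curr) - bs[-1] * O_prev   # three-term recursion
        b = norm(A)
        if b < 1e-10:                      # Krylov space exhausted
            break
        bs.append(b)
        O_prev, O_curr = O_curr, A / b
    return bs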
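The forecasting step can then be sketched as a causally masked transformer that embeds each scalar coefficient as a token and regresses the next one, so a short exactly computed prefix is extrapolated autoregressively. All architecture choices below (PyTorch, the LanczosForecaster and extrapolate names, layer sizes) are assumptions for illustration, not the paper's configuration.

import torch
import torch.nn as nn

class LanczosForecaster(nn.Module):
    # Hypothetical model: treat b_1, b_2, ... as a causal time series and
    # predict b_{n+1} from the prefix b_1..b_n at every position.
    def __init__(self, d_model=64, nhead=4, num_layers=2, max_len=512):
        super().__init__()
        self.embed = nn.Linear(1, d_model)          # scalar coefficient -> token
        self.pos = nn.Embedding(max_len, d_model)   # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)           # regression output

    def forward(self, b_seq):                       # b_seq: (batch, n)
        n = b_seq.size(1)
        x = self.embed(b_seq.unsqueeze(-1)) + self.pos(
            torch.arange(n, device=b_seq.device))
        mask = nn.Transformer.generate_square_subsequent_mask(n).to(b_seq.device)
        h = self.encoder(x, mask=mask)              # causal self-attention
        return self.head(h).squeeze(-1)             # next-coefficient forecasts

@torch.no_grad()
def extrapolate(model, prefix, n_future):
    # Feed the exactly computed prefix, append each forecast, repeat.
    seq = prefix.clone()                            # (1, n_prefix)
    for _ in range(n_future):
        next_b = model(seq)[:, -1:]                 # last-position prediction
        seq = torch.cat([seq, next_b], dim=1)
    return seq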
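The targeted attention ablations the abstract mentions could be probed, under the sketch above, by silencing one head at inference time and comparing forecasts before and after; one assumed implementation (ablate_head is hypothetical, not the authors' procedure) zeroes the output-projection columns that carry that head's contribution.

import torch

@torch.no_grad()
def ablate_head(model, layer_idx, head_idx):
    # Zero the output-projection slice belonging to one attention head,
    # removing its contribution so its influence on the forecast can be measured.
    attn = model.encoder.layers[layer_idx].self_attn
    d_head = attn.embed_dim // attn.num_heads
    attn.out_proj.weight[:, head_idx * d_head:(head_idx + 1) * d_head].zero_()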
