DysonNet: Constant-Time Local Updates for Neural Quantum States
Abstract
Neural quantum states (NQS) provide a flexible variational framework for many-body wavefunctions, but suffer from high computational cost and limited interpretability. We introduce DysonNet, a broad class of NQS that couples strictly local nonlinearities through global linear layers. This structure is analogous to a truncated Dyson series, giving an intuitive interpretation of local wavefunction updates as scattering off static impurities. By resumming the scattering series, single-spin-flip updates can be computed in $\mathcal{O}(1)$ time, independent of system size, using an algorithm we call ABACUS. Implementing DysonNet with the state-space model S4, we obtain up to $230\times$ speedups over Vision Transformers for computing the local estimator. This corresponds to an asymptotic $\mathcal{O}(N^2)$ improvement in training-time scaling, reaching $\mathcal{O}(N \log^2 N)$ total training complexity in area-law phases. Benchmarks on the 1D long-range Ising model and frustrated $J_1$-$J_2$ chains show that DysonNet matches state-of-the-art NQS accuracy while removing the dominant local-update overhead. More broadly, our results suggest a route to scalable NQS architectures where physical interpretability directly enables computational efficiency.
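To give intuition for why an architecture built from global linear layers admits cheap local updates, here is a minimal sketch of the generic cached-activation trick for single-spin-flip updates. This is an illustrative standard technique, not the paper's ABACUS algorithm (which resums the scattering series to reach $\mathcal{O}(1)$ cost); all names and shapes below are hypothetical.

```python
import numpy as np

# Illustrative sketch (NOT the paper's ABACUS algorithm): the classic
# cached-activation trick for a network whose first layer is linear,
# psi(s) = g(W @ s). Flipping one spin changes the cached linear-layer
# output by a rank-1 correction, so the update cost does not depend on
# the number of spins N (only on the hidden width M).

rng = np.random.default_rng(0)
N, M = 64, 16                      # hypothetical sizes: spins, hidden units
W = rng.normal(size=(M, N))        # global linear layer
s = rng.choice([-1.0, 1.0], size=N)

z = W @ s                          # cached pre-activations, O(M*N) once

def flip_update(z, s, i):
    """Update the cache after flipping spin i: cost O(M), independent of N."""
    z_new = z - 2.0 * s[i] * W[:, i]   # rank-1 correction from the flip
    s_new = s.copy()
    s_new[i] = -s_new[i]
    return z_new, s_new

z2, s2 = flip_update(z, s, 3)
# Cached update agrees with full recomputation of the linear layer.
print(np.allclose(z2, W @ s2))
```

The same caching idea underlies fast Metropolis sampling in restricted Boltzmann machine NQS; the abstract's claim is that DysonNet's local-nonlinearity structure pushes this further, to update cost that is constant in system size.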