Neural network backflow for ab-initio solid calculations
Abstract
Accurately simulating extended periodic systems is a central challenge in condensed matter physics. Neural quantum states (NQS) offer expressive wavefunctions for this task but face scalability challenges. In this work, we successfully extend the neural network backflow (NNBF) approach to ab-initio solid-state materials. Building on our scalable optimization framework for molecules [Liu et al., PRB 112, 155162 (2025)], we introduce a two-stage pruning strategy to manage the massive expansion of the configuration space: by utilizing a computationally cheap, physics-informed importance proxy, we devote exact NNBF amplitude evaluations solely to the most relevant determinants, significantly improving optimization efficiency, energy estimation, and convergence. Our framework achieves state-of-the-art accuracy across diverse solid-state benchmarks. For 1D hydrogen chains, NNBF matches or surpasses DMRG and AFQMC, remains robust in strongly correlated bond-breaking regimes where coupled-cluster methods fail, and smoothly extrapolates to the thermodynamic limit. We further demonstrate its scalability by computing ground-state potential energy curves for 2D graphene and 3D silicon. Finally, ablation studies validate the computational savings of our pruning strategy and highlight the dependence of the NNBF energies on basis sets.
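The two-stage pruning idea described above can be illustrated with a minimal sketch: rank configurations by a cheap proxy score, then spend the expensive exact amplitude evaluations only on the top-ranked survivors. The function and the placeholder `proxy` and `exact` callables below are illustrative assumptions, not the paper's actual proxy or NNBF network.

```python
import numpy as np

def prune_configurations(configs, proxy_score, exact_amplitude, keep_fraction=0.1):
    """Two-stage pruning sketch (hypothetical interface).

    Stage 1: score every configuration with a cheap, physics-informed proxy.
    Stage 2: evaluate the expensive exact amplitude only on the top fraction.
    """
    # Stage 1: cheap proxy score for each configuration.
    scores = np.array([proxy_score(c) for c in configs])
    # Keep the highest-scoring configurations.
    n_keep = max(1, int(keep_fraction * len(configs)))
    top = np.argsort(scores)[::-1][:n_keep]
    # Stage 2: exact (expensive) amplitude evaluation on the survivors only.
    return {int(i): exact_amplitude(configs[i]) for i in top}

# Toy usage with stand-in scoring functions (purely illustrative):
rng = np.random.default_rng(0)
configs = [rng.integers(0, 2, size=8) for _ in range(100)]
proxy = lambda c: -c.sum()           # stand-in for a cheap importance proxy
exact = lambda c: np.exp(-c.sum())   # stand-in for an exact NNBF amplitude
amps = prune_configurations(configs, proxy, exact, keep_fraction=0.05)
print(len(amps))  # 5 configurations evaluated exactly
```

With `keep_fraction=0.05`, only 5 of the 100 configurations receive an exact evaluation, which mirrors how the pruning strategy concentrates network evaluations on the most relevant determinants.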