(Nearly) Optimal Time-dependent Hamiltonian Simulation
Abstract
We describe a simple quantum algorithm for simulating time-dependent Hamiltonians, extending the methodology of quantum signal processing. The framework achieves optimal scaling, up to some factor, with respect to all other parameters, and nearly optimal scaling in the inverse of the error tolerance, which can be improved to optimal scaling under certain input models. As applications, we discuss the simulation of generalized lattice systems and of time-periodic (Floquet) systems, showing that our framework provides a neater yet highly efficient solution, achieving optimal or nearly optimal scaling in all parameters. In particular, our method also paves a new way for studying phase transitions on quantum computers, extending the reach of quantum simulation.
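For orientation, the time-dependent simulation task referred to above can be stated in standard form; the notation below is generic (not drawn from the paper itself):

```latex
% Given a time-dependent Hamiltonian H(t) on [0, T] and an error
% tolerance \epsilon, the task is to produce a quantum circuit V with
%   \| V - U(T) \| \le \epsilon,
% where U(T) is the time-ordered evolution operator
\[
  U(T) \;=\; \mathcal{T} \exp\!\left( -i \int_0^T H(s)\, \mathrm{d}s \right).
\]
% The "inverse of the error tolerance" in the stated scaling refers
% to the dependence of the circuit cost on 1/\epsilon.
```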