A Grid-Based Quantum Algorithm for the Time-Dependent Simulation of Infrared Spectra
Abstract
We develop a time-dependent, grid-based framework for simulating infrared spectra that is specifically designed for quantum computers. The proposed circuit employs a probabilistic strategy for applying the non-unitary dipole operator and an Split Operator-Quantum Fourier Transform time evolution scheme. Using a vibrational model of the water molecule as a test system, our classical emulation results demonstrate accurate determination of fundamental and overtone band positions and intensities via Fourier-transformed dipole-dipole autocorrelation functions. We also identify the optimal time parameters that minimise gate depths while maintaining high fidelity. For further resource reduction, we validate the feasibility of utilising harmonic oscillator approximations in state preparation and dipole operator truncations. With its scalability to higher-dimensional normal mode spaces, this wavefunction-based approach establishes a robust foundation for studying IR spectra on future quantum hardware.