Quantum Estimation of Delay Tail Probabilities in Scheduling and Load Balancing
Abstract
Estimating delay tail probabilities in scheduling and load balancing systems is a critical but computationally prohibitive task due to the rarity of violation events. Quantum Amplitude Estimation (QAE) offers a generic quadratic reduction in sample complexity (1/sqrt(p) versus 1/p samples to estimate a probability p), but applying it to steady-state queueing networks is challenging: classical simulations involve unbounded state spaces and random regeneration cycles, whereas quantum circuits have fixed depth and finite registers. In this paper, we develop a framework for quantum estimation of delay tail probabilities based on truncated regenerative simulation. We show that regenerative rare-event estimators can be reformulated as deterministic, reversible functions of finite random seeds by truncating regeneration cycles. To control the resulting bias, we use Lyapunov drift and concentration arguments to derive exponential tail bounds on regeneration times. These bounds allow the truncation horizon, and hence the quantum circuit depth, to be chosen so that the bias is provably negligible compared to the statistical error. The proposed framework enables quantum estimation in models with countably infinite state spaces, avoiding the challenge of determining a sufficient mixing time for direct finite-horizon simulation. We provide bounds on qubit and circuit complexity for a GI/GI/1 queue, a wireless network under MaxWeight scheduling, and a multi-server system with Join-the-Shortest-Queue (JSQ) routing.
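To make the classical side of the construction concrete, the following is a minimal sketch (not from the paper) of a truncated regenerative estimator for the delay tail probability P(W > b) in an M/M/1 queue, a special case of the GI/GI/1 model. Cycles regenerate whenever a customer arrives to an empty queue, waiting times follow the Lindley recursion, and each cycle is cut off at a truncation horizon T, mirroring the fixed-depth circuit the abstract describes. All parameter names and the function name are illustrative choices, not the authors' notation.

```python
import random


def truncated_regen_estimate(lam=0.8, mu=1.0, b=5.0, T=200,
                             n_cycles=2000, seed=0):
    """Estimate the steady-state tail probability P(W > b) of the
    waiting time in an M/M/1 queue via regenerative simulation,
    with each regeneration cycle truncated at horizon T."""
    rng = random.Random(seed)
    hits_total = 0   # customers across all cycles with wait > b
    len_total = 0    # total customers counted (ratio estimator denominator)
    for _ in range(n_cycles):
        w = 0.0                  # cycle starts at an empty queue (W = 0)
        hits, length = 0, 0
        while True:
            # Lindley recursion: W_{n+1} = max(0, W_n + S_n - A_n)
            s = rng.expovariate(mu)    # service time of customer n
            a = rng.expovariate(lam)   # interarrival time to customer n+1
            w = max(0.0, w + s - a)
            length += 1
            hits += (w > b)
            # stop at regeneration (empty queue) or at the truncation horizon;
            # truncation introduces the bias the paper's Lyapunov bounds control
            if w == 0.0 or length >= T:
                break
        hits_total += hits
        len_total += length
    # regenerative ratio estimator: E[hits per cycle] / E[cycle length]
    return hits_total / len_total
```

For M/M/1 the exact tail is P(W > b) = rho * exp(-(mu - lam) * b) with rho = lam/mu, which can be used to sanity-check the estimator; since each sample costs O(1/p) work for a tail of probability p, this classical baseline is exactly where QAE's quadratic improvement applies.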