Quantum Brain

Trade-offs between Quantum and Classical Resources in the Linear Combination of Unitaries

Kaito Wada, Hiroyuki Harada, Yasunari Suzuki, Yuuki Tokunaga, Naoki Yamamoto, Suguru Endo·December 6, 2025
Quantum Physics

Abstract

The randomized linear combination of unitaries (LCU) method, which has many applications in early fault-tolerant quantum computing algorithms, has recently been proposed. This quantum algorithm computes the same expectation values as the original, fully coherent LCU algorithm using a shallower quantum circuit with a single ancilla qubit, at the cost of a quadratically larger sampling overhead. In this work, we propose a quantum algorithm intermediate between the original and randomized LCU that manages the trade-off between the sampling overhead and circuit complexity. Our algorithm divides the set of unitary operators into several groups and then randomly samples LCU circuits from these groups to evaluate the target expectation value. Notably, we reveal that across all grouping strategies, the mechanism of the sampling-overhead reduction can be characterized solely by a metric we call the reduction factor. Moreover, we analytically prove an underlying monotonicity of the reduction factor in the group size: larger group sizes entail smaller sampling overhead. Finally, our framework enables more flexible algorithmic design by systematically yielding intermediate implementations of LCU-based algorithms; we provide intermediate implementations of non-Hermitian dynamics simulation, ground-state property estimation, and quantum error detection. In addition, we demonstrate this principle by deriving intermediate trade-off scalings in sample complexity and ancilla space for a quantum linear system solver.
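The grouping idea in the abstract can be illustrated with a small classical simulation. The sketch below is an assumption-laden toy, not the paper's construction: it decomposes a single-qubit operator as A = Σ_j c_j U_j, partitions the index set into groups, samples a group g with probability proportional to the 1-norm λ_g of its coefficients, and stands in for the coherent sub-LCU block by computing the group's expectation value exactly. All operators, coefficients, and function names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decomposition A = sum_j c_j U_j over single-qubit unitaries.
# Operators and coefficients are illustrative, not from the paper.
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
unitaries = [I, X, Z, X @ Z]
coeffs = np.array([0.5, 0.3, -0.4, 0.2])

# Random normalized state |psi>.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

def expval(op):
    """Re <psi| op |psi> (the quantity a Hadamard test would estimate)."""
    return float(np.real(np.conj(psi) @ op @ psi))

A = sum(c * U for c, U in zip(coeffs, unitaries))
exact = expval(A)  # target expectation value

def grouped_lcu_estimate(groups, n_samples):
    """Intermediate LCU (toy version): sample a group g with probability
    lambda_g / lambda, where lambda_g is the 1-norm of that group's
    coefficients, then evaluate the group's sub-LCU A_g 'coherently'
    (here: exactly). Returns the (mean, variance) of the estimator."""
    lam_g = np.array([np.sum(np.abs(coeffs[list(g)])) for g in groups])
    lam = lam_g.sum()  # overall 1-norm; lam**2 governs the sampling overhead
    samples = []
    for _ in range(n_samples):
        g = rng.choice(len(groups), p=lam_g / lam)
        A_g = sum(coeffs[j] * unitaries[j] for j in groups[g])
        samples.append(lam * expval(A_g) / lam_g[g])  # unbiased for `exact`
    return float(np.mean(samples)), float(np.var(samples))
```

With a single group the estimator reproduces the fully coherent LCU value with zero variance; with singleton groups it reduces to the fully randomized scheme; coarser groupings interpolate between the two, and, consistent with the monotonicity result stated in the abstract, the single-shot variance shrinks as groups grow.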
