Quantum Brain

Improved Lower Bounds for Learning Quantum Channels in Diamond Distance

Aadil Oufkir, Filippo Girardi·January 7, 2026
Quantum Physics · Mathematical Physics


Abstract

We prove that learning an unknown quantum channel with input dimension $d_A$, output dimension $d_B$, and Choi rank $r$ to diamond distance $\varepsilon$ requires $\Omega\!\left( \frac{d_A d_B r}{\varepsilon \log(d_B r / \varepsilon)} \right)$ channel queries when $d_A = r d_B$, and $\Omega\!\left( \frac{d_A d_B r}{\varepsilon^2 \log(d_B r / \varepsilon)} \right)$ channel queries when $d_A \le r d_B / 2$. These lower bounds improve upon the best previous $\Omega(d_A d_B r)$ bound by introducing explicit, near-optimal $\varepsilon$-dependence. Moreover, when $d_A \le r d_B / 2$, the lower bound is optimal up to a logarithmic factor. The proof constructs ensembles of channels that are well separated in diamond norm yet admit Stinespring isometries that are close in operator norm.
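To make the scaling of the two stated bounds concrete, here is a minimal numeric sketch. The function name and the convention of dropping the constant hidden by the $\Omega(\cdot)$ notation are my own; it simply evaluates the leading-order expressions from the abstract in the two parameter regimes they cover.

```python
import math

def query_lower_bound(d_A: int, d_B: int, r: int, eps: float) -> float:
    """Leading-order term of the diamond-distance learning lower bound.

    Evaluates d_A * d_B * r / (eps * log(d_B * r / eps)) when d_A = r * d_B,
    and the 1/eps^2 variant when d_A <= r * d_B / 2, with the Omega-hidden
    constant set to 1. Other parameter regimes are not covered by the paper's
    stated bounds, so they raise an error here.
    """
    log_term = math.log(d_B * r / eps)
    if d_A == r * d_B:
        return d_A * d_B * r / (eps * log_term)
    elif 2 * d_A <= r * d_B:
        return d_A * d_B * r / (eps ** 2 * log_term)
    else:
        raise ValueError("parameter regime not covered by the stated bounds")

# Example: for d_B = 2, r = 2, eps = 0.1, the 1/eps^2 regime (d_A = 2)
# dominates the 1/eps regime (d_A = 4), as expected for small eps.
print(query_lower_bound(4, 2, 2, 0.1))  # d_A = r * d_B regime
print(query_lower_bound(2, 2, 2, 0.1))  # d_A <= r * d_B / 2 regime
```

The extra $1/\varepsilon$ factor in the second regime is visible already at $\varepsilon = 0.1$: the bound for the smaller input dimension exceeds the one for the larger input dimension.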
