Quantum Brain

Toward bootstrapping tensor-network contractions

Seishiro Ono, Yanbai Zhang, Hoi Chun Po · March 18, 2026

cond-mat.str-el · Quantum Physics


Abstract

Accurate contraction of tensor networks beyond one dimension is essential in various fields, including quantum many-body physics. Existing approaches typically rely on approximate contraction schemes and do not provide certified error bars. We introduce a numerical bootstrap framework that casts tensor-network contraction as a convex optimization problem, thereby yielding certified lower and upper bounds on expectation values of physical observables. As a proof of principle, we construct such constraints explicitly for translationally invariant matrix product states and demonstrate that, assuming a canonical form, second-order-cone relaxation can provide tight bounds on the contraction result. We further demonstrate that when the requirement of canonical form is lifted, a more general semidefinite-programming approach can yield similarly tight bounds at a higher but still polynomial computational cost. Our work suggests that numerical bootstrap could be a way forward for the rigorous contraction of tensor networks.
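To make the target quantity concrete, the following sketch computes the exact expectation value of a single-site observable for a translationally invariant matrix product state via the dominant eigenvector of its transfer matrix. This is the standard contraction the paper's bootstrap would bound, not the authors' convex-optimization method; the tensor, bond dimension, and observable here are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: exact contraction of a translation-invariant MPS.
# A[s, a, b] is the site tensor with physical index s and bond indices a, b.
rng = np.random.default_rng(0)
d, D = 2, 3                          # physical and bond dimensions (arbitrary)
A = rng.standard_normal((d, D, D))

# Transfer matrix E = sum_s A[s] (x) conj(A[s]), acting on the D^2-dim bond space.
E = sum(np.kron(A[s], A[s].conj()) for s in range(d))

# Dominant right and left eigenvectors encode the infinite-chain environment.
evals, evecs = np.linalg.eig(E)
k = np.argmax(np.abs(evals))
lam, r = evals[k], evecs[:, k]
evalsL, evecsL = np.linalg.eig(E.T)
l = evecsL[:, np.argmax(np.abs(evalsL))]

# Observable-dressed transfer matrix for a Pauli-Z-like single-site operator.
O = np.diag([1.0, -1.0])
EO = sum(O[s, t] * np.kron(A[s], A[t].conj()) for s in range(d) for t in range(d))

# <O> = (l | E_O | r) / (lam * (l | r)); arbitrary eigenvector phases cancel.
expval = ((l @ EO @ r) / (lam * (l @ r))).real
print(expval)
```

Because the MPS defines a normalized state after dividing by the dominant eigenvalue, the result lies in the operator's spectral range ([-1, 1] here); the bootstrap framework described above would instead return certified lower and upper bounds enclosing this value.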
