Quantum Brain

Convex semidefinite tensor optimization and quantum entanglement

Liding Xu, Ye-Chao Liu, Sebastian Pokutta · November 7, 2025
math.OC · quant-ph (Quantum Physics)

Abstract

The cone of positive-semidefinite (PSD) matrices is fundamental in convex optimization, and we extend this notion to tensors, defining PSD tensors, which correspond to separable quantum states. We study the convex optimization problem over the PSD tensor cone. While this convex cone admits a smooth reparameterization through tensor factorizations (analogous to the matrix case), it is not self-dual. Moreover, there are currently no efficient algorithms for projecting onto or testing membership in this cone, and the semidefinite tensor optimization problem, although convex, is NP-hard. To address these challenges, we develop methods for computing lower and upper bounds on the optimal value of the problem. We propose a general-purpose iterative refinement algorithm that combines a lifted alternating direction method of multipliers with a cutting-plane approach. This algorithm exploits PSD tensor factorizations to produce heuristic solutions and refine the solutions using cutting planes. Since the method requires a linear minimization oracle over PSD tensors, we design a spatial branch-and-bound algorithm based on convex relaxations and valid inequalities. Our framework allows us to study the white-noise mixing threshold, which characterizes the entanglement properties of quantum states. Numerical experiments on benchmark instances demonstrate the effectiveness of the proposed methods.
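The white-noise mixing threshold mentioned above can be illustrated on the simplest textbook case. The paper's own algorithms (lifted ADMM, cutting planes, spatial branch-and-bound) are not reproduced here; instead, the sketch below uses the standard positive-partial-transpose (PPT) criterion, which for two qubits exactly characterizes separability, to locate the threshold for a Bell state mixed with white noise. All names (`partial_transpose`, `min_eig_pt`) are illustrative choices, not from the paper.

```python
import numpy as np

def partial_transpose(rho, d=2):
    """Partial transpose over the second subsystem of a (d*d)x(d*d) density matrix."""
    r = rho.reshape(d, d, d, d)          # indices r[i, j, k, l] = rho[d*i + j, d*k + l]
    return r.transpose(0, 3, 2, 1).reshape(d * d, d * d)  # swap the two "B" indices j, l

# Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
bell = np.outer(phi, phi)

def min_eig_pt(p):
    """Smallest eigenvalue of the partial transpose of the white-noise-mixed state
    rho(p) = p |Phi+><Phi+| + (1 - p) I/4; negative iff rho(p) is entangled (PPT)."""
    rho = p * bell + (1 - p) * np.eye(4) / 4
    return np.linalg.eigvalsh(partial_transpose(rho)).min()
```

Analytically the smallest eigenvalue is `(1 - 3p)/4`, so the mixing threshold sits at `p = 1/3`: for larger `p` the state is entangled, for smaller `p` it is separable. For higher-dimensional or multipartite states PPT is only necessary, which is where optimization over the PSD tensor cone, as studied in the paper, becomes relevant.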
