Quantum Brain

Efficient Calculation of the Maximal Rényi Divergence for a Matrix Product State via Generalized Eigenvalue Density Matrix Renormalization Group

Uri Levin, Noa Feldman, Moshe Goldstein · January 5, 2026
Quantum Physics (quant-ph) · cond-mat.str-el


Abstract

The study of quantum and classical correlations between subsystems is fundamental to understanding many-body physics. In quantum information theory, the quantum mutual information, $I(A;B)$, is a measure of the correlation between subsystems $A,B$ in a quantum state, defined by means of the von Neumann entropy: $I\left(A;B\right)=S\left(\rho_{A}\right)+S\left(\rho_{B}\right)-S\left(\rho_{AB}\right)$. However, such a computation requires an exponential amount of resources. This is a defining feature of quantum systems, the infamous ``curse of dimensionality''. Alternative measures, based on Rényi divergences instead of the von Neumann entropy, were suggested in a recent paper and shown to possess important theoretical features, making them leading candidates as mutual information measures. In this work, we concentrate on the maximal Rényi divergence. This measure can be shown to be the solution of a generalized eigenvalue problem. To calculate it efficiently for a 1D state represented as a matrix product state, we develop a generalized eigenvalue version of the density matrix renormalization group algorithm. We benchmark our method for the paradigmatic XXZ chain, and show that the maximal Rényi divergence may exhibit different trends than the von Neumann mutual information.
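The two quantities the abstract contrasts can be illustrated on a dense two-qubit example. This is only a minimal numerical sketch, not the paper's MPS/DMRG algorithm (which avoids exactly this exponential dense representation); the function names are ours, and we use the standard characterization of the max-relative entropy $D_{\max}(\rho\|\sigma)$ as the logarithm of the largest generalized eigenvalue of the pencil $\rho v = \lambda \sigma v$ (for full-rank $\sigma$):

```python
import numpy as np
from scipy.linalg import eigh

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho ln rho], computed from the eigenvalues (natural log)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

def mutual_information(rho_AB, dA, dB):
    """I(A;B) = S(rho_A) + S(rho_B) - S(rho_AB) for a bipartite density matrix."""
    r = rho_AB.reshape(dA, dB, dA, dB)
    rho_A = np.einsum('ijkj->ik', r)      # partial trace over B
    rho_B = np.einsum('ijik->jk', r)      # partial trace over A
    return (von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B)
            - von_neumann_entropy(rho_AB))

def d_max(rho, sigma):
    """Max-relative entropy: log of the largest generalized eigenvalue
    of rho v = lambda sigma v (requires sigma positive definite)."""
    lam = eigh(rho, sigma, eigvals_only=True)   # ascending order
    return float(np.log(lam[-1]))

# Example: a two-qubit Bell state, rho_AB = |psi><psi|.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
rho_AB = np.outer(psi, psi)

I_AB = mutual_information(rho_AB, 2, 2)        # 2 ln 2 for a Bell pair
sigma = np.kron(np.eye(2) / 2, np.eye(2) / 2)  # rho_A (x) rho_B = I/4 here
D = d_max(rho_AB, sigma)                       # log 4 = 2 ln 2
```

Both routines diagonalize the full $d_A d_B \times d_A d_B$ matrix, which is exactly the exponential cost the paper's generalized-eigenvalue DMRG is designed to avoid for matrix product states.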
