
Quantum algorithms for training Gaussian Processes

Zhikuan Zhao, Jack K. Fitzsimons, Michael A. Osborne, S. Roberts, J. Fitzsimons · March 28, 2018 · DOI: 10.1103/PhysRevA.100.012304
Physics · Computer Science · Mathematics

Abstract

Gaussian processes (GPs) are important models in supervised machine learning. Training a Gaussian process refers to selecting the covariance function and its associated parameters in order to improve the quality of predictions, the core of which amounts to evaluating the logarithm of the marginal likelihood ($\mathrm{LML}$) of a given model. The $\mathrm{LML}$ gives a concrete measure of the quality of prediction that a GP model is expected to achieve. The classical computation of the $\mathrm{LML}$ typically carries a polynomial time overhead with respect to the input size. We propose a quantum algorithm that computes the logarithm of the determinant of a Hermitian matrix, which runs in logarithmic time for sparse matrices. This is applied in conjunction with a variant of the quantum linear system algorithm that allows for logarithmic-time computation of the form $\mathbf{y}^{T}A^{-1}\mathbf{y}$, where $\mathbf{y}$ is a dense vector and $A$ is the covariance matrix. We hence show that quantum computing can be used to estimate the $\mathrm{LML}$ of a GP with exponentially improved efficiency under certain conditions.
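To make the quantity being accelerated concrete: for a GP with covariance matrix $K$ (the matrix $A$ above) and targets $\mathbf{y}$, the log marginal likelihood is $\mathrm{LML} = -\tfrac{1}{2}\mathbf{y}^{T}K^{-1}\mathbf{y} - \tfrac{1}{2}\log|K| - \tfrac{n}{2}\log 2\pi$, combining exactly the two terms the paper targets: the quadratic form and the log-determinant. Below is a minimal sketch of the classical baseline via a Cholesky factorisation, which costs $O(n^3)$ in general; the kernel choice (squared exponential) and its parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gp_log_marginal_likelihood(X, y, lengthscale=1.0, variance=1.0, noise=1e-2):
    """Classical evaluation of the GP log marginal likelihood:
        LML = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2*pi)
    Kernel and hyperparameters here are illustrative assumptions.
    """
    n = X.shape[0]
    # Squared-exponential covariance matrix plus noise jitter on the diagonal
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = variance * np.exp(-0.5 * sq_dists / lengthscale**2) + noise * np.eye(n)
    # Cholesky factorisation K = L L^T is the O(n^3) bottleneck that the
    # quantum algorithm aims to sidestep for sparse covariance matrices
    L = np.linalg.cholesky(K)
    # y^T K^{-1} y via two triangular solves (never form K^{-1} explicitly)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # log|K| = 2 * sum(log diag(L))
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * y @ alpha - 0.5 * logdet - 0.5 * n * np.log(2.0 * np.pi)
```

The Cholesky route is the standard classical approach; the paper's quantum subroutines replace the log-determinant and the quadratic-form solve with estimates obtained in time logarithmic in $n$ for sufficiently sparse, well-conditioned $K$.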
