Quantum Brain

An Improved Quantum Algorithm for Ridge Regression

Chao-Hua Yu, F. Gao, Q. Wen·July 29, 2017·DOI: 10.1109/TKDE.2019.2937491
Mathematics · Physics · Computer Science


Abstract

Ridge regression (RR) is an important machine learning technique which introduces a regularization hyperparameter $\alpha$ to ordinary multiple linear regression for analyzing data suffering from multicollinearity. In this paper, we present a quantum algorithm for RR, where the technique of parallel Hamiltonian simulation, which simulates a number of Hermitian matrices in parallel, is proposed and used to develop a quantum version of the $K$-fold cross-validation approach, which can efficiently estimate the predictive performance of RR. Our algorithm consists of two phases: (1) using quantum $K$-fold cross-validation to efficiently determine a good $\alpha$ with which RR can achieve good predictive performance, and then (2) generating a quantum state encoding the optimal fitting parameters of RR with such $\alpha$, which can be further utilized to predict new data. Since indefinite dense Hamiltonian simulation has been adopted as a key subroutine, our algorithm can efficiently handle non-sparse data matrices.
It is shown that our algorithm can achieve exponential speedup over its classical counterpart for (low-rank) data matrices with low condition numbers. However, when the condition numbers of the data matrices are large, corresponding to data matrices of full or approximately full rank, only a polynomial speedup can be achieved.
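For readers unfamiliar with the classical baseline the abstract refers to, the following is a minimal sketch (not the paper's quantum algorithm) of ridge regression with $K$-fold cross-validation to select $\alpha$, using the standard closed-form solution $w = (X^\top X + \alpha I)^{-1} X^\top y$; all names and the candidate $\alpha$ grid are illustrative:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution w = (X^T X + alpha * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def kfold_mse(X, y, alpha, K=5, seed=0):
    """Average held-out mean-squared error of ridge regression over K folds."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), K)
    errs = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        w = ridge_fit(X[train], y[train], alpha)
        errs.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errs))

# Synthetic data with deliberate multicollinearity between two columns.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
X[:, 4] = X[:, 3] + 0.01 * rng.normal(size=100)
y = X @ np.array([1.0, -2.0, 0.5, 3.0, 0.0]) + 0.1 * rng.normal(size=100)

# Phase 1: pick the alpha with the lowest cross-validated error.
alphas = [0.01, 0.1, 1.0, 10.0]
best_alpha = min(alphas, key=lambda a: kfold_mse(X, y, a))

# Phase 2: refit on the full data with the chosen alpha.
w_opt = ridge_fit(X, y, best_alpha)
```

This mirrors the two-phase structure described in the abstract, with the quantum subroutines (parallel Hamiltonian simulation, quantum state preparation of the fitting parameters) replaced by their classical equivalents.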
