Quantum gradient descent and Newton's method for constrained polynomial optimization

Bibliographic Details
Main Authors: Rebentrost, Frank Patrick (Author), Lloyd, Seth (Author)
Other Authors: Massachusetts Institute of Technology. Research Laboratory of Electronics (Contributor), Massachusetts Institute of Technology. Department of Mechanical Engineering (Contributor)
Format: Article
Language: English
Published: IOP Publishing, 2020-05-27.
Description
Summary: Optimization problems in disciplines such as machine learning are commonly solved with iterative methods. Gradient descent algorithms find local minima by moving along the direction of steepest descent, while Newton's method takes curvature information into account and thereby often improves convergence. Here, we develop quantum versions of these iterative optimization algorithms and apply them to polynomial optimization with a unit norm constraint. In each step, multiple copies of the current candidate are used to improve the candidate via quantum phase estimation, an adapted quantum state exponentiation scheme, and quantum matrix multiplication and inversion. The required operations scale polylogarithmically in the dimension of the solution vector and exponentially in the number of iterations. Therefore, the quantum algorithm can be useful for high-dimensional problems where a small number of iterations is sufficient.
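
A minimal classical sketch of the iterative scheme summarized above may help fix ideas: projected gradient-descent and Newton updates for a polynomial objective, with the candidate renormalized after every step to maintain the unit norm constraint. The quartic objective f(x) = (x^T A x)^2, the step size, and the iteration count below are illustrative assumptions, not taken from the paper; the paper's contribution is the quantum implementation of these updates, which is not reproduced here.

    import numpy as np

    def project_to_sphere(x):
        # Enforce the unit norm constraint ||x|| = 1 after each update.
        return x / np.linalg.norm(x)

    def gradient_step(x, grad_f, eta=0.1):
        # Projected gradient descent: move along steepest descent, then renormalize.
        return project_to_sphere(x - eta * grad_f(x))

    def newton_step(x, grad_f, hess_f):
        # Projected Newton step: use curvature information via the Hessian.
        return project_to_sphere(x - np.linalg.solve(hess_f(x), grad_f(x)))

    # Illustrative quartic polynomial objective f(x) = (x^T A x)^2 (an assumption).
    rng = np.random.default_rng(0)
    n = 8
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2  # symmetrize

    f = lambda x: (x @ A @ x) ** 2
    grad_f = lambda x: 4.0 * (x @ A @ x) * (A @ x)
    hess_f = lambda x: 8.0 * np.outer(A @ x, A @ x) + 4.0 * (x @ A @ x) * A

    x = project_to_sphere(rng.standard_normal(n))
    for _ in range(5):  # a small, fixed number of iterations, as the summary assumes
        x = gradient_step(x, grad_f)
    print("objective after 5 projected gradient steps:", f(x))

In the quantum algorithm, the matrix-vector products and inversions appearing in these updates act on quantum states, using phase estimation and an adapted state exponentiation scheme, which is what yields the polylogarithmic dependence on the dimension n at the cost of an exponential dependence on the number of iterations.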