The local convexity of solving systems of quadratic equations

06/25/2015
by Chris D. White, et al.

This paper considers the recovery of a rank-r positive semidefinite matrix X X^T ∈ R^{n×n} from m scalar measurements of the form y_i := a_i^T X X^T a_i (i.e., quadratic measurements of X). Such problems arise in a variety of applications, including covariance sketching of high-dimensional data streams, quadratic regression, and quantum state tomography. A natural approach is to minimize the loss function f(U) = ∑_i (y_i - a_i^T U U^T a_i)^2, which has an entire manifold of solutions given by {XO}_{O ∈ O_r}, where O_r is the group of r×r orthogonal matrices. This function is non-convex in the n×r matrix U, but methods like gradient descent are simple and easy to implement, in contrast to semidefinite relaxation approaches. In this paper we show that once we have m ≥ C n r log^2(n) samples from isotropic Gaussian vectors a_i, with high probability (a) this function admits a dimension-independent region of local strong convexity on lines perpendicular to the solution manifold, and (b) with an additional polynomial factor of r samples, a simple spectral initialization lands within the region of convexity with high probability. Together, these results imply that gradient descent with this initialization (and no re-sampling) converges linearly to the correct X, up to an orthogonal transformation. We believe that this general technique (local convexity reachable by spectral initialization) should prove applicable to a broader class of non-convex optimization problems.
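To make the two-stage procedure concrete, the following is a minimal NumPy sketch of spectral initialization followed by plain gradient descent on f(U). The problem sizes, the constant in front of the n r log^2(n) sample size, the step-size rule with backtracking, and the rescaling of the initializer are illustrative assumptions, not the paper's constants or the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 50, 2
    m = 6 * n * r * int(np.log(n) ** 2)   # on the order of C n r log^2(n) samples

    X = rng.standard_normal((n, r))        # ground-truth factor of the PSD matrix X X^T
    A = rng.standard_normal((m, n))        # isotropic Gaussian measurement vectors a_i
    y = np.sum((A @ X) ** 2, axis=1)       # y_i = a_i^T X X^T a_i = ||X^T a_i||^2

    # Spectral initialization: top-r eigenpairs of M = (1/m) sum_i y_i a_i a_i^T.
    # For Gaussian a_i, E[M] = 2 X X^T + ||X||_F^2 I, so the top eigenspace of M
    # aligns with the column space of X; mean(y) estimates ||X||_F^2, which lets
    # us undo the identity shift when rescaling the eigenvectors.
    M = (A.T * y) @ A / m
    eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    scale = np.sqrt(np.maximum(eigvals[-r:] - y.mean(), 0.0) / 2)
    U = eigvecs[:, -r:] * scale

    def loss(V):
        return np.mean((np.sum((A @ V) ** 2, axis=1) - y) ** 2)

    # Plain gradient descent on the sample-averaged loss; the step size is a
    # heuristic with simple backtracking, not a constant from the paper.
    eta = 1e-3
    for _ in range(300):
        residual = np.sum((A @ U) ** 2, axis=1) - y   # a_i^T U U^T a_i - y_i
        grad = (4 / m) * (A.T * residual) @ (A @ U)   # gradient of the averaged loss
        while loss(U - eta * grad) > loss(U):
            eta /= 2
        U -= eta * grad

    # Error up to an orthogonal transformation O in O_r, via orthogonal Procrustes.
    W, _, Vt = np.linalg.svd(X.T @ U)
    err = np.linalg.norm(U - X @ W @ Vt) / np.linalg.norm(X)
    print(f"relative error up to rotation: {err:.2e}")

The closing Procrustes step reflects the solution manifold {XO}_{O ∈ O_r}: recovery can only be measured up to an orthogonal transformation of the factor.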


Related research

06/04/2018 · Solving Systems of Quadratic Equations via Exponential-type Gradient Descent Algorithm
We consider the rank minimization problem from quadratic measurements, i...

02/14/2019 · Solving Complex Quadratic Systems with Full-Rank Random Matrices
We tackle the problem of recovering a complex signal x ∈ C^n from quadrat...

02/17/2018 · Nonconvex Matrix Factorization from Rank-One Measurements
We consider the problem of recovering low-rank matrices from random rank...

06/19/2015 · A Convergent Gradient Descent Algorithm for Rank Minimization and Semidefinite Programming from Random Linear Measurements
We propose a simple, scalable, and fast gradient descent algorithm to op...

11/25/2019 · Projective Quadratic Regression for Online Learning
This paper considers online convex optimization (OCO) problems - the par...

09/10/2015 · Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees
Optimization problems with rank constraints arise in many applications, ...

01/04/2023 · Quantum relaxation for quadratic programs over orthogonal matrices
Quadratic programming over the (special) orthogonal group encompasses a ...
