Homotopy Analysis for Tensor PCA

10/28/2016
by Anima Anandkumar et al.

Developing efficient nonconvex algorithms with provable guarantees is an important challenge in modern machine learning. Algorithms with good empirical performance, such as stochastic gradient descent, often lack theoretical guarantees. In this paper, we analyze the class of homotopy (continuation) methods for global optimization of nonconvex functions. These methods start from an objective function that is efficient to optimize (e.g., convex) and progressively deform it into the required objective, tracking solutions along the homotopy path. For the challenging problem of tensor PCA, we prove global convergence of the homotopy method in the "high noise" regime. The signal-to-noise requirement of our algorithm is tight in the sense that it matches the recovery guarantee of the best degree-4 sum-of-squares algorithm. In addition, we prove a phase transition along the homotopy path for tensor PCA. This allows us to simplify the homotopy method to a local search algorithm, namely tensor power iteration with a specific initialization and a noise injection procedure, while retaining the theoretical guarantees.
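The local search algorithm the abstract refers to is tensor power iteration on the spiked tensor model. The following is a minimal sketch, assuming the standard rank-one model T = lam * v⊗v⊗v + G with i.i.d. Gaussian noise G; it uses a plain random initialization and omits the paper's specific initialization and noise-injection procedure, so the function name, parameters, and signal strength below are illustrative only.

```python
import numpy as np

def tensor_power_iteration(T, u0, num_iters=100):
    """Rank-one power iteration for a third-order tensor:
    u <- T(I, u, u) / ||T(I, u, u)||, repeated num_iters times."""
    u = u0 / np.linalg.norm(u0)
    for _ in range(num_iters):
        u_new = np.einsum("ijk,j,k->i", T, u, u)  # contract T against u in two modes
        u = u_new / np.linalg.norm(u_new)
    return u

# Synthetic spiked-tensor instance: T = lam * v (x) v (x) v + G,
# with G an i.i.d. standard Gaussian noise tensor.
rng = np.random.default_rng(0)
n = 50
lam = 200.0  # chosen well above the random-init threshold so this toy run succeeds;
             # the paper's initialization/noise-injection scheme is what reaches the
             # much weaker n^(3/4)-scale signal regime matching degree-4 SoS.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
G = rng.standard_normal((n, n, n))
T = lam * np.einsum("i,j,k->ijk", v, v, v) + G

u0 = rng.standard_normal(n)  # random start (the paper prescribes a specific one)
u_hat = tensor_power_iteration(T, u0)
print("correlation with planted signal:", abs(u_hat @ v))
```

Note that for an odd-order tensor the update coefficient lam * (u·v)^2 on v is nonnegative regardless of the sign of the current correlation, so when the iteration converges it converges to +v rather than -v.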


Related research:

- Local and Global Convergence of General Burer-Monteiro Tensor Optimizations (01/07/2022)
- Selective Multiple Power Iteration: from Tensor PCA to gradient-based exploration of landscapes (12/23/2021)
- Optimization Landscape of Tucker Decomposition (06/29/2020)
- AdaGrad stepsizes: Sharp convergence over nonconvex landscapes, from any initialization (06/05/2018)
- How to iron out rough landscapes and get optimal performances: Replicated Gradient Descent and its application to tensor PCA (05/29/2019)
- Higher degree sum-of-squares relaxations robust against oblivious outliers (11/14/2022)
- Lower Bounds for the Convergence of Tensor Power Iteration on Random Overcomplete Models (11/07/2022)
