When Are Nonconvex Problems Not Scary?

10/21/2015
by Ju Sun, et al.

In this note, we focus on smooth nonconvex optimization problems that satisfy two properties: (1) all local minimizers are also global minimizers; and (2) around any saddle point or local maximizer, the objective has a direction of strictly negative curvature. Concrete applications such as dictionary learning, generalized phase retrieval, and orthogonal tensor decomposition are known to induce such structures. We describe a second-order trust-region algorithm that provably converges to a global minimizer efficiently, without special initialization. Finally, we highlight alternatives and open problems in this direction.


