First-Order Methods Take Exponential Time to Converge to Global Minimizers of Non-Convex Functions

02/28/2020
by Krishna Reddy Kesari, et al.

Machine learning algorithms typically perform optimization over a class of non-convex functions. In this work, we provide bounds on the fundamental hardness of identifying the global minimizer of a non-convex function. Specifically, we design a family of parametrized non-convex functions and employ statistical lower bounds for parameter estimation. We show that the parameter estimation problem is equivalent to the problem of function identification within the given family, and that non-convex optimization is at least as hard as function identification. Together, these results prove that any first-order method can take exponential time to converge to a global minimizer.
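The intuition behind such hardness results can be illustrated with a toy "needle in a haystack" family (this is an assumed illustration, not the paper's actual construction): a smooth convex bowl with a narrow dip at an unknown parameter theta. Away from the dip, every function in the family returns identical gradients, so first-order queries reveal nothing about where the global minimizer lies.

```python
import numpy as np

# Hypothetical toy family f_theta (illustration only, NOT the paper's
# construction): a convex quadratic bowl plus a narrow dip of radius r
# hidden at an unknown location theta.
def f(x, theta, r=0.05, depth=10.0):
    d = np.linalg.norm(x - theta)
    dip = -depth * max(0.0, 1.0 - d / r)  # nonzero only within radius r of theta
    return float(np.dot(x, x)) + dip

def grad_f(x, theta, r=0.05, depth=10.0):
    g = 2.0 * x                           # gradient of the convex bowl
    d = np.linalg.norm(x - theta)
    if 0.0 < d < r:
        g = g + (depth / r) * (x - theta) / d  # dip term, visible only near theta
    return g

rng = np.random.default_rng(0)
theta = rng.uniform(-1.0, 1.0, size=5)    # hidden global minimizer location
x = np.ones(5)                            # start far from theta
for _ in range(1000):
    x = x - 0.1 * grad_f(x, theta)

# Gradient descent settles at the bowl's minimum (the origin) because every
# query it made lay outside the dip and was identical for all values of theta.
print(np.linalg.norm(x))                  # essentially 0: the hidden minimizer was missed
print(f(x, theta) > f(theta, theta))      # the true global minimum at theta is strictly lower
```

Distinguishing which member of such a family is being optimized requires querying inside the dip, and in high dimension an exponentially small fraction of the space is informative — the reduction from function identification to optimization formalizes this intuition.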


