On The Convergence of First Order Methods for Quasar-Convex Optimization

10/10/2020
by   Jikai Jin, et al.

In recent years, the success of deep learning has inspired many researchers to study the optimization of general smooth non-convex functions. However, recent works have established pessimistic worst-case complexities for this class of functions, which stand in stark contrast to their superior performance in real-world applications (e.g., training deep neural networks). On the other hand, many popular non-convex optimization problems have been found to enjoy certain structural properties that bear some similarity to convexity. In this paper, we study the class of quasar-convex functions to close the gap between theory and practice. We study the convergence of first-order methods in a variety of settings and under different optimality criteria. We prove complexity upper bounds that are similar to standard results established for convex functions and much better than the state-of-the-art convergence rates for non-convex functions. Overall, this paper suggests that quasar-convexity allows efficient optimization procedures, and we look forward to seeing more problems that demonstrate similar properties in practice.
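For context, a differentiable function f is commonly called γ-quasar-convex (γ ∈ (0, 1]) with respect to a minimizer x* if f(x*) ≥ f(x) + (1/γ)⟨∇f(x), x* − x⟩ for all x; convex functions satisfy this with γ = 1. The sketch below only illustrates the plain gradient-descent iteration that such first-order analyses cover. The toy quadratic objective, step size, and iteration count are assumptions chosen for demonstration and are not taken from the paper.

```python
import numpy as np

def gradient_descent(grad, x0, step_size, num_iters):
    """Plain gradient descent: x_{k+1} = x_k - step_size * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    trajectory = [x.copy()]
    for _ in range(num_iters):
        x = x - step_size * grad(x)
        trajectory.append(x.copy())
    return x, trajectory

# Toy smooth objective f(x) = 0.5 * ||x||^2 (convex, hence 1-quasar-convex).
# Quasar-convex analyses allow gamma in (0, 1], which relaxes this condition
# to certain structured non-convex objectives.
f = lambda x: 0.5 * np.dot(x, x)
grad_f = lambda x: x

x_final, _ = gradient_descent(grad_f, x0=np.array([3.0, -2.0]),
                              step_size=0.1, num_iters=100)
print(f"f(x_final) = {f(x_final):.2e}")  # approaches the minimum value 0
```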


