The Complexity of Gradient Descent: CLS = PPAD ∩ PLS

11/03/2020
by John Fearnley, et al.

We study search problems that can be solved by performing Gradient Descent on a bounded convex polytopal domain and show that this class is equal to the intersection of two well-known classes: PPAD and PLS. As our main underlying technical contribution, we show that computing a Karush-Kuhn-Tucker (KKT) point of a continuously differentiable function over the domain [0,1]^2 is PPAD ∩ PLS-complete. This is the first natural problem to be shown complete for this class. Our results also imply that the class CLS (Continuous Local Search) - which was defined by Daskalakis and Papadimitriou as a more "natural" counterpart to PPAD ∩ PLS and contains many interesting problems - is itself equal to PPAD ∩ PLS.
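To make the central object concrete: on the box [0,1]^2, a KKT point is a point where the gradient either vanishes or points outward across the active boundary constraints, which is exactly a fixed point of a projected gradient step. The sketch below illustrates this with projected gradient descent on a simple quadratic; the test function, step size, and stopping rule are illustrative choices, not the paper's construction.

```python
import numpy as np

def projected_gradient_descent(grad, x0, eta=0.1, tol=1e-8, max_iter=100000):
    """Projected gradient descent on the box [0,1]^2.

    A point x is an (approximate) KKT point of f over [0,1]^2 when the
    projected step no longer moves it: the gradient is (near) zero, or it
    points outward across an active boundary constraint.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = np.clip(x - eta * grad(x), 0.0, 1.0)  # descend, then project
        if np.linalg.norm(x_new - x) <= tol:          # approximate fixed point
            return x_new
        x = x_new
    return x

# Example: f(x, y) = (x - 0.3)^2 + (y - 1.5)^2. The unconstrained minimum
# (0.3, 1.5) lies outside the box, so the KKT point sits on the edge y = 1,
# where the gradient points outward across the active constraint.
grad_f = lambda x: 2 * (x - np.array([0.3, 1.5]))
x_star = projected_gradient_descent(grad_f, [0.5, 0.5])
```

The hardness result says that for a general continuously differentiable f, locating such a point is as hard as the intersection PPAD ∩ PLS, even though each individual iteration above is trivial to compute.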


