On the Computational Power of Online Gradient Descent

07/03/2018
by Vaggos Chatziafratis, et al.

We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in the special case of soft-margin support vector machines. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
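To make the object of study concrete, here is a minimal sketch (not the paper's construction) of the dynamics the abstract refers to: online gradient descent applied to the soft-margin SVM (regularized hinge) loss, tracking the evolution of the weight vector over a stream of examples. The step size `eta` and regularization strength `lam` are illustrative assumptions.

```python
# Sketch of online gradient descent on the soft-margin SVM loss
#   f_t(w) = max(0, 1 - y_t <w, x_t>) + (lam/2) ||w||^2,
# returning the trajectory of weight vectors (the "evolution" the paper studies).
# eta and lam are illustrative choices, not values from the paper.
import numpy as np

def ogd_soft_margin_svm(stream, dim, eta=0.1, lam=0.01):
    """Run OGD over (x_t, y_t) pairs with y_t in {-1, +1}."""
    w = np.zeros(dim)
    trajectory = [w.copy()]
    for x, y in stream:
        margin = y * np.dot(w, x)
        # Subgradient of the hinge term is -y * x when the margin constraint is violated.
        grad = lam * w - (y * x if margin < 1.0 else 0.0)
        w = w - eta * grad
        trajectory.append(w.copy())
    return trajectory

# Example: a tiny synthetic stream of labeled points.
rng = np.random.default_rng(0)
stream = [(rng.normal(size=3), 1 if rng.random() < 0.5 else -1) for _ in range(10)]
traj = ogd_soft_margin_svm(stream, dim=3)
print(traj[-1])
```

The paper's hardness result concerns predicting fine-grained properties of trajectories like `traj` above; the sketch only illustrates the update rule, not the polynomial-space encoding.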


