Deep Neural Network Training with Frank-Wolfe

10/14/2020
by Sebastian Pokutta, et al.

This paper studies the empirical efficacy and benefits of using projection-free first-order methods in the form of Conditional Gradients, a.k.a. Frank-Wolfe methods, for training Neural Networks with constrained parameters. We draw comparisons both to current state-of-the-art stochastic Gradient Descent methods and across different variants of stochastic Conditional Gradients. In particular, we show the general feasibility of training Neural Networks whose parameters are constrained by a convex feasible region using Frank-Wolfe algorithms, and we compare different stochastic variants. We then show that, by choosing an appropriate region, one can achieve performance exceeding that of unconstrained stochastic Gradient Descent and matching state-of-the-art results relying on L^2-regularization. Lastly, we demonstrate that, besides impacting performance, the particular choice of constraints can have a drastic impact on the learned representations.
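To illustrate the projection-free idea behind the methods the abstract describes, here is a minimal sketch of a stochastic Frank-Wolfe update, applied to a toy least-squares model with weights constrained to an L1 ball. This is not the paper's implementation; the model, the L1-ball feasible region, and the classic 2/(t+2) step size are illustrative assumptions. The key point is that each iteration calls a linear minimization oracle (LMO) over the feasible region instead of a projection, and the convex-combination update keeps the parameters feasible at every step.

```python
import numpy as np

def lmo_l1_ball(grad, tau):
    """Linear minimization oracle for the L1 ball of radius tau:
    argmin_{||v||_1 <= tau} <grad, v> is a signed, scaled coordinate vertex."""
    i = np.argmax(np.abs(grad))
    v = np.zeros_like(grad)
    v[i] = -tau * np.sign(grad[i])
    return v

def sfw_train(X, y, tau=5.0, steps=500, batch=8, seed=0):
    """Stochastic Frank-Wolfe on least squares, weights kept in the L1 ball."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])            # starts feasible (0 is in the ball)
    for t in range(steps):
        idx = rng.integers(0, len(X), size=batch)
        Xb, yb = X[idx], y[idx]
        grad = Xb.T @ (Xb @ w - yb) / batch   # stochastic gradient of 0.5*MSE
        v = lmo_l1_ball(grad, tau)            # LMO call, no projection needed
        gamma = 2.0 / (t + 2)                 # standard Frank-Wolfe step size
        w = (1 - gamma) * w + gamma * v       # convex combination stays feasible
    return w
```

Because the update is a convex combination of two feasible points, the constraint `||w||_1 <= tau` holds by construction after every step; this is what "projection-free" buys over projected SGD, where each iterate must be mapped back onto the feasible region.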

