Nonlinear Acceleration of CNNs

06/01/2018
by Damien Scieur, et al.

The Regularized Nonlinear Acceleration (RNA) algorithm is an acceleration method capable of improving the rate of convergence of many optimization schemes such as gradient descent, SAGA, or SVRG. Until now, its analysis has been limited to convex problems, but empirical observations show that RNA may extend to wider settings. In this paper, we further investigate the benefits of RNA when applied to neural networks, in particular for the task of image recognition on CIFAR10 and ImageNet. With very few modifications to existing frameworks, RNA slightly improves the optimization of CNNs, applied after training.
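To make the method concrete, the following is a minimal sketch of one RNA extrapolation step, written with NumPy. The function name `rna_extrapolate` and the regularization parameter `lam` are illustrative choices, not names from the paper: given a window of past iterates, it finds mixing coefficients that sum to one and approximately minimize the norm of the combined residual, then returns the corresponding weighted combination of iterates.

```python
import numpy as np

def rna_extrapolate(iterates, lam=1e-8):
    """One RNA extrapolation step over a window of past iterates.

    iterates: sequence of parameter vectors x_0, ..., x_k produced by
    any fixed-step optimizer (e.g. gradient descent on the weights).
    Returns a weighted combination of the iterates whose coefficients
    sum to one and approximately minimize the combined residual norm.
    """
    X = np.stack(iterates)            # shape (k+1, d)
    R = X[1:] - X[:-1]                # residuals r_i = x_{i+1} - x_i
    RR = R @ R.T                      # Gram matrix of the residuals
    RR = RR / np.linalg.norm(RR)      # normalize so lam is scale-free
    m = RR.shape[0]
    # Regularized system: (RR + lam*I) z = 1, then rescale so sum(c) = 1.
    z = np.linalg.solve(RR + lam * np.eye(m), np.ones(m))
    c = z / z.sum()
    return c @ X[1:]                  # extrapolated point
```

On a simple quadratic driven by gradient descent, the extrapolated point typically lands far closer to the minimizer than the last iterate, which is the post-training behavior the paper exploits for CNNs.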



research · 05/24/2018 · Nonlinear Acceleration of Deep Neural Networks
Regularized nonlinear acceleration (RNA) is a generic extrapolation sche...

research · 02/19/2018 · On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization
Conventional wisdom in deep learning states that increasing depth improv...

research · 07/11/2020 · Shanks and Anderson-type acceleration techniques for systems of nonlinear equations
This paper examines a number of extrapolation and acceleration methods, ...

research · 11/05/2020 · Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart
Based on an observation that additive Schwarz methods for general convex...

research · 05/28/2019 · Direct Nonlinear Acceleration
Optimization acceleration techniques such as momentum play a key role in...

research · 05/17/2018 · Interpolatron: Interpolation or Extrapolation Schemes to Accelerate Optimization for Deep Neural Networks
In this paper we explore acceleration techniques for large scale nonconv...

research · 06/02/2023 · MutateNN: Mutation Testing of Image Recognition Models Deployed on Hardware Accelerators
With the research advancement of Artificial Intelligence in the last yea...
