Exploiting Invariance in Training Deep Neural Networks

03/30/2021
by Chengxi Ye, et al.

Inspired by two basic mechanisms in animal visual systems, we introduce a feature transform technique that imposes invariance properties in the training of deep neural networks. The resulting algorithm requires less parameter tuning, trains well with an initial learning rate of 1.0, and generalizes easily to different tasks. We enforce scale invariance using local statistics in the data so that similar samples generated under diverse conditions are aligned. To accelerate convergence, we enforce a GL(n) invariance property using global statistics extracted from a batch: the gradient-descent solution should remain invariant under a change of basis. Tested on the ImageNet, MS COCO, and Cityscapes datasets, our proposed technique requires fewer iterations to train, surpasses all baselines by a large margin, works seamlessly with both small and large batch sizes, and applies to different computer vision tasks: image classification, object detection, and semantic segmentation.
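The abstract does not spell out the transform itself, but the two invariances map naturally onto familiar operations. Below is a minimal PyTorch sketch, assuming the scale-invariance step resembles per-sample normalization by local statistics and the GL(n)-invariance step resembles ZCA-style batch whitening; the function names and the `eps` parameter are illustrative, not the authors' API.

```python
import torch

def scale_invariant_normalize(x, eps=1e-5):
    # Scale invariance via local statistics (assumed form): center and
    # scale each sample by its own mean/std, so inputs differing only by
    # an affine intensity change map to the same representation.
    dims = tuple(range(1, x.dim()))
    mean = x.mean(dim=dims, keepdim=True)
    std = x.std(dim=dims, keepdim=True)
    return (x - mean) / (std + eps)

def batch_whiten(x, eps=1e-5):
    # GL(n)-style invariance via global batch statistics (assumed form):
    # decorrelate features with a ZCA transform so an invertible change
    # of basis in the input leaves the whitened features unchanged.
    n, d = x.shape
    xc = x - x.mean(dim=0, keepdim=True)
    cov = xc.t() @ xc / (n - 1) + eps * torch.eye(d)
    # Inverse square root of the covariance via eigendecomposition.
    eigvals, eigvecs = torch.linalg.eigh(cov)
    inv_sqrt = eigvecs @ torch.diag(eigvals.clamp_min(eps).rsqrt()) @ eigvecs.t()
    return xc @ inv_sqrt

if __name__ == "__main__":
    x = torch.randn(256, 16)  # a batch of 256 feature vectors
    w = batch_whiten(scale_invariant_normalize(x))
    cov = w.t() @ w / (w.shape[0] - 1)
    print(torch.allclose(cov, torch.eye(16), atol=1e-3))  # near-identity covariance
```

In this reading, whitening makes the effective input covariance the identity, which is one standard way to make gradient descent insensitive to the conditioning of the input basis; whether the paper applies it per layer or only at the input is not stated in the abstract.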

