Evolving Neural Selection with Adaptive Regularization

04/04/2022
by Li Ding, et al.

Over-parameterization is an inherent characteristic of modern deep neural networks, and its drawbacks are typically mitigated with regularization methods such as Dropout. These methods are usually applied globally, treating all input cases equally. However, given the natural variation of the input space in real-world tasks such as image recognition and natural language understanding, a fixed regularization pattern is unlikely to be equally effective for every input case. In this work, we demonstrate a method in which the selection of neurons in a deep neural network evolves to adapt to the difficulty of each prediction. We propose the Adaptive Neural Selection (ANS) framework, which evolves to weigh the neurons in a layer, forming network variants suited to different input cases. Experimental results show that the proposed method significantly improves the performance of commonly used neural network architectures on standard image recognition benchmarks. Ablation studies further validate the effectiveness and contribution of each component of the framework.
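
The abstract describes weighing the neurons of a layer per input so that different inputs effectively see different network variants. The sketch below illustrates that general idea only: it is a minimal, hypothetical input-adaptive gating module in PyTorch, where the class name, the sigmoid gate, and end-to-end gradient training are assumptions made for illustration, not the evolutionary ANS procedure from the paper.

```python
import torch
import torch.nn as nn


class InputAdaptiveNeuronWeighting(nn.Module):
    """Hypothetical sketch: scale a layer's neurons with input-dependent weights.

    A small gating network maps each input to per-neuron weights in [0, 1],
    which scale the layer's activations. This only illustrates the general
    idea of input-dependent neuron selection; the ANS framework in the paper
    evolves the selection rather than learning a gate by gradient descent.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.layer = nn.Linear(in_features, out_features)
        self.gate = nn.Sequential(
            nn.Linear(in_features, out_features),
            nn.Sigmoid(),  # per-neuron weights in [0, 1], one set per input
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.gate(x)                   # (batch, out_features), input-dependent
        activations = torch.relu(self.layer(x))  # standard layer activations
        return weights * activations             # each input keeps its own subset of neurons


if __name__ == "__main__":
    block = InputAdaptiveNeuronWeighting(in_features=32, out_features=64)
    x = torch.randn(8, 32)
    print(block(x).shape)  # torch.Size([8, 64])
```

The gate here is only a stand-in for whatever mechanism produces the per-neuron weights; in the paper's setting those weights are evolved per input case rather than learned end to end.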
