Post-synaptic potential regularization has potential

07/19/2019
by Enzo Tartaglione, et al.

Improving generalization is one of the main challenges in training deep neural networks on classification tasks. A number of techniques have been proposed to boost performance on unseen data: standard data augmentation, ℓ_2 regularization, dropout, batch normalization, entropy-driven SGD, and many more. In this work we propose an elegant, simple and principled approach: post-synaptic potential regularization (PSP). We tested this regularization in a number of state-of-the-art scenarios. Empirical results show that PSP achieves a classification error comparable to more sophisticated learning strategies on MNIST, while improving generalization over ℓ_2 regularization for deep architectures trained on CIFAR-10.
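As a rough illustration (not the authors' exact formulation), PSP can be thought of as an ℓ_2-style penalty applied to a layer's post-synaptic potentials (the pre-activation values y = Wx + b) rather than to the weights themselves. A minimal NumPy sketch, where the function names and the quadratic form of the penalty are illustrative assumptions:

```python
import numpy as np

def psp_penalty(W, b, x):
    """Illustrative PSP-style penalty: (1/2) * ||y||^2 on the
    post-synaptic potential y = W x + b (the pre-activation).
    This is an assumed quadratic form, not the paper's exact loss."""
    y = W @ x + b
    return 0.5 * np.sum(y ** 2)

def psp_gradients(W, b, x):
    """Gradients of the penalty w.r.t. the layer parameters:
    dR/dW = y x^T and dR/db = y, added to the task-loss gradients
    during training just like a weight-decay term would be."""
    y = W @ x + b
    return np.outer(y, x), y

# Tiny usage example on a 2x2 layer: the penalty shrinks the
# pre-activations, only indirectly constraining the weights.
W = np.array([[1.0, -2.0], [0.5, 0.5]])
b = np.array([0.1, -0.1])
x = np.array([1.0, 1.0])
R = psp_penalty(W, b, x)        # scalar regularization term
dW, db = psp_gradients(W, b, x)  # added to the gradients of the task loss
```

Unlike ℓ_2 weight decay, which pulls every weight toward zero independently, a penalty of this shape only discourages large pre-activations, so weights can remain large as long as their combined effect on y stays small.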

