Activation function impact on Sparse Neural Networks

10/12/2020
by Adam Dubowski, et al.

While the concept of a Sparse Neural Network has been researched for some time, researchers have only recently made notable progress in this area. Techniques like Sparse Evolutionary Training achieve significantly lower computational complexity than fully connected models by removing redundant connections. This typically takes place through an iterative process of weight creation and removal during network training. Although there have been numerous approaches to optimizing the redistribution of the removed weights, there appears to be little or no study on the effect of activation functions on the performance of sparse networks. This research provides insights into the relationship between the activation function used and the network performance at various sparsity levels.
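The iterative prune-and-regrow cycle mentioned above can be sketched as follows. This is an illustrative sketch of the general Sparse Evolutionary Training scheme, not the paper's exact procedure; the pruning fraction `zeta`, the regrowth initialization scale, and the use of magnitude-based pruning with random regrowth are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def set_epoch_update(weights, mask, zeta=0.3):
    """One SET-style topology update (sketch): prune the fraction `zeta`
    of active weights with the smallest magnitude, then regrow the same
    number of connections at random currently-empty positions."""
    active = np.flatnonzero(mask)
    n_update = int(zeta * active.size)

    # Prune: drop the smallest-magnitude active weights.
    magnitudes = np.abs(weights.ravel()[active])
    prune_idx = active[np.argsort(magnitudes)[:n_update]]
    new_mask = mask.copy().ravel()
    new_mask[prune_idx] = False

    # Regrow: activate the same number of random empty positions,
    # so the total number of connections (sparsity level) is preserved.
    empty = np.flatnonzero(~new_mask)
    grow_idx = rng.choice(empty, size=n_update, replace=False)
    new_mask[grow_idx] = True

    new_weights = weights.ravel().copy()
    new_weights[prune_idx] = 0.0
    new_weights[grow_idx] = rng.normal(0.0, 0.1, size=n_update)  # fresh small weights
    return new_weights.reshape(weights.shape), new_mask.reshape(mask.shape)
```

Because the number of pruned and regrown connections is equal, the network's overall sparsity stays constant across updates; only the topology evolves toward weights that survive training.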

