Multi-Activation Hidden Units for Neural Networks with Random Weights

09/06/2020
by Ajay M. Patrikar, et al.

Single-layer feedforward networks with random weights are successful in a variety of classification and regression problems. These networks are known for their fast, non-iterative training algorithms. A major drawback is that they require a large number of hidden units. In this paper, we propose the use of multi-activation hidden units. Such units increase the number of tunable parameters and enable the formation of complex decision surfaces without increasing the number of hidden units. We experimentally show that multi-activation hidden units can be used either to improve classification accuracy or to reduce computation.
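
The abstract does not specify the construction of a multi-activation unit, so the NumPy sketch below shows one plausible reading: each hidden unit keeps a single fixed random pre-activation but feeds it through several activation functions, and the non-iterative least-squares readout weights each activation separately. The particular activations (sigmoid, tanh, ReLU), the ridge regularizer, and all function names here are illustrative assumptions, not the authors' method.

# Hedged sketch of a single-layer random-weight network with
# "multi-activation" hidden units. Assumption (not from the paper):
# each hidden unit applies several fixed activations to the same
# random pre-activation, and the linear readout weights every
# activation separately, so tunable parameters grow without adding
# hidden units.
import numpy as np

rng = np.random.default_rng(0)

def multi_activation_features(X, W, b):
    """Map inputs to hidden features using several activations per unit."""
    Z = X @ W + b                      # random pre-activations, shape (n, h)
    acts = [
        1.0 / (1.0 + np.exp(-Z)),      # sigmoid
        np.tanh(Z),                    # tanh
        np.maximum(Z, 0.0),            # ReLU
    ]
    return np.hstack(acts)             # shape (n, 3*h): 3 features per unit

def fit_readout(H, Y, reg=1e-3):
    """Ridge-regularized least-squares output weights (non-iterative)."""
    return np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ Y)

# Toy usage: random data, h hidden units, one-hot class targets.
n, d, h, c = 200, 10, 25, 3
X = rng.standard_normal((n, d))
Y = np.eye(c)[rng.integers(0, c, size=n)]
W = rng.standard_normal((d, h))        # fixed random input weights
b = rng.standard_normal((1, h))        # fixed random biases

H = multi_activation_features(X, W, b)
beta = fit_readout(H, Y)               # only the readout is trained
pred = np.argmax(H @ beta, axis=1)

Under this reading, with k activations per unit the readout has k times as many tunable weights, while the number of random hidden units, and hence the cost of computing the pre-activations, stays the same; this matches the accuracy-versus-computation trade-off described in the abstract.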

Related research

08/24/2020  Efficient Design of Neural Networks with Random Weights
Single layer feedforward networks with random weights are known for thei...

03/24/2015  Universal Approximation of Markov Kernels by Shallow Stochastic Feedforward Networks
We establish upper bounds for the minimal number of hidden units for whi...

07/05/2019  Prior Activation Distribution (PAD): A Versatile Representation to Utilize DNN Hidden Units
In this paper, we introduce the concept of Prior Activation Distribution...

08/22/2017  Learning Combinations of Sigmoids Through Gradient Estimation
We develop a new approach to learn the parameters of regression models w...

09/23/2010  A Constructive Algorithm for Feedforward Neural Networks for Medical Diagnostic Reasoning
This research is to search for alternatives to the resolution of complex...

01/16/2018  Empirical Explorations in Training Networks with Discrete Activations
We present extensive experiments training and testing hidden units in de...

02/21/2008  Testing the number of parameters with multidimensional MLP
This work concerns testing the number of parameters in one hidden layer ...
