Activation Functions: Do They Represent A Trade-Off Between Modular Nature of Neural Networks And Task Performance

09/16/2020
by Himanshu Pradeep Aswani, et al.

Current research suggests that the key factors in designing a neural network architecture include choosing the number of filters for each convolutional layer, the number of hidden neurons for each fully connected layer, and the dropout and pruning settings. The default activation function in most cases is the ReLU, as it has empirically shown faster training convergence. We explore whether ReLU remains the best choice when one aims for a better modularity structure within a neural network.
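As a minimal sketch of the kind of experiment the abstract describes, the snippet below builds a toy PyTorch classifier in which the activation function is a swappable component alongside the other design choices named above (hidden-neuron count, dropout). The model, layer sizes, and set of candidate activations are illustrative assumptions, not the paper's actual architecture or protocol.

```python
import torch
import torch.nn as nn

# Hypothetical toy classifier; layer widths and dropout rate are
# illustrative design choices, not values from the paper.
class SmallMLP(nn.Module):
    def __init__(self, activation: nn.Module):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(784, 128),  # number of hidden neurons per layer
            activation,           # the activation under study
            nn.Dropout(p=0.5),    # dropout is another tunable factor
            nn.Linear(128, 10),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Swap the activation to compare its effect on training and on any
# modularity measure one cares to compute over the learned weights.
for act in (nn.ReLU(), nn.Tanh(), nn.Sigmoid()):
    model = SmallMLP(act)
    out = model(torch.randn(4, 784))
    print(type(act).__name__, out.shape)
```

Passing the activation in as a module makes it a first-class hyperparameter, so the same training loop can be rerun per activation and the resulting networks compared on both task performance and structural modularity.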
