Variational Neural Networks: Every Layer and Neuron Can Be Unique

10/14/2018
by Yiwei Li, et al.

The choice of activation function can significantly influence the performance of a neural network, yet there are few guiding principles for selecting one. We address this issue by introducing variational neural networks, in which the activation function is represented as a linear combination of candidate functions, and an optimal activation is obtained by minimizing a loss function with the gradient descent method. The gradient formulae for the loss with respect to these expansion coefficients are central to implementing the gradient descent algorithm, and we derive them here.
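As a rough illustration of the idea (not the authors' code), the sketch below builds an activation as a trainable linear combination sum_k c_k f_k(x). It assumes PyTorch; the class name and the particular candidate set are hypothetical, and autograd supplies the coefficient gradients that the paper derives analytically.

```python
import torch
import torch.nn as nn

class VariationalActivation(nn.Module):
    """Activation expressed as a learnable linear combination of
    candidate functions. Illustrative sketch only; names and the
    candidate set are assumptions, not taken from the paper."""

    def __init__(self, candidates=(torch.relu, torch.tanh, torch.sigmoid)):
        super().__init__()
        self.candidates = candidates
        # One expansion coefficient per candidate function; the paper's
        # derivation concerns the loss gradients w.r.t. these coefficients.
        self.coeffs = nn.Parameter(torch.ones(len(candidates)) / len(candidates))

    def forward(self, x):
        # Linear combination sum_k c_k * f_k(x)
        return sum(c * f(x) for c, f in zip(self.coeffs, self.candidates))

# Minimal usage: each such layer (or neuron) can carry its own coefficients,
# so every layer's learned activation can differ.
net = nn.Sequential(nn.Linear(4, 8), VariationalActivation(), nn.Linear(8, 1))
loss = net(torch.randn(16, 4)).pow(2).mean()
loss.backward()  # populates gradients for the expansion coefficients as well
```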


Related research

Learning Activation Functions to Improve Deep Neural Networks (12/21/2014)
Artificial neural networks typically have a fixed, non-linear activation...

Quadratic number of nodes is sufficient to learn a dataset via gradient descent (11/13/2019)
We prove that if an activation function satisfies some mild conditions a...

Hebbian-Descent (05/25/2019)
In this work we propose Hebbian-descent as a biologically plausible lear...

A Greedy Algorithm for Building Compact Binary Activated Neural Networks (09/07/2022)
We study binary activated neural networks in the context of regression t...

Global Convergence Rate of Deep Equilibrium Models with General Activations (02/11/2023)
In a recent paper, Ling et al. investigated the over-parametrized Deep E...

A parameterized activation function for learning fuzzy logic operations in deep neural networks (08/28/2017)
We present a deep learning architecture for learning fuzzy logic express...

Polynomial Convergence of Gradient Descent for Training One-Hidden-Layer Neural Networks (05/07/2018)
We analyze Gradient Descent applied to learning a bounded target functio...
