DeepLABNet: End-to-end Learning of Deep Radial Basis Networks with Fully Learnable Basis Functions

11/21/2019
by Andrew Hryniowski, et al.

From fully connected neural networks to convolutional neural networks, the learned parameters within a neural network have been primarily confined to the linear parameters (e.g., convolutional filters). The non-linear functions (e.g., activation functions) have, with few exceptions in recent years, remained parameter-less and static throughout training, and have seen limited variation in design. Largely ignored by the deep learning community, radial basis function (RBF) networks provide an interesting mechanism for learning more complex non-linear activation functions in addition to the linear parameters in a network. However, interest in RBF networks has waned over time due to the difficulty of integrating RBFs into more complex deep neural network architectures in a tractable and stable manner. In this work, we present a novel approach that enables end-to-end learning of deep RBF networks with fully learnable activation basis functions in an automatic and tractable manner. We demonstrate that our approach, which we refer to as DeepLABNet, is an effective tool for automated activation function learning within complex network architectures.
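To make the idea concrete, the following is a minimal sketch (not the authors' implementation; all names and hyperparameters here are illustrative assumptions) of an RBF-style activation whose basis parameters are themselves learned by gradient descent. The activation is modeled as a weighted sum of Gaussian bases, phi(x) = sum_i w_i * exp(-beta_i * (x - c_i)^2), and the centers c_i, inverse widths beta_i, and mixing weights w_i are all updated alongside whatever linear parameters a network would have:

```python
import numpy as np

class LearnableRBFActivation:
    """Scalar activation phi(x) = sum_i w_i * exp(-beta_i * (x - c_i)^2)
    with centers, widths, and weights all trainable (illustrative sketch)."""

    def __init__(self, num_bases=8, span=3.0, seed=0):
        rng = np.random.default_rng(seed)
        self.c = np.linspace(-span, span, num_bases)      # basis centers
        self.beta = np.ones(num_bases)                    # inverse widths
        self.w = rng.normal(scale=0.5, size=num_bases)    # mixing weights

    def forward(self, x):
        # x: (batch,) -> (batch,); cache intermediates for backward()
        self.d = x[:, None] - self.c[None, :]             # (batch, bases)
        self.phi = np.exp(-self.beta * self.d**2)
        return self.phi @ self.w

    def backward(self, grad_out, lr=0.1):
        # grad_out: dLoss/dOutput, shape (batch,). Manual gradients of the
        # output w.r.t. each basis parameter, summed over the batch.
        g = grad_out[:, None]
        self.w    -= lr * (g * self.phi).sum(0)
        self.c    -= lr * (g * self.w * self.phi * 2 * self.beta * self.d).sum(0)
        self.beta -= lr * (g * self.w * self.phi * (-self.d**2)).sum(0)
        self.beta = np.maximum(self.beta, 1e-2)           # keep widths valid

# Toy usage: train the activation alone to approximate |x| on [-3, 3].
act = LearnableRBFActivation()
x = np.linspace(-3.0, 3.0, 256)
target = np.abs(x)
for _ in range(2000):
    y = act.forward(x)
    act.backward(2.0 * (y - target) / len(x))             # grad of MSE loss
```

In a deep RBF network, a layer of such activations would sit after each linear transform, and the paper's contribution is making this joint optimization tractable and stable end to end; the sketch above only shows the per-activation parameterization and its gradients.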


