Kafnets: kernel-based non-parametric activation functions for neural networks

07/13/2017
by Simone Scardapane, et al.

Neural networks are generally built by interleaving (adaptable) linear layers with (fixed) nonlinear activation functions. To increase their flexibility, several authors have proposed methods for adapting the activation functions themselves, endowing them with varying degrees of flexibility. None of these approaches, however, has gained wide acceptance in practice, and research on this topic remains open. In this paper, we introduce a novel family of flexible activation functions based on an inexpensive kernel expansion at every neuron. Leveraging several properties of kernel-based models, we propose multiple variations for designing and initializing these kernel activation functions (KAFs), including a multidimensional scheme that allows information from different paths in the network to be combined nonlinearly. The resulting KAFs can approximate any mapping defined over a subset of the real line, whether convex or nonconvex. Furthermore, they are smooth over their entire domain, linear in their parameters, and they can be regularized using any known scheme, including ℓ_1 penalties to enforce sparseness. To the best of our knowledge, no other known model satisfies all these properties simultaneously. In addition, we provide a relatively complete overview of alternative techniques for adapting the activation functions, which is currently lacking in the literature. A large set of experiments validates our proposal.
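To make the idea concrete, the sketch below illustrates a kernel expansion at a single neuron, f(s) = Σ_i α_i κ(s, d_i), with a fixed dictionary of points d_i and learnable mixing coefficients α_i. This is not the authors' reference implementation; the Gaussian kernel, the dictionary size D, the grid boundary, and the bandwidth heuristic are all assumptions made for this example.

# Minimal sketch of a kernel activation function (KAF), assuming a Gaussian
# kernel over a fixed uniform grid. Not the authors' reference code.
import torch
import torch.nn as nn

class KAF(nn.Module):
    def __init__(self, num_features, D=20, boundary=3.0):
        super().__init__()
        # Fixed dictionary: D points uniformly spaced over [-boundary, boundary].
        self.register_buffer("d", torch.linspace(-boundary, boundary, D).view(1, 1, D))
        # Kernel bandwidth set from the grid spacing (an assumed rule of thumb).
        step = 2.0 * boundary / (D - 1)
        self.gamma = 1.0 / (2.0 * step ** 2)
        # Learnable mixing coefficients, one set per feature (neuron). Since f
        # is linear in alpha, standard penalties (e.g. an l1 term for
        # sparseness) can be applied directly to this tensor.
        self.alpha = nn.Parameter(0.1 * torch.randn(1, num_features, D))

    def forward(self, s):
        # s has shape (batch, num_features); broadcast against the dictionary
        # to evaluate the Gaussian kernel at every dictionary point.
        K = torch.exp(-self.gamma * (s.unsqueeze(-1) - self.d) ** 2)
        # Linear combination of kernel values -> (batch, num_features).
        return (K * self.alpha).sum(dim=-1)

# Usage: drop a KAF in place of a fixed nonlinearity such as ReLU.
net = nn.Sequential(nn.Linear(10, 32), KAF(32), nn.Linear(32, 1))
out = net(torch.randn(4, 10))  # shape: (4, 1)

Because the expansion is linear in the coefficients α, the function remains smooth everywhere while still being trainable by ordinary backpropagation, which is the combination of properties the abstract highlights.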

Related research

02/22/2018 · Complex-valued Neural Networks with Non-parametric Activation Functions
Complex-valued neural networks (CVNNs) are a powerful modeling tool for ...

03/28/2019 · On the Stability and Generalization of Learning with Kernel Activation Functions
In this brief we investigate the generalization properties of a recently...

02/06/2019 · Widely Linear Kernels for Complex-Valued Kernel Activation Functions
Complex-valued neural networks (CVNNs) have been shown to be powerful no...

01/29/2019 · Multikernel activation functions: formulation and a case study
The design of activation functions is a growing research area in the fie...

03/05/2016 · Network Morphism
We present in this paper a systematic study on how to morph a well-train...

05/18/2016 · Learning activation functions from data using cubic spline interpolation
Neural networks require a careful design in order to perform properly on...

10/22/2019 · Improving Siamese Networks for One Shot Learning using Kernel Based Activation functions
The lack of a large amount of training data has always been the constrai...
