Learning Neuron Non-Linearities with Kernel-Based Deep Neural Networks

07/17/2018
by Giuseppe Marra, et al.

The effectiveness of deep neural architectures has been widely supported in terms of both experimental and foundational principles. There is also clear evidence that the activation function (e.g., the rectifier and the LSTM units) plays a crucial role in the complexity of learning. Based on this remark, this paper discusses an optimal selection of the neuron non-linearity in a functional framework inspired by classic regularization arguments. It is shown that the best activation function is represented by a kernel expansion over the training set, which can be effectively approximated over a suitable set of points modeling 1-D clusters. The idea extends naturally to recurrent networks, where the expressiveness of kernel-based activation functions turns out to be a crucial ingredient for capturing long-term dependencies. We give experimental evidence of this property through a set of challenging experiments, comparing the results with neural architectures based on state-of-the-art LSTM cells.
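To make the kernel-expansion idea concrete, here is a minimal PyTorch sketch of a learnable activation built as a Gaussian kernel mixture over a fixed, evenly spaced dictionary of 1-D points standing in for the cluster centers mentioned in the abstract. The class name, the dictionary construction, and the hyperparameters (dict_size, span, gamma) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class KernelActivation(nn.Module):
    """Learnable activation f(s) = sum_i alpha_i * exp(-gamma * (s - d_i)^2):
    a Gaussian kernel expansion over a fixed 1-D dictionary of points
    (a hypothetical stand-in for the points modeling 1-D clusters)."""

    def __init__(self, num_features, dict_size=20, span=3.0, gamma=1.0):
        super().__init__()
        # Fixed, evenly spaced dictionary of 1-D points (assumed layout).
        d = torch.linspace(-span, span, dict_size)
        self.register_buffer("dictionary", d)
        self.gamma = gamma
        # One set of mixing coefficients per feature: the learnable part.
        self.alpha = nn.Parameter(0.1 * torch.randn(num_features, dict_size))

    def forward(self, s):
        # s: (batch, num_features) pre-activations.
        # Gaussian kernel between each pre-activation and each dictionary
        # point, then mixed with the learned coefficients.
        diff = s.unsqueeze(-1) - self.dictionary          # (batch, feat, dict)
        k = torch.exp(-self.gamma * diff.pow(2))
        return (k * self.alpha).sum(dim=-1)               # (batch, feat)


# Minimal usage: replace a fixed nonlinearity in a small MLP layer.
layer = nn.Linear(8, 16)
act = KernelActivation(num_features=16)
x = torch.randn(4, 8)
y = act(layer(x))
print(y.shape)  # torch.Size([4, 16])
```

Because each neuron's non-linearity is a linear combination of kernel values, its shape is learned jointly with the network weights by ordinary backpropagation, which is what allows the flexibility the abstract attributes to kernel-based activations in recurrent models.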


Related research

12/21/2014  Learning Activation Functions to Improve Deep Neural Networks
Artificial neural networks typically have a fixed, non-linear activation...

01/13/2021  Reproducing Activation Function for Deep Learning
In this paper, we propose the reproducing activation function to improve...

09/06/2019  Differential Equation Units: Learning Functional Forms of Activation Functions from Data
Most deep neural networks use simple, fixed activation functions, such a...

07/26/2020  Regularized Flexible Activation Function Combinations for Deep Neural Networks
Activation in deep neural networks is fundamental to achieving non-linea...

02/06/2019  Widely Linear Kernels for Complex-Valued Kernel Activation Functions
Complex-valued neural networks (CVNNs) have been shown to be powerful no...

07/11/2018  Recurrent Neural Networks with Flexible Gates using Kernel Activation Functions
Gated recurrent neural networks have achieved remarkable results in the ...

12/07/2017  Solving internal covariate shift in deep learning with linked neurons
This work proposes a novel solution to the problem of internal covariate...
