Density-embedding layers: a general framework for adaptive receptive fields

06/23/2020
by Francesco Cicala, et al.

The effectiveness and performance of artificial neural networks, particularly for visual tasks, depend in crucial ways on the receptive field of neurons. The receptive field itself depends on the interplay of several architectural aspects, including sparsity, pooling, and activation functions. Recent literature contains several ad hoc proposals to make receptive fields more flexible and adaptive to data; for instance, different parameterizations of convolutional and pooling layers have been proposed to increase their adaptivity. In this paper, we propose the novel theoretical framework of density-embedding layers, generalizing the transformation represented by a neuron. Specifically, the affine transformation applied to the input is replaced by the scalar product of the input, suitably represented as a piecewise constant function, with a density function associated with the neuron. This density is shown to describe directly the receptive field of the neuron. Crucially, by representing such a density as a linear combination of a parametric family of functions, we can efficiently train the densities by means of any automatic differentiation system, making them adaptable to the problem at hand and computationally efficient to evaluate. This framework captures and generalizes recent methods, allowing a fine tuning of the receptive field. In the paper, we define several novel layers and experimentally validate them on the classic MNIST dataset.
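To make the core idea concrete, the following is a minimal sketch of a single density-embedding neuron in plain Python. It is not the authors' implementation: the function names are hypothetical, a Gaussian basis is assumed for the parametric family (the paper leaves the family general), and the integral of the input function against the density is approximated by a simple midpoint rule rather than computed in closed form.

```python
import math

def gaussian(t, mu, sigma):
    # One member phi_k(t) of the assumed parametric basis family.
    return math.exp(-0.5 * ((t - mu) / sigma) ** 2)

def density(t, weights, centers, sigma=0.1):
    # Neuron density rho(t) = sum_k w_k * phi_k(t); the weights w_k
    # are the trainable parameters of the neuron.
    return sum(w * gaussian(t, mu, sigma) for w, mu in zip(weights, centers))

def density_neuron(x, weights, centers, sigma=0.1, samples_per_bin=10):
    # Represent the input vector x as a piecewise constant function on [0, 1]:
    # x(t) = x[i] for t in [i/N, (i+1)/N).
    # The neuron output is the scalar product integral_0^1 x(t) rho(t) dt,
    # approximated here bin by bin with the midpoint rule.
    n = len(x)
    total = 0.0
    for i, xi in enumerate(x):
        a, b = i / n, (i + 1) / n
        h = (b - a) / samples_per_bin
        bin_integral = sum(
            density(a + (j + 0.5) * h, weights, centers, sigma) * h
            for j in range(samples_per_bin)
        )
        total += xi * bin_integral
    return total
```

With a single basis function centered at 0.5, the neuron's receptive field concentrates on the middle of the input: an input with mass in the central bin produces a larger response than the same mass placed at the edge. In a trained model, the weights (and, in a richer parameterization, the centers and widths) would be adjusted by automatic differentiation, which is the adaptivity the framework is built around.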

