Deep Convolutional Networks are Hierarchical Kernel Machines

08/05/2015
by Fabio Anselmi, et al.

In i-theory, a typical layer of a hierarchical architecture consists of HW modules pooling the dot products of the inputs to the layer with the transformations of a few templates under a group. Such layers include as special cases the convolutional layers of Deep Convolutional Networks (DCNs) as well as the non-convolutional layers (when the group contains only the identity). Rectifying nonlinearities -- which are used by present-day DCNs -- are one of several nonlinearities admitted by i-theory for the HW module. We discuss here the equivalence between group averages of linear combinations of rectifying nonlinearities and an associated kernel. This property implies that present-day DCNs can be exactly equivalent to a hierarchy of kernel machines with pooling and non-pooling layers. Finally, we describe a conjecture for theoretically understanding hierarchies of such modules. A main consequence of the conjecture is that hierarchies of trained HW modules minimize memory requirements while computing a selective and invariant representation.
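As an illustrative sketch only (not the paper's implementation), an HW module of the kind described above can be written down for the simplest case: the group of circular translations acting on a single template, with a linear combination of ReLU units as the admitted nonlinearity. The function names, thresholds, and signal sizes below are assumptions chosen for the example; the pooled output should be invariant to translations of the input, as the abstract claims.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hw_module(x, template, thresholds):
    """One HW module (illustrative sketch): pool, over all group elements,
    a linear combination of rectified dot products of the input with the
    group-transformed template. Group = circular shifts of a 1-D signal."""
    n = len(template)
    pooled = []
    for s in range(n):                       # iterate over the group
        g_template = np.roll(template, s)    # group action on the template
        dot = np.dot(x, g_template)          # dot product <x, g.t>
        # linear combination of rectifying units at several thresholds
        pooled.append(sum(relu(dot - b) for b in thresholds))
    return np.mean(pooled)                   # group average (pooling)

# Invariance check: translating the input leaves the pooled value unchanged,
# because shifting x only permutes the set of dot products being averaged.
rng = np.random.default_rng(0)
x = rng.standard_normal(16)
t = rng.standard_normal(16)
thresholds = np.linspace(-1.0, 1.0, 5)

v1 = hw_module(x, t, thresholds)
v2 = hw_module(np.roll(x, 3), t, thresholds)
print(abs(v1 - v2) < 1e-9)
```

The invariance follows because `np.dot(np.roll(x, k), np.roll(t, s)) == np.dot(x, np.roll(t, s - k))`, so a shift of the input merely reindexes the orbit of dot products, and the group average is unaffected.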


