A Learnable ScatterNet: Locally Invariant Convolutional Layers

by Fergal Cotter et al.

In this paper we explore tying together the ideas of Scattering Transforms and Convolutional Neural Networks (CNNs) for image analysis by proposing a learnable ScatterNet. Previous attempts at combining them in hybrid networks have tended to keep the two parts separate, with the ScatterNet forming a fixed front end and a CNN forming a learned back end. We instead look at adding learning between scattering orders, as well as adding learned layers before the ScatterNet. We do this by breaking the scattering orders down into single convolutional-like layers we call 'locally invariant' layers, and adding a learned mixing term to each such layer. Our experiments show that these locally invariant layers can improve accuracy when added to either a CNN or a ScatterNet. We also find, surprisingly, that the ScatterNet may be best positioned after one or more learned layers rather than at the front of a neural network.
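The structure of a single 'locally invariant' layer described above can be sketched as three steps: fixed complex band-pass filtering (playing the role of the wavelet transform in a scattering order), a complex modulus non-linearity that discards local phase, and a learned channel-mixing term. The sketch below is a hypothetical numpy simplification, not the authors' implementation; the filter bank is random rather than a real wavelet family, and all shapes and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def invariant_layer(x, filters, mixing):
    """Sketch of a locally invariant layer (assumed simplification):
    fixed complex filtering -> complex modulus -> learned 1x1 mixing."""
    # x: (C, H, W) real input; filters: (K, h, w) complex; mixing: (C_out, C*K)
    C, H, W = x.shape
    K = filters.shape[0]
    U = np.empty((C * K, H, W))
    for c in range(C):
        for k in range(K):
            # frequency-domain convolution stands in for the wavelet transform
            resp = np.fft.ifft2(np.fft.fft2(x[c]) * np.fft.fft2(filters[k], s=(H, W)))
            # complex modulus: removes local phase, giving local invariance
            U[c * K + k] = np.abs(resp)
    # learned mixing term: a 1x1 convolution across the modulus channels
    return np.tensordot(mixing, U, axes=([1], [0]))

x = rng.standard_normal((1, 16, 16))
filters = rng.standard_normal((3, 5, 5)) + 1j * rng.standard_normal((3, 5, 5))
mixing = rng.standard_normal((4, 3))  # maps C*K = 3 modulus channels to 4 outputs
y = invariant_layer(x, filters, mixing)
print(y.shape)  # (4, 16, 16)
```

In a learnable ScatterNet, `mixing` would be a trainable parameter updated by backpropagation, while the filters could be kept fixed (classic scattering) or also learned; stacking such layers recovers the successive scattering orders with learning inserted between them.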


Scattering Transform on the Time-Chroma-Octave Spiral

We introduce a scattering representation for the analysis and classifica...

Graph Convolutional Neural Networks via Scattering

We generalize the scattering transform to graphs and consequently constr...

Scattering Networks for Hybrid Representation Learning

Scattering networks are a class of designed Convolutional Neural Network...

A Hybrid Scattering Transform for Signals with Isolated Singularities

The scattering transform is a wavelet-based model of Convolutional Neura...

Learnable Polyphase Sampling for Shift Invariant and Equivariant Convolutional Networks

We propose learnable polyphase sampling (LPS), a pair of learnable down/...

Scaling the Scattering Transform: Deep Hybrid Networks

We use the scattering network as a generic and fixed ini-tialization of ...

Separation and Concentration in Deep Networks

Numerical experiments demonstrate that deep neural network classifiers p...