Deep Neural Networks with Efficient Guaranteed Invariances

03/02/2023
by Matthias Rath et al.

We address the problem of improving the performance, and in particular the sample complexity, of deep neural networks by enforcing and guaranteeing invariance to symmetry transformations rather than learning it from data. Group-equivariant convolutions are a popular approach for obtaining equivariant representations; the desired invariance is then imposed using pooling operations. For rotations, it has been shown that replacing pooling with invariant integration further improves the sample complexity. In this contribution, we first extend invariant integration beyond rotations to flips and scale transformations. We then address the problem of incorporating multiple desired invariances into a single network. To this end, we propose a multi-stream architecture in which each stream is invariant to a different transformation, so that the network can benefit from multiple invariances simultaneously. We demonstrate our approach with successful experiments on Scaled-MNIST, SVHN, CIFAR-10, and STL-10.
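The multi-stream idea can be illustrated with a minimal sketch: one stream is made invariant to 90-degree rotations and another to horizontal flips by averaging features over each group's orbit, and the two invariant descriptors are concatenated. This is a hypothetical toy in numpy, not the authors' implementation; in particular, simple group averaging stands in here for the paper's invariant integration, and the `feature` extractor is an arbitrary placeholder rather than a learned convolutional network.

```python
import numpy as np

def rotation_orbit(x):
    # orbit of x under the C4 group of 90-degree rotations
    return [np.rot90(x, k) for k in range(4)]

def flip_orbit(x):
    # orbit of x under horizontal flipping (Z2 group)
    return [x, np.flip(x, axis=1)]

def invariant_stream(x, orbit_fn, feature_fn):
    # group averaging: extract features from every transformed copy,
    # then pool over the orbit -> invariant to that group by construction
    feats = np.stack([feature_fn(t) for t in orbit_fn(x)])
    return feats.mean(axis=0)

def multi_stream(x, feature_fn):
    # concatenate one invariant descriptor per transformation group,
    # so downstream layers see both invariances at once
    return np.concatenate([
        invariant_stream(x, rotation_orbit, feature_fn),
        invariant_stream(x, flip_orbit, feature_fn),
    ])

# placeholder feature extractor: column sums (not invariant on its own)
feature = lambda x: x.sum(axis=0)

x = np.arange(9, dtype=float).reshape(3, 3)
f_orig = multi_stream(x, feature)
f_rot = multi_stream(np.rot90(x), feature)
# the rotation stream (first 3 entries) is unchanged under rotation,
# while the flip stream generally is not
assert np.allclose(f_orig[:3], f_rot[:3])
```

Because each stream averages over the full orbit of its group, rotating or flipping the input permutes the orbit without changing the averaged result, which is exactly the guaranteed (rather than learned) invariance the abstract describes.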


Related research
02/08/2022

Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration

Leveraging prior knowledge on intraclass variance due to transformations...
04/20/2020

Invariant Integration in Deep Convolutional Feature Space

In this contribution, we show how to incorporate prior knowledge to a de...
08/21/2018

Isometric Transformation Invariant Graph-based Deep Neural Network

Learning transformation invariant representations of visual data is an i...
09/27/2021

Learning from Small Samples: Transformation-Invariant SVMs with Composition and Locality at Multiple Scales

Motivated by the problem of learning when the number of training samples...
04/01/2014

A Deep Representation for Invariance And Music Classification

Representations in the auditory cortex might be based on mechanisms simi...
11/30/2020

Scale-covariant and scale-invariant Gaussian derivative networks

This article presents a hybrid approach between scale-space theory and d...
11/25/2019

Translation Insensitive CNNs

We address the problem that state-of-the-art Convolution Neural Networks...
