Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration

02/08/2022
by Matthias Rath, et al.

Leveraging prior knowledge of intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks. This makes them applicable to practically important use cases where training data is scarce. Rather than being learned, this knowledge can be embedded by enforcing invariance to those transformations. Invariance can be imposed using group-equivariant convolutions followed by a pooling operation. For rotation invariance, previous work investigated replacing the spatial pooling operation with invariant integration, which explicitly constructs invariant representations. Invariant integration uses monomials, which are selected using an iterative approach requiring expensive pre-training. We propose a novel monomial selection algorithm based on pruning methods to enable application to more complex problems. Additionally, we replace monomials with different functions such as weighted sums, multi-layer perceptrons, and self-attention, thereby streamlining the training of invariant-integration-based architectures. We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN, and CIFAR-10 datasets, where rotation-invariant-integration-based Wide-ResNet architectures using monomials and weighted sums outperform the respective baselines in the limited-sample regime. We achieve state-of-the-art results using full data on Rotated-MNIST and SVHN, where rotation is a main source of intraclass variation. On STL-10 we outperform both a standard and a rotation-equivariant convolutional neural network using pooling.
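To make the core idea concrete, here is a minimal PyTorch sketch of invariant integration by group averaging: a monomial of feature-map channels is evaluated for every element of a finite rotation group (e.g., on the outputs of a group-equivariant convolution) and then averaged over the group and spatial dimensions, yielding a rotation-invariant descriptor. The tensor layout, the `invariant_integration_monomial` helper, and the clamping are illustrative assumptions, not the authors' exact implementation or their monomial selection procedure.

```python
import torch

def invariant_integration_monomial(features, exponents):
    """Evaluate one monomial per group element and average it over the group.

    features:  tensor of shape (batch, |G|, channels, H, W), one feature map
               per element of a finite rotation group G (assumed layout).
    exponents: dict mapping channel index -> exponent, defining the monomial
               m(x) = prod_k x[c_k] ** b_k (hypothetical helper structure).
    """
    batch, group_size, channels, h, w = features.shape
    monomial = torch.ones(batch, group_size, h, w, dtype=features.dtype)
    for channel, exponent in exponents.items():
        # Clamp so fractional exponents stay well-defined on non-negative inputs.
        monomial = monomial * features[:, :, channel].clamp(min=1e-6) ** exponent
    # Averaging over the group axis (dim=1) is what makes the descriptor
    # invariant to the group transformations; spatial averaging then reduces
    # each image to a single scalar.
    return monomial.mean(dim=(1, 2, 3))

# Example: a toy monomial over channels 0 and 3 of a C8-equivariant feature map.
feats = torch.rand(2, 8, 16, 12, 12)             # (batch, |C8|, channels, H, W)
descriptor = invariant_integration_monomial(feats, {0: 2, 3: 1})
print(descriptor.shape)                          # torch.Size([2])
```

Stacking several such descriptors, one per selected monomial, would give the invariant feature vector passed to the classifier head; per the abstract, the weighted-sum, multi-layer-perceptron, and self-attention variants replace the monomial inside this group average with a learned function.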


