Riesz networks: scale invariant neural networks in a single forward pass

05/08/2023
by Tin Barisin, et al.

Scale invariance of an algorithm refers to its ability to treat objects identically regardless of their size. For neural networks, scale invariance is typically achieved through data augmentation. However, when presented with a scale far outside the range covered by the training set, neural networks may fail to generalize. Here, we introduce the Riesz network, a novel scale invariant neural network. Instead of standard 2d or 3d convolutions for combining spatial information, the Riesz network is based on the Riesz transform, a scale-equivariant operation. As a consequence, this network naturally generalizes to unseen or even arbitrary scales in a single forward pass. As an application example, we consider detecting and segmenting cracks in tomographic images of concrete. In this context, 'scale' refers to the crack thickness, which may vary strongly even within the same sample. To demonstrate its scale invariance, the Riesz network is trained on a single fixed crack width. We then validate its performance in segmenting simulated and real tomographic images featuring a wide range of crack widths. An additional experiment is carried out on the MNIST Large Scale data set.
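The scale equivariance mentioned in the abstract comes from the Riesz transform itself, which acts as a Fourier multiplier -i ξ_j / |ξ|; because this multiplier is homogeneous of degree zero, rescaling the input simply rescales the output the same way. The following is a minimal NumPy sketch of the first-order 2d Riesz transform via the FFT (an illustration of the underlying operation, not the authors' implementation; the function name and DC-handling convention are our own assumptions):

```python
import numpy as np

def riesz_transform_2d(img):
    """First-order 2d Riesz transform of a 2d array via the FFT.

    Frequency-domain multiplier: -i * xi_j / |xi| for j = 1, 2.
    Returns the pair (R1 f, R2 f). Illustrative sketch only,
    not the Riesz network of the paper.
    """
    h, w = img.shape
    xi_y = np.fft.fftfreq(h)[:, None]   # vertical frequency grid
    xi_x = np.fft.fftfreq(w)[None, :]   # horizontal frequency grid
    norm = np.sqrt(xi_x**2 + xi_y**2)
    norm[0, 0] = 1.0                    # avoid 0/0 at the DC component
    F = np.fft.fft2(img)
    r1 = np.real(np.fft.ifft2(-1j * xi_x / norm * F))
    r2 = np.real(np.fft.ifft2(-1j * xi_y / norm * F))
    return r1, r2
```

Because the multiplier has no intrinsic length scale, dilating the input image dilates the response without changing its amplitude, which is what lets a network built on this operation transfer across crack widths it never saw in training.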


Related research

- Exploring the ability of CNNs to generalise to previously unseen scales over wide scale ranges (04/03/2020)
- Scale-invariant scale-channel networks: Deep networks that generalise to previously unseen scales (06/11/2021)
- In What Ways Are Deep Neural Networks Invariant and How Should We Measure This? (10/07/2022)
- DeepFocus: a Few-Shot Microscope Slide Auto-Focus using a Sample Invariant CNN-based Sharpness Function (01/02/2020)
- Saliency Map Based Data Augmentation (05/29/2022)
- Warped Convolutions: Efficient Invariance to Spatial Transformations (09/14/2016)
- Self-organized criticality in neural networks (07/07/2021)
