Normalization-Equivariant Neural Networks with Application to Image Denoising

06/08/2023
by Sébastien Herbreteau, et al.

In many information processing systems, it may be desirable to ensure that any change to the input, whether by shifting or scaling, results in a corresponding change in the system response. While deep neural networks are gradually replacing traditional automatic processing methods, they surprisingly do not guarantee such a normalization-equivariance (scale and shift) property, which can be detrimental in many applications. To address this issue, we propose a methodology for adapting existing neural networks so that normalization-equivariance holds by design. Our main claim is that not only ordinary convolutional layers but also all activation functions, including the ReLU (rectified linear unit), which are applied element-wise to the pre-activated neurons, should be removed from neural networks and replaced by better-conditioned alternatives. To this end, we introduce affine-constrained convolutions and channel-wise sort pooling layers as surrogates, and we show that these two architectural modifications preserve normalization-equivariance without loss of performance. Experimental results in image denoising show that normalization-equivariant neural networks, in addition to their better conditioning, also generalize much better across noise levels.
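To make the two proposed building blocks concrete, below is a minimal PyTorch sketch of what an affine-constrained convolution and a channel-wise sort pooling layer could look like. The class names (AffineConv2d, SortPool2d), the mean-subtraction projection used to enforce the sum-to-one constraint, and the pairwise sorting scheme are illustrative assumptions based on the abstract, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AffineConv2d(nn.Conv2d):
    """Convolution whose kernel weights are projected so that each output
    filter sums to 1 (an affine combination of input pixels), with no bias.
    For scalars a, b: conv(a*x + b) = a*conv(x) + b*sum(weights) = a*conv(x) + b,
    i.e. the layer is normalization-equivariant by construction. In practice,
    boundary handling (e.g. reflection padding) must also be consistent."""

    def __init__(self, in_channels, out_channels, kernel_size, **kwargs):
        kwargs["bias"] = False  # a free bias term would break shift-equivariance
        super().__init__(in_channels, out_channels, kernel_size, **kwargs)

    def forward(self, x):
        w = self.weight
        n = w[0].numel()  # number of weights per output filter
        # Project each filter onto the affine constraint: subtract its mean
        # and add 1/n, so its weights sum to exactly 1.
        w = w - w.mean(dim=(1, 2, 3), keepdim=True) + 1.0 / n
        return F.conv2d(x, w, None, self.stride, self.padding,
                        self.dilation, self.groups)


class SortPool2d(nn.Module):
    """Channel-wise sort pooling: replaces an element-wise activation (e.g. ReLU)
    by sorting each consecutive pair of channels into (min, max). Sorting
    commutes with increasing affine maps x -> a*x + b (a > 0), so equivariance
    is preserved, whereas an element-wise ReLU is not shift-equivariant."""

    def forward(self, x):
        n, c, h, w = x.shape
        assert c % 2 == 0, "sort pooling pairs up channels, so c must be even"
        pairs = x.view(n, c // 2, 2, h, w)
        pairs, _ = torch.sort(pairs, dim=2)  # (min, max) within each pair
        return pairs.view(n, c, h, w)


# Quick equivariance check: f(a*x + b) should equal a*f(x) + b for a > 0.
# Padding is left at 0 here so that boundary effects do not interfere.
f = nn.Sequential(AffineConv2d(1, 8, 3), SortPool2d(), AffineConv2d(8, 1, 3))
x = torch.randn(1, 1, 16, 16)
a, b = 3.0, -1.5
print(torch.allclose(f(a * x + b), a * f(x) + b, atol=1e-5))  # True
```

The key design point in this sketch is that equivariance is enforced architecturally (constrained weights and order statistics) rather than learned, so it holds exactly for any input and any scale/shift, not just those seen during training.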


Related research

- Rectifier Neural Network with a Dual-Pathway Architecture for Image Denoising (09/10/2016)
- An ELU Network with Total Variation for Image Denoising (08/14/2017)
- Breaking Batch Normalization for better explainability of Deep Neural Networks through Layer-wise Relevance Propagation (02/24/2020)
- Pixel Normalization from Numeric Data as Input to Neural Networks (05/04/2017)
- Evolving Normalization-Activation Layers (04/06/2020)
- Rediscovering Deep Neural Networks in Finite-State Distributions (09/26/2018)
- To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding (03/28/2022)
