Tangent Space Sensitivity and Distribution of Linear Regions in ReLU Networks

06/11/2020
by   Bálint Daróczy, et al.

Recent articles indicate that deep neural networks are efficient models for various learning problems. However, they are often highly sensitive to changes that an independent observer cannot detect. Since our understanding of deep neural networks through traditional generalization bounds remains incomplete, several measures have been proposed that capture a model's behaviour under small changes at a specific state. In this paper we consider adversarial stability in the tangent space and propose tangent sensitivity to characterize stability. We focus on a particular kind of stability with respect to changes in parameters that are induced by individual examples without known labels. We derive several easily computable bounds and empirical measures for feed-forward fully connected ReLU (Rectified Linear Unit) networks and connect tangent sensitivity to the distribution of activation regions in the input space realized by the network. Our experiments suggest that even these simple bounds and measures are associated with the empirical generalization gap.
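To make the two central objects of the abstract concrete, here is a minimal numpy sketch (not the paper's implementation; network sizes, initialization, and the finite-difference proxy are illustrative assumptions). The tangent vector of an example is the gradient of the network output with respect to all parameters; the activation pattern of the ReLU layer identifies which linear region of the input space the example falls into.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small fully connected one-hidden-layer ReLU network (illustrative sizes).
d_in, d_h = 4, 16
W1 = rng.standard_normal((d_h, d_in)) / np.sqrt(d_in)
W2 = rng.standard_normal((1, d_h)) / np.sqrt(d_h)

def forward(x):
    """Scalar output F(x) and the binary ReLU activation pattern at x."""
    pre = W1 @ x
    h = np.maximum(pre, 0.0)
    return (W2 @ h).item(), (pre > 0)

def tangent_features(x):
    """Gradient of F(x) w.r.t. all parameters, flattened into one vector.
    For F = W2 @ relu(W1 @ x): dF/dW2 = relu(W1 @ x),
    dF/dW1[i, j] = W2[0, i] * 1[pre_i > 0] * x[j]."""
    pre = W1 @ x
    mask = (pre > 0).astype(float)
    gW2 = pre * mask                       # dF/dW2
    gW1 = np.outer(W2.ravel() * mask, x)   # dF/dW1
    return np.concatenate([gW1.ravel(), gW2.ravel()])

# Distinct activation patterns among random inputs = linear regions actually hit.
xs = rng.standard_normal((1000, d_in))
patterns = {tuple(forward(x)[1]) for x in xs}
print("distinct activation patterns hit:", len(patterns))

# Crude finite-difference proxy for tangent sensitivity at one input:
# how much the tangent vector moves under a small unit-norm input perturbation.
x = xs[0]
eps = 1e-4
delta = rng.standard_normal(d_in)
delta /= np.linalg.norm(delta)
sens = np.linalg.norm(tangent_features(x + eps * delta) - tangent_features(x)) / eps
print("finite-difference tangent-sensitivity proxy:", sens)
```

The sketch only illustrates the objects being measured; the paper's actual bounds relate the sensitivity of the tangent map to how the activation regions are distributed, which this toy proxy does not reproduce.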


