Local Identifiability of Deep ReLU Neural Networks: the Theory

06/15/2022
by Joachim Bona-Pellissier, et al.

Is a sample rich enough to determine, at least locally, the parameters of a neural network? To answer this question, we introduce a new local parameterization of a given deep ReLU neural network, obtained by fixing the values of some of its weights. This parameterization defines local lifting operators whose inverses are charts of a smooth manifold embedded in a high-dimensional space. The function implemented by the deep ReLU network is the composition of a local lifting with a linear operator that depends on the sample. From this convenient representation we derive a geometrical condition that is necessary and sufficient for local identifiability. Passing to tangent spaces, the geometrical condition yields (1) a sharp and testable necessary condition of identifiability and (2) a sharp and testable sufficient condition of local identifiability. The validity of both conditions can be tested numerically using backpropagation and matrix rank computations.
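The last claim — that the conditions can be tested numerically with backpropagation and matrix rank computations — can be illustrated by a generic Jacobian-rank check. The sketch below is a minimal, hypothetical setup (a one-hidden-layer ReLU network with finite differences standing in for backpropagation), not the paper's exact criterion: it differentiates the sample outputs with respect to the flattened parameter vector and computes the rank of the resulting matrix.

```python
import numpy as np

def forward(params, X, h):
    """One-hidden-layer ReLU network f(x) = w2 . relu(W1 x + b1) + b2,
    with all parameters packed into one flat vector (illustrative layout)."""
    d = X.shape[1]
    W1 = params[:h * d].reshape(h, d)
    b1 = params[h * d:h * d + h]
    w2 = params[h * d + h:h * d + 2 * h]
    b2 = params[-1]
    hidden = np.maximum(W1 @ X.T + b1[:, None], 0.0)  # (h, n) activations
    return hidden.T @ w2 + b2                         # (n,) outputs

def jacobian_rank(params, X, h, eps=1e-6, tol=1e-4):
    """Rank of the finite-difference Jacobian of the sample outputs with
    respect to the parameters; in practice this Jacobian would be
    assembled by backpropagation rather than finite differences."""
    f0 = forward(params, X, h)
    J = np.empty((f0.size, params.size))
    for j in range(params.size):
        p = params.copy()
        p[j] += eps
        J[:, j] = (forward(p, X, h) - f0) / eps
    return np.linalg.matrix_rank(J, tol=tol)

rng = np.random.default_rng(0)
d, h, n = 2, 3, 40                    # input dim, hidden units, samples
n_params = h * d + 2 * h + 1          # flat parameter count (= 13 here)
params = rng.standard_normal(n_params)
X = rng.standard_normal((n, d))
print(jacobian_rank(params, X, h))
```

Note that positive rescalings of each hidden unit (scaling a row of W1 and its bias up while scaling the corresponding entry of w2 down) leave the function unchanged, so this raw Jacobian is rank-deficient by at least h. This is exactly the kind of symmetry that motivates fixing some weights to obtain a genuine local parameterization before testing ranks.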
