Approximating Positive Homogeneous Functions with Scale Invariant Neural Networks

08/05/2023
by Stefan Bamberger et al.

We investigate to what extent linear inverse problems can be solved with ReLU networks. Due to the scaling invariance arising from the linearity, an optimal reconstruction function f for such a problem is positive homogeneous, i.e., it satisfies f(λx) = λf(x) for all λ ≥ 0. For a ReLU network, this condition translates to considering networks without bias terms. We first consider the recovery of sparse vectors from few linear measurements. We prove that ReLU networks with only one hidden layer cannot recover 1-sparse vectors, not even approximately, regardless of the width of the network. With two hidden layers, however, approximate recovery with arbitrary precision and arbitrary sparsity level s is possible in a stable way. We then extend our results to a wider class of recovery problems, including low-rank matrix recovery and phase retrieval. Furthermore, we consider the approximation of general positive homogeneous functions with neural networks. Extending previous work, we establish new results explaining under which conditions such functions can be approximated with neural networks. Our results also shed some light on a seeming contradiction in previous work: neural networks for inverse problems typically have very large Lipschitz constants, yet still perform well even under adversarial noise. Specifically, the error bounds in our expressivity results combine a small constant term with a term that is linear in the noise level, indicating that robustness issues may occur only for very small noise levels.
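The connection between the bias-free architecture and positive homogeneity can be checked directly: since ReLU(λz) = λ·ReLU(z) for λ ≥ 0 and all remaining layers are linear, a ReLU network without bias terms satisfies f(λx) = λf(x). The following minimal sketch illustrates this numerically; the two-hidden-layer architecture matches the setting of the abstract, but the specific widths and random weights are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Two-hidden-layer ReLU network WITHOUT bias terms (illustrative only;
# widths and random Gaussian weights are assumptions made for this example).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((32, 10))   # first hidden layer
W2 = rng.standard_normal((32, 32))   # second hidden layer
W3 = rng.standard_normal((10, 32))   # linear output layer

def f(x):
    return W3 @ relu(W2 @ relu(W1 @ x))

# Positive homogeneity: f(lam * x) == lam * f(x) for lam >= 0,
# because ReLU(lam * z) = lam * ReLU(z) whenever lam >= 0.
x = rng.standard_normal(10)
lam = 3.7
print(np.allclose(f(lam * x), lam * f(x)))   # True
```

With a bias term this identity would fail in general, which is why restricting to bias-free networks is the natural network analogue of the scale invariance of the inverse problem.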


