Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation

05/07/2020
by Reinhard Heckel, et al.

Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration. They are capable of solving standard inverse problems such as denoising and compressive sensing with excellent results by simply fitting a neural network model to measurements from a single image or signal, without the need for any additional training data. For some applications, this critically requires additional regularization in the form of early stopping of the optimization. For signal recovery from a few measurements, however, un-trained convolutional networks have an intriguing self-regularizing property: even though the network can perfectly fit any image, it recovers a natural image from a few measurements when trained with gradient descent until convergence. In this paper, we provide numerical evidence for this property and study it theoretically. We show that, without any further regularization, an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured from a near-minimal number of random measurements.
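The recovery procedure sketched in the abstract amounts to parameterizing the image by an un-trained convolutional generator with a fixed random input and fitting only the generator's weights to the compressive measurements by minimizing the measurement loss until convergence. The following is a minimal sketch of this idea in PyTorch; the synthetic image, the Gaussian measurement matrix, the small upsampling-and-convolution generator, and the Adam optimizer are illustrative assumptions rather than the paper's exact architecture or algorithm.

```python
# Minimal sketch of compressive recovery with an un-trained convolutional generator.
# Assumptions (not from the paper): a 64x64 synthetic smooth image, a Gaussian
# measurement matrix, a small upsampling/convolution generator, and Adam as a
# stand-in for plain gradient descent.
import torch
import torch.nn as nn

torch.manual_seed(0)

side = 64
n = side * side          # signal dimension
m = n // 4               # number of random measurements (assumed)

# Smooth synthetic "image" standing in for a natural image.
x_true = torch.outer(torch.sin(torch.linspace(0, 3, side)),
                     torch.cos(torch.linspace(0, 2, side)))
x_true = x_true.reshape(1, 1, side, side)

# Random Gaussian measurement operator and measurements y = A x.
A = torch.randn(m, n) / m ** 0.5
y = A @ x_true.reshape(n, 1)

# Un-trained convolutional generator with a fixed random input z;
# only its weights are optimized, and no training data is used.
generator = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 1, kernel_size=1),
)
z = torch.randn(1, 64, side // 4, side // 4)   # fixed random input

opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
for step in range(5000):                        # run (approximately) to convergence
    opt.zero_grad()
    x_hat = generator(z)
    loss = ((A @ x_hat.reshape(n, 1) - y) ** 2).sum()   # measurement loss only
    loss.backward()
    opt.step()

rel_err = (generator(z) - x_true).norm() / x_true.norm()
print(f"final measurement loss: {loss.item():.3e}, relative error: {rel_err.item():.3e}")
```

Note that no early stopping or explicit regularizer appears anywhere in this loop; per the abstract, it is the structural bias of the convolutional generator that steers gradient descent toward a smooth, natural reconstruction rather than an arbitrary solution that also fits the measurements.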


Related research

07/06/2019
Regularizing linear inverse problems with convolutional neural networks
Deep convolutional neural networks trained on large datasets have emerged...

10/31/2019
Denoising and Regularization via Exploiting the Structural Bias of Convolutional Generators
Convolutional Neural Networks (CNNs) have emerged as highly successful t...

04/18/2019
One-dimensional Deep Image Prior for Time Series Inverse Problems
We extend the Deep Image Prior (DIP) framework to one-dimensional signal...

07/11/2017
DeepCodec: Adaptive Sensing and Recovery via Deep Convolutional Neural Networks
In this paper we develop a novel computational sensing framework for sen...

05/24/2017
Towards Understanding the Invertibility of Convolutional Neural Networks
Several recent works have empirically observed that Convolutional Neural...

12/21/2021
More is Less: Inducing Sparsity via Overparameterization
In deep learning it is common to overparameterize the neural networks, t...

06/06/2023
One-Dimensional Deep Image Prior for Curve Fitting of S-Parameters from Electromagnetic Solvers
A key problem when modeling signal integrity for passive filters and int...
