Scaling Laws For Deep Learning Based Image Reconstruction

09/27/2022
by Tobit Klug, et al.

Deep neural networks trained end-to-end to map a measurement of a (noisy) image to a clean image perform excellently on a variety of linear inverse problems. Current methods are trained on only a few hundred or a few thousand images, as opposed to the millions of examples deep networks are trained on in other domains. In this work, we study whether major performance gains can be expected from scaling up the training set size. We consider image denoising, accelerated magnetic resonance imaging, and super-resolution, and empirically determine the reconstruction quality as a function of training set size while optimally scaling the network size. For all three tasks, we find that an initially steep power-law scaling slows significantly even at moderate training set sizes. Extrapolating those scaling laws suggests that even training on millions of images would not significantly improve performance. To understand this behavior, we analytically characterize the performance of a linear estimator learned with early stopped gradient descent. The result formalizes the intuition that once the error induced by learning the signal model is small relative to the error floor, more training examples do not improve performance.
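As an illustration of the extrapolation step above, the following is a minimal sketch (not the authors' code) of fitting a saturating power law E(n) = c + a * n^(-b) to reconstruction error measured at a handful of training set sizes n, then evaluating the fit at much larger n. All measurements and starting values below are hypothetical and for illustration only.

    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(n, a, b, c):
        # c is the error floor; a * n**(-b) is the part that more data reduces
        return c + a * n ** (-b)

    # Hypothetical (training set size, reconstruction error) measurements
    n_train = np.array([100, 300, 1000, 3000, 10000, 30000], dtype=float)
    error = np.array([0.100, 0.071, 0.052, 0.043, 0.038, 0.036])

    # Fit a, b, c; p0 is a rough starting guess for the optimizer
    (a, b, c), _ = curve_fit(power_law, n_train, error, p0=[1.0, 0.5, 0.03])

    # Once a * n**(-b) is small relative to the floor c, additional
    # training images barely change the predicted error.
    for n in (1e5, 1e6, 1e7):
        print(f"n = {n:>10.0f}: predicted error = {power_law(n, a, b, c):.4f}")

Under such a fit, the predicted error approaches the floor c as n grows, which is the sense in which even millions of training images would not significantly improve performance.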

Related research

07/06/2020 · Can Un-trained Neural Networks Compete with Trained Neural Networks at Image Reconstruction?
  Convolutional Neural Networks (CNNs) are highly effective for image reco...

06/29/2022 · Beyond neural scaling laws: beating power law scaling via data pruning
  Widely observed neural scaling laws, in which error falls off as a power...

07/24/2023 · Learning Provably Robust Estimators for Inverse Problems via Jittering
  Deep neural networks provide excellent performance for inverse problems ...

11/30/2020 · Model Adaptation for Inverse Problems in Imaging
  Deep neural networks have been applied successfully to a wide variety of...

05/30/2023 · Analyzing the Sample Complexity of Self-Supervised Image Reconstruction Methods
  Supervised training of deep neural networks on pairs of clean image and ...

10/26/2017 · Phase Transitions in Image Denoising via Sparsely Coding Convolutional Neural Networks
  Neural networks are analogous in many ways to spin glasses, systems whic...

11/21/2021 · Bilevel learning of l1-regularizers with closed-form gradients (BLORC)
  We present a method for supervised learning of sparsity-promoting regula...
