A model is worth tens of thousands of examples

03/19/2023
by Thomas Dagès, et al.

Traditional signal processing methods built on mathematical data-generation models have largely been cast aside in favour of deep neural networks, which require vast amounts of data. Since the theoretical sample complexity is nearly impossible to evaluate, the required number of examples is usually estimated with crude rules of thumb. However, these rules only suggest when a network should work; they say nothing about how it compares to the traditional methods. In particular, an interesting question is: how much data do neural networks need to match or, if possible, outperform the traditional model-based methods? In this work, we investigate this question empirically on two simple examples, where the data is generated according to precisely defined mathematical models and where well-understood optimal or state-of-the-art data-agnostic mathematical solutions are known. The first problem is deconvolving one-dimensional Gaussian signals; the second is estimating a circle's radius and location in random grayscale images of disks. By training various networks, either naive custom-designed ones or well-established architectures, on varying amounts of training data, we find that the networks require tens of thousands of examples to reach parity with the traditional methods, whether they are trained from scratch or initialised via transfer learning or fine-tuning.
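As an illustration, the disk-image data for the second problem can be simulated along the following lines. This is a hedged sketch, not the paper's exact protocol: the image size, radius range, and noise-free binary intensities are assumptions made here for concreteness.

```python
import numpy as np

def make_disk_image(size=64, rng=None):
    """Render one grayscale image of a filled disk with a random
    centre and radius; return the image and its (cx, cy, r) label."""
    rng = rng if rng is not None else np.random.default_rng()
    r = rng.uniform(size * 0.1, size * 0.3)        # radius in pixels
    cx, cy = rng.uniform(r, size - r, size=2)      # centre kept inside the frame
    yy, xx = np.mgrid[0:size, 0:size]              # pixel coordinate grids
    img = ((xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2).astype(np.float32)
    return img, np.array([cx, cy, r], dtype=np.float32)

def make_dataset(n, size=64, seed=0):
    """Generate n image/label pairs with a fixed seed for reproducibility."""
    rng = np.random.default_rng(seed)
    pairs = [make_disk_image(size, rng) for _ in range(n)]
    X = np.stack([p[0] for p in pairs])   # (n, size, size) images
    y = np.stack([p[1] for p in pairs])   # (n, 3) regression targets
    return X, y
```

Because the generative model is fully specified, arbitrarily large training sets can be drawn this way, which is what makes the sample-complexity comparison against closed-form estimators possible.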


