
What Kinds of Functions do Deep Neural Networks Learn? Insights from Variational Spline Theory

05/07/2021
by Rahul Parhi, et al.

We develop a variational framework to understand the properties of functions learned by deep neural networks with ReLU activation functions fit to data. We propose a new function space, reminiscent of classical bounded variation spaces, that captures the compositional structure of deep neural networks. We derive a representer theorem showing that deep ReLU networks are solutions to regularized data-fitting problems in this function space. The function space consists of compositions of functions from the (non-reflexive) Banach spaces of second-order bounded variation in the Radon domain. These are Banach spaces with sparsity-promoting norms, giving insight into the role of sparsity in deep neural networks. The neural network solutions have skip connections and rank-bounded weight matrices, providing new theoretical support for these common architectural choices. The variational problem we study can be recast as a finite-dimensional neural network training problem with regularization schemes related to the notions of weight decay and path-norm regularization. Finally, our analysis builds on techniques from variational spline theory, providing new connections between deep neural networks and splines.
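As a schematic of the kind of variational problem the abstract describes (our own rendering; the notation, layer decomposition, and exact constraints are assumptions, not the paper's precise statement): given data (x_i, y_i), i = 1, ..., N, a loss \ell, and a regularization weight \lambda > 0, one studies

\min_{f = f^{(L)} \circ \cdots \circ f^{(1)}} \; \sum_{i=1}^{N} \ell\big(f(x_i), y_i\big) \;+\; \lambda \sum_{l=1}^{L} \|f^{(l)}\|_{\mathrm{RBV}^2},

where each layer map f^{(l)} is measured by a second-order bounded variation seminorm in the Radon domain. A representer theorem of this type then asserts that some solution has layer maps of the shallow ReLU form

f^{(l)}(x) \;=\; \sum_{k=1}^{K_l} v_k^{(l)} \, \rho\big(\langle w_k^{(l)}, x \rangle - b_k^{(l)}\big) \;+\; A^{(l)} x + c^{(l)},

with \rho(t) = \max(t, 0); the affine term A^{(l)} x + c^{(l)} is the skip connection the abstract refers to.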
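To make the finite-dimensional recast concrete, here is a minimal PyTorch sketch of training a skip-connected ReLU network with a path-norm-style penalty. Everything in it (the class and function names, the widths, the toy data, and the specific penalty, a per-unit product of inner and outer weight norms) is an illustrative assumption, not the paper's construction.

import torch
import torch.nn as nn

class SkipReLULayer(nn.Module):
    """One ReLU layer plus a linear skip connection, mirroring the
    skip-connected solutions the representer theorem predicts."""
    def __init__(self, dim_in, dim_out, width):
        super().__init__()
        self.hidden = nn.Linear(dim_in, width)              # inner weights w_k, biases b_k
        self.outer = nn.Linear(width, dim_out, bias=False)  # outer weights v_k
        self.skip = nn.Linear(dim_in, dim_out)              # linear skip connection

    def forward(self, x):
        return self.outer(torch.relu(self.hidden(x))) + self.skip(x)

def path_penalty(layer):
    """Path-norm-style surrogate (assumed, for illustration): the sum over
    hidden units k of ||v_k|| * ||w_k||."""
    w = layer.hidden.weight  # shape (width, dim_in): row k holds w_k
    v = layer.outer.weight   # shape (dim_out, width): column k holds v_k
    return (v.norm(dim=0) * w.norm(dim=1)).sum()

# Hypothetical toy problem: fit noisy samples of a scalar function on R^2.
torch.manual_seed(0)
X = torch.randn(128, 2)
y = torch.sin(X[:, :1]) + 0.1 * torch.randn(128, 1)

layers = nn.ModuleList([SkipReLULayer(2, 8, 64), SkipReLULayer(8, 1, 64)])
opt = torch.optim.Adam(layers.parameters(), lr=1e-2)
lam = 1e-3  # regularization weight (the lambda of the variational problem)

for step in range(200):
    opt.zero_grad()
    out = X
    for layer in layers:
        out = layer(out)
    loss = nn.functional.mse_loss(out, y) + lam * sum(path_penalty(l) for l in layers)
    loss.backward()
    opt.step()

Because ReLU is positively homogeneous, a hidden unit can trade scale between its inner and outer weights without changing the function it computes; at such a rescaled optimum the summed products above agree with half the summed squared weight norms, i.e. ordinary weight decay, which is one sense in which weight-decay and path-norm regularization are related.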
