Minimum "Norm" Neural Networks are Splines

10/05/2019
by Rahul Parhi, et al.

We develop a general framework based on splines to understand the interpolation properties of overparameterized neural networks. We prove that minimum "norm" two-layer neural networks (with appropriately chosen activation functions) that interpolate scattered data are minimal knot splines. Our results follow from understanding key relationships between notions of neural network "norms", linear operators, and continuous-domain linear inverse problems.
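The central claim, that minimum-"norm" two-layer networks interpolating scattered data are minimal knot splines, has a concrete one-dimensional face: every continuous piecewise-linear spline is representable exactly as a two-layer ReLU network whose knots sit at the breakpoints. The sketch below is our own NumPy illustration of that standard identity, not code from the paper; the names (`t`, `a`, `relu_net`) and the flat-extrapolation choice are ours.

```python
import numpy as np

# Illustrative sketch (not the paper's code): in 1-D, a continuous
# piecewise-linear spline with knots t_1 < ... < t_n is exactly a
# two-layer ReLU network  f(x) = y_1 + sum_i a_i * relu(x - t_i),
# where a_i is the change in slope at knot t_i.

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(-1.0, 1.0, 8))   # scattered data sites = knots
y = rng.normal(size=t.size)              # values to interpolate

seg = np.diff(y) / np.diff(t)            # slope on each interior segment
s = np.concatenate(([0.0], seg, [0.0]))  # flat extrapolation outside the data
a = np.diff(s)                           # outer weights: slope change per knot

def relu_net(x):
    """Two-layer network: one ReLU unit per knot plus a bias y[0]."""
    return y[0] + np.maximum(x[:, None] - t[None, :], 0.0) @ a

xs = np.linspace(-1.5, 1.5, 1001)
assert np.allclose(relu_net(xs), np.interp(xs, t, y))  # equals the linear spline
assert np.allclose(relu_net(t), y)                     # interpolates the data
```

Note that this only exhibits the representational correspondence; the paper's contribution is the variational statement that minimizing the network "norm" over all interpolants selects the spline with the fewest knots.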


Related research

09/14/2020
Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms
A simple approach is proposed to obtain complexity controls for neural n...

06/10/2020
Neural Networks, Ridge Splines, and TV Regularization in the Radon Domain
We develop a variational framework to understand the properties of the f...

08/01/2022
Neural network layers as parametric spans
Properties such as composability and automatic differentiation made arti...

04/08/2018
Comparison of non-linear activation functions for deep neural networks on MNIST classification task
Activation functions play a key role in neural networks so it becomes fu...

12/06/2021
Data-driven forward-inverse problems for Yajima-Oikawa system using deep learning with parameter regularization
We investigate data-driven forward-inverse problems for Yajima-Oikawa (Y...

07/04/2019
Neural Networks, Hypersurfaces, and Radon Transforms
Connections between integration along hypersurfaces, Radon transforms, an...

05/29/2022
Continuous Generative Neural Networks
In this work, we present and study Continuous Generative Neural Networks...
