Universality and Optimality of Structured Deep Kernel Networks

05/15/2021
by   Tizian Wenzel, et al.
Kernel-based methods yield approximation models that are flexible, efficient and powerful. In particular, they use fixed feature maps of the data and are often backed by strong analytical results that prove their accuracy. The recent success of machine learning methods, on the other hand, has been driven by deep neural networks (NNs), which achieve significant accuracy on very high-dimensional data because they can also learn efficient data representations, i.e. data-based feature maps. In this paper, we leverage a recent deep kernel representer theorem to connect the two approaches and understand their interplay. We show that the use of special types of kernels yields models reminiscent of neural networks that are founded in the same theoretical framework as classical kernel methods, while enjoying many of the computational properties of deep neural networks. In particular, the introduced Structured Deep Kernel Networks (SDKNs) can be viewed as neural networks with optimizable activation functions that obey a representer theorem. Analytic results establish their universal approximation properties in different asymptotic regimes of an unbounded number of centers, unbounded width and unbounded depth. Especially in the case of unbounded depth, the construction is asymptotically better than corresponding constructions for ReLU neural networks, a result made possible by the flexibility of kernel approximation.
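To make the "neural network with optimizable activation functions" view concrete, here is a minimal PyTorch sketch of such an architecture: linear maps alternate with learnable activations that are themselves kernel expansions, i.e. linear combinations of kernel translates at a set of centers. The class names, layer sizes, center placement and Gaussian kernel choice below are illustrative assumptions for the sketch, not the paper's actual implementation.

```python
# Minimal sketch of a structured deep kernel network (SDKN-style model).
# All names, sizes and kernel choices are illustrative assumptions.
import torch
import torch.nn as nn

class KernelActivation(nn.Module):
    """Learnable activation function: a kernel expansion, i.e. a linear
    combination of Gaussian kernel translates at fixed 1-D centers."""
    def __init__(self, num_centers=10, gamma=1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.linspace(-2.0, 2.0, num_centers),
                                    requires_grad=False)
        self.coeffs = nn.Parameter(torch.randn(num_centers) / num_centers)
        self.gamma = gamma

    def forward(self, x):
        # Broadcast against the centers and evaluate the kernel elementwise.
        d = x.unsqueeze(-1) - self.centers        # (..., num_centers)
        phi = torch.exp(-self.gamma * d ** 2)     # Gaussian kernel values
        return phi @ self.coeffs                  # kernel expansion

class SDKN(nn.Module):
    """Alternates linear maps with optimizable kernel activations,
    mirroring a network whose activations obey a representer theorem."""
    def __init__(self, dims=(3, 16, 16, 1), num_centers=10):
        super().__init__()
        layers = []
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            layers.append(nn.Linear(d_in, d_out))
            layers.append(KernelActivation(num_centers))
        self.net = nn.Sequential(*layers[:-1])  # no activation after last layer

    def forward(self, x):
        return self.net(x)

model = SDKN()
y = model(torch.randn(8, 3))  # output of shape (8, 1)
```

In this sketch the trainable coefficients of each kernel expansion play the role that fixed nonlinearities (e.g. ReLU) play in a standard NN; it is this extra flexibility that the abstract credits for the improved asymptotic behavior in the unbounded-depth regime.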
