
The Kolmogorov Superposition Theorem can Break the Curse of Dimensionality When Approximating High Dimensional Functions

12/18/2021
by   Ming-Jun Lai, et al.
University of Georgia

We explain how to use Kolmogorov's Superposition Theorem (KST) to overcome the curse of dimensionality when approximating multi-dimensional functions and learning multi-dimensional data sets with two-layer neural networks. Specifically, for the class of functions we call K-Lipschitz continuous, i.e., functions f whose K-outer function g is Lipschitz continuous, a two-layer ReLU network with widths dn and n achieves the approximation order O(d^2/n). In addition, we show that polynomials of high degree can be expressed by neural networks with the activation function σ_ℓ(t)=(t_+)^ℓ, ℓ≥2, given multiple layers and appropriate widths; the more layers the network has, the higher the degree of the polynomials it can reproduce. Hence, a deep learning algorithm with the high-degree activation σ_ℓ can approximate multi-dimensional data well as the number of layers increases. Finally, we present a mathematical justification for image classification by deep learning algorithms.
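To make the polynomial-reproduction claim concrete, here is a minimal numerical sketch (not from the paper; the function names are illustrative) using the activation σ_ℓ(t)=(t_+)^ℓ. For ℓ=2 the identity t² = σ_2(t) + σ_2(−t) holds for every real t, so a single layer with two σ_2 neurons already reproduces the monomial t² exactly:

```python
import numpy as np

def sigma(t, ell):
    """Power-of-ReLU activation sigma_ell(t) = (t_+)^ell."""
    return np.maximum(t, 0.0) ** ell

# Two sigma_2 neurons with weights +1 and -1 reproduce t^2 exactly:
# if t >= 0, sigma_2(t) = t^2 and sigma_2(-t) = 0; if t < 0, the roles swap.
t = np.linspace(-3.0, 3.0, 101)
reconstructed = sigma(t, 2) + sigma(-t, 2)
assert np.allclose(reconstructed, t ** 2)
```

Composing such layers raises the attainable degree, which is the sense in which deeper networks with σ_ℓ reproduce higher-degree polynomials.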

