Expressiveness of Neural Networks Having Width Equal or Below the Input Dimension

11/10/2020
by Hans-Peter Beise, et al.

The expressiveness of deep neural networks of bounded width has recently been investigated in a series of articles. The understanding of the minimum width needed to ensure universal approximation for different kinds of activation functions has progressively been extended (Park et al., 2020). In particular, it has turned out that, with respect to approximation on general compact sets in the input space, a network width less than or equal to the input dimension excludes universal approximation. In this work, we focus on network functions of width less than or equal to this critical bound. We prove that in this regime the exact fit of partially constant functions on disjoint compact sets is still possible for ReLU network functions, under some conditions on the mutual location of these components. Conversely, we conclude from a maximum principle that, for all continuous and monotonic activation functions, universal approximation of arbitrary continuous functions is impossible on sets that coincide with the boundary of an open set plus an inner point of that set. We also show that some network functions of maximum width two, respectively one, allow universal approximation on finite sets.
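To fix ideas about the width regime discussed in the abstract, the following minimal sketch (not taken from the paper; the function name narrow_relu_network, the depth, and the random weights are illustrative assumptions) evaluates a feed-forward ReLU network in which every hidden layer has width equal to the input dimension d, i.e. a network at the critical width bound.

```python
# Minimal sketch, assuming a plain feed-forward architecture: every hidden
# layer has width d (the input dimension), followed by a scalar affine output.
# Weights and depth are arbitrary illustrative choices, not constructions
# from the paper.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def narrow_relu_network(x, weights, biases):
    """Evaluate a width-d ReLU network with a scalar linear output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)          # hidden layers: width d -> width d
    W_out, b_out = weights[-1], biases[-1]
    return W_out @ h + b_out         # output layer: width d -> 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, depth = 3, 4                  # input dimension and number of hidden layers
    weights = [rng.standard_normal((d, d)) for _ in range(depth)]
    weights.append(rng.standard_normal((1, d)))
    biases = [rng.standard_normal(d) for _ in range(depth)]
    biases.append(rng.standard_normal(1))
    x = rng.standard_normal(d)
    print(narrow_relu_network(x, weights, biases))
```

The paper's results concern exactly this class of functions: networks whose layer widths never exceed the input dimension d.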

Related research:

Achieve the Minimum Width of Neural Networks for Universal Approximation (09/23/2022)

Minimum Width for Universal Approximation (06/16/2020)

Minimum width for universal approximation using ReLU networks on compact domain (09/19/2023)

Universal Approximation with Deep Narrow Networks (05/21/2019)

Neural Networks on Groups (06/13/2019)

Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons (07/06/2021)

Tight Approximation for Unconstrained XOS Maximization (11/22/2018)
