Size and depth of monotone neural networks: interpolation and approximation

07/12/2022
by Dan Mikulincer, et al.

Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: The input is a monotone data set with n points, and the goal is to find a size- and depth-efficient monotone neural network, with non-negative parameters and threshold units, that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth 2. On the other hand, we prove that for every monotone data set with n points in ℝ^d, there exists an interpolating monotone network of depth 4 and size O(nd). Our interpolation result implies that every monotone function over [0,1]^d can be approximated arbitrarily well by a depth-4 monotone network, improving the previous best-known construction of depth d+1. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
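To make the model class concrete, here is a minimal sketch of a monotone network in the sense used above: non-negative weights and threshold (Heaviside) units, which together guarantee coordinate-wise monotonicity. This is illustrative only, not the paper's construction; the names `threshold`, `MonotoneLayer`, `net`, and the toy weights are hypothetical.

```python
import numpy as np

def threshold(z):
    # Heaviside threshold unit: 1 if z >= 0, else 0 (applied entrywise).
    return (z >= 0).astype(float)

class MonotoneLayer:
    # One layer with non-negative weights and threshold activations.
    # Non-negative weights make the layer coordinate-wise monotone:
    # if x <= y entrywise then W @ x <= W @ y, and threshold preserves order.
    # Biases may be arbitrary; they do not affect monotonicity.
    def __init__(self, weights, biases):
        assert np.all(np.asarray(weights) >= 0), "monotone networks use non-negative weights"
        self.W = np.asarray(weights, dtype=float)
        self.b = np.asarray(biases, dtype=float)

    def __call__(self, x):
        return threshold(self.W @ x + self.b)

# Toy depth-2 example on [0,1]^2: fires iff both coordinates exceed 1/2.
layer1 = MonotoneLayer([[1.0, 0.0], [0.0, 1.0]], [-0.5, -0.5])
layer2 = MonotoneLayer([[1.0, 1.0]], [-2.0])

def net(x):
    return layer2(layer1(np.asarray(x, dtype=float)))

print(net([0.7, 0.8]))  # [1.] -- both inputs above the threshold
print(net([0.7, 0.2]))  # [0.] -- lowering an input can only lower the output
```

Since each layer is monotone and compositions of monotone maps are monotone, the whole network is monotone. The abstract's results concern how large and how deep such networks must be: depth 2 is sometimes insufficient for interpolation, while depth 4 and size O(nd) always suffice.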

