Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms

09/14/2020
by   Zhong Li, et al.

A simple approach is proposed to obtain complexity controls for neural networks with general activation functions. The approach is motivated by approximating the general activation functions with one-dimensional ReLU networks, which reduces the problem to that of controlling the complexity of ReLU networks. Specifically, we consider two-layer networks and deep residual networks, for which path-based norms are derived to control complexities. We also provide preliminary analyses of the function spaces induced by these norms and a priori estimates of the corresponding regularized estimators.
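The core reduction described above rests on the fact that any reasonably smooth one-dimensional activation can be approximated by a piecewise-linear function, which is exactly what a one-dimensional ReLU network computes. The sketch below (an illustration of this general idea, not the paper's construction; the function names and breakpoint scheme are my own) builds such an approximation of `tanh` by interpolating at uniform breakpoints and encoding the slope changes as ReLU coefficients:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_approx(f, lo, hi, n):
    """Approximate f on [lo, hi] by a one-dimensional ReLU network:
    g(x) = f(lo) + s0*(x - lo) + sum_i c_i * relu(x - t_i),
    where t_i are interior breakpoints of a uniform grid with n segments
    and c_i are the slope changes of the piecewise-linear interpolant.
    (Illustrative scheme, not the construction used in the paper.)"""
    t = np.linspace(lo, hi, n + 1)       # breakpoints
    y = f(t)                             # function values at breakpoints
    slopes = np.diff(y) / np.diff(t)     # slope on each linear segment
    c = np.diff(slopes)                  # slope jumps -> ReLU coefficients

    def g(x):
        x = np.asarray(x, dtype=float)
        return y[0] + slopes[0] * (x - t[0]) + relu(x[..., None] - t[1:-1]) @ c

    return g

# Approximate tanh with a 64-segment ReLU network and measure the error.
g = relu_approx(np.tanh, -4.0, 4.0, 64)
xs = np.linspace(-4.0, 4.0, 1000)
err = np.max(np.abs(g(xs) - np.tanh(xs)))
```

With 64 segments the interpolation error is on the order of `h^2 * max|f''| / 8`, so `err` here is well below `1e-2`; the complexity of the resulting ReLU network (e.g., a path norm over the coefficients `c`) then serves as a proxy for the complexity contributed by the original activation.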
