Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms

by Zhong Li et al.

We propose a simple approach to obtaining complexity controls for neural networks with general activation functions. The idea is to approximate a general activation function by a one-dimensional ReLU network, which reduces the problem to establishing complexity controls for ReLU networks. Specifically, we consider two-layer networks and deep residual networks, for which path-based norms are derived to control complexity. We also provide preliminary analyses of the function spaces induced by these norms and a priori estimates for the corresponding regularized estimators.
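To illustrate the central reduction, the sketch below (hypothetical helper names, not from the paper) builds a one-dimensional, one-hidden-layer ReLU network that interpolates a smooth activation such as tanh at equispaced knots; the coefficient of each ReLU unit is the change in slope at its knot, which is the standard piecewise-linear construction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def fit_relu_network(f, a, b, n):
    """Return a one-hidden-layer ReLU network g(x) = bias + sum_i c_i * relu(x - t_i)
    that interpolates f at n + 1 equispaced knots on [a, b].
    (Illustrative helper; names are not from the paper.)"""
    knots = np.linspace(a, b, n + 1)
    vals = f(knots)
    slopes = np.diff(vals) / np.diff(knots)       # slope of each linear segment
    coeffs = np.diff(slopes, prepend=0.0)         # slope change at each interior knot
    bias = vals[0]                                # value at the left endpoint

    def g(x):
        return bias + sum(c * relu(x - t) for c, t in zip(coeffs, knots[:-1]))

    return g

# Approximate tanh on [-3, 3] with 200 ReLU units.
g = fit_relu_network(np.tanh, -3.0, 3.0, 200)
```

For a twice-differentiable activation, the uniform interpolation error decays like O(h^2) in the knot spacing h, so a modest number of ReLU units suffices; bounds on the resulting ReLU network then transfer to the original activation.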



