Complexity Measures for Neural Networks with General Activation Functions Using Path-based Norms

09/14/2020
by Zhong Li, et al.

A simple approach is proposed to obtain complexity controls for neural networks with general activation functions. The approach is motivated by approximating general activation functions with one-dimensional ReLU networks, which reduces the problem to controlling the complexity of ReLU networks. Specifically, we consider two-layer networks and deep residual networks, for which path-based norms are derived to control complexity. We also provide preliminary analyses of the function spaces induced by these norms and a priori estimates of the corresponding regularized estimators.
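To make the reduction concrete, here is a minimal sketch of the underlying idea (not the paper's construction: the tanh target, the knot grid, and the particular path-norm variant are illustrative assumptions). It interpolates a smooth activation with a one-dimensional ReLU network and reports a path-norm-style complexity of the approximant.

import numpy as np

# Illustrative sketch: approximate a general activation (tanh here) by a
# one-dimensional ReLU network,
#   sigma(t) ~= v0 + sum_j c_j * relu(t - t_j),
# i.e. the piecewise-linear interpolant of sigma on a grid of knots t_j.

def relu(t):
    return np.maximum(t, 0.0)

def fit_relu_approximation(sigma, knots):
    # Slope of the interpolant on each interval [t_j, t_{j+1}].
    vals = sigma(knots)
    slopes = np.diff(vals) / np.diff(knots)
    # c_j is the slope change at knot t_j (the first slope enters in full).
    c = np.concatenate([[slopes[0]], np.diff(slopes)])
    return vals[0], c  # value at the left endpoint, ReLU coefficients

def eval_relu_network(t, knots, v0, c):
    return v0 + np.sum(c * relu(t[:, None] - knots[None, :-1]), axis=1)

knots = np.linspace(-4.0, 4.0, 41)
v0, c = fit_relu_approximation(np.tanh, knots)

t = np.linspace(-3.5, 3.5, 201)
err = np.max(np.abs(eval_relu_network(t, knots, v0, c) - np.tanh(t)))
print(f"max interpolation error on [-3.5, 3.5]: {err:.4f}")

# A path-norm-style complexity of the 1-D approximant, here taken as
# sum_j |c_j| * (1 + |t_j|), one common variant of the two-layer path
# norm (outer weight times inner weight plus bias); illustrative only.
path_norm = np.abs(c) @ (1.0 + np.abs(knots[:-1]))
print(f"path-norm-style complexity: {path_norm:.2f}")

Refining the knot grid drives the interpolation error down while the path-norm-style quantity stays bounded, which is the kind of trade-off that lets complexity controls for ReLU networks transfer to networks with general activations.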


Related research

06/17/2019 · Smooth function approximation by deep neural networks with general activation functions
There has been a growing interest in expressivity of deep neural network...

04/08/2020 · The Loss Surfaces of Neural Networks with General Activation Functions
We present results extending the foundational work of Choromanska et al...

08/10/2021 · Linear approximability of two-layer neural networks: A comprehensive analysis based on spectral decay
In this paper, we present a spectral-based approach to study the linear...

10/05/2019 · Minimum "Norm" Neural Networks are Splines
We develop a general framework based on splines to understand the interp...

07/01/2021 · On the Expected Complexity of Maxout Networks
Learning with neural networks relies on the complexity of the representa...

04/25/2020 · Compromise-free Bayesian neural networks
We conduct a thorough analysis of the relationship between the out-of-sa...

10/15/2020 · QReLU and m-QReLU: Two novel quantum activation functions to aid medical diagnostics
The ReLU activation function (AF) has been extensively applied in deep n...
