Large Deviation Analysis of Function Sensitivity in Random Deep Neural Networks

10/13/2019
by   Bo Li, et al.

Mean field theory has been successfully used to analyze deep neural networks (DNNs) in the infinite-size limit. Given the finite size of realistic DNNs, we use large deviation theory and a path integral analysis to study the deviation of functions represented by DNNs from their typical mean field solutions. The parameter perturbations investigated include weight sparsification (dilution) and binarization, both commonly used in model simplification, for both ReLU and sign activation functions. We find that random networks with ReLU activation are more robust to parameter perturbations than their counterparts with sign activation, which is arguably reflected in the simplicity of the functions they generate.
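The perturbation setup described in the abstract can be illustrated numerically. The sketch below is not from the paper: all layer sizes, the 1/sqrt(N) weight scaling, and the cosine-similarity measure of function deviation are assumptions chosen for illustration. It compares how closely a random deep network's outputs track those of its weight-binarized copy, for ReLU versus sign activations.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's exact protocol):
# compare a random deep network against its weight-binarized copy.
rng = np.random.default_rng(0)

def forward(x, weights, act):
    """Propagate inputs through fully connected layers.

    Weights are scaled by 1/sqrt(fan_in) so preactivations stay O(1),
    matching the usual mean-field normalization.
    """
    h = x
    for W in weights:
        pre = W @ h / np.sqrt(W.shape[1])
        h = np.maximum(pre, 0.0) if act == "relu" else np.sign(pre)
    return h

def output_similarity(act, n=200, depth=5, n_inputs=500):
    """Mean cosine similarity between the outputs of a random network
    and its weight-binarized version, averaged over Gaussian inputs."""
    weights = [rng.standard_normal((n, n)) for _ in range(depth)]
    binarized = [np.sign(W) for W in weights]  # weight binarization
    x = rng.standard_normal((n, n_inputs))
    y0 = forward(x, weights, act)
    y1 = forward(x, binarized, act)
    num = np.sum(y0 * y1, axis=0)
    den = np.linalg.norm(y0, axis=0) * np.linalg.norm(y1, axis=0) + 1e-12
    return float(np.mean(num / den))

s_relu = output_similarity("relu")
s_sign = output_similarity("sign")
print(f"ReLU similarity after binarization: {s_relu:.3f}")
print(f"sign similarity after binarization: {s_sign:.3f}")
```

A similarity near 1 means the perturbed network still computes nearly the same function; the paper's large-deviation analysis characterizes how far such deviations spread at finite network size, beyond this kind of single-sample experiment.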


