Robust Large Margin Deep Neural Networks

05/26/2016
by Jure Sokolic et al.

We study the generalization error of deep neural networks via their classification margin. Our approach is based on the Jacobian matrix of a deep neural network and applies to networks with arbitrary non-linearities and pooling layers, and to different architectures such as feed-forward and residual networks. Our analysis leads to the conclusion that a bounded spectral norm of the network's Jacobian matrix in the neighbourhood of the training samples is crucial for a deep neural network of arbitrary depth and width to generalize well. This is a significant improvement over current bounds in the literature, which imply that the generalization error grows with either the width or the depth of the network. Moreover, it shows that the recently proposed batch normalization and weight normalization re-parametrizations enjoy good generalization properties, and it leads to a novel network regularizer based on the network's Jacobian matrix. The analysis is supported by experimental results on the MNIST, CIFAR-10, LaRED and ImageNet datasets.
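To make the central quantity concrete, the sketch below computes the spectral norm of a network's Jacobian at an input point, which is the term the abstract argues should stay bounded near the training samples. The two-layer ReLU network, its weights, and the penalty weight are hypothetical illustrations, not the paper's actual architecture or regularizer implementation; a ReLU network is piecewise linear, so its Jacobian at a fixed input has a simple closed form.

```python
import numpy as np

# Hypothetical two-layer ReLU network: f(x) = W2 @ relu(W1 @ x).
# At a fixed input x the network is locally linear, so its Jacobian is
# J(x) = W2 @ D(x) @ W1, where D(x) masks the inactive ReLU units.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((3, 8))

def jacobian(x):
    pre = W1 @ x                       # hidden-layer pre-activations
    mask = (pre > 0).astype(float)     # ReLU activation pattern at x
    return W2 @ (mask[:, None] * W1)   # J(x) = W2 D(x) W1

def spectral_norm(J):
    # Spectral norm = largest singular value of the Jacobian.
    return np.linalg.svd(J, compute_uv=False)[0]

x = rng.standard_normal(4)
penalty = spectral_norm(jacobian(x)) ** 2
# A Jacobian-based regularizer in the spirit of the paper would add
# lam * penalty (for some weight lam) to the training loss, evaluated
# at points near the training samples, to keep the local Lipschitz
# behaviour of the network controlled.
```

In practice one would use automatic differentiation to obtain the Jacobian of an arbitrary network rather than the closed form above; the closed form is shown only because it makes the "locally linear" structure of ReLU networks explicit.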
