
Input Hessian Regularization of Neural Networks

by   Waleed Mustafa, et al.

Regularizing the input gradient has been shown to be effective in promoting the robustness of neural networks. Regularizing the input Hessian is therefore a natural next step. A key challenge here is the computational cost: explicitly computing the input Hessian is infeasible for deep networks. In this paper we propose an efficient algorithm to train deep neural networks with Hessian operator-norm regularization. We analyze the approach theoretically and prove that the Hessian operator norm relates to the ability of a neural network to withstand an adversarial attack. We give a preliminary experimental evaluation on the MNIST and FMNIST datasets, which demonstrates that the new regularizer is indeed feasible and, furthermore, that it increases the robustness of neural networks over input gradient regularization.
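The abstract does not spell out the authors' algorithm, but the core primitive it implies, estimating the operator norm of the input Hessian without ever forming the Hessian, can be sketched with Hessian-vector products and power iteration. The toy model `f`, the finite-difference HVP, and all names below are illustrative assumptions for the sketch, not the paper's method:

```python
import numpy as np

# Hypothetical toy model: f(x) = sum(tanh(W @ x)) with a fixed weight matrix W.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))

def f(x):
    return np.tanh(W @ x).sum()

def grad(x):
    # Analytic input gradient: W^T (1 - tanh^2(W x)).
    return W.T @ (1.0 - np.tanh(W @ x) ** 2)

def hvp(x, v, eps=1e-5):
    # Hessian-vector product via central finite differences of the gradient;
    # in practice one would use double backpropagation instead.
    return (grad(x + eps * v) - grad(x - eps * v)) / (2 * eps)

def hessian_op_norm(x, iters=100):
    # Power iteration on the (symmetric) input Hessian H(x):
    # its operator norm equals the largest eigenvalue in absolute value.
    v = rng.normal(size=x.shape)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        hv = hvp(x, v)
        v = hv / (np.linalg.norm(hv) + 1e-12)
    # Rayleigh quotient at the converged direction.
    return abs(v @ hvp(x, v))

x = rng.normal(size=4)
est = hessian_op_norm(x)

# Sanity check against the exact Hessian of this toy model:
# H = W^T diag(-2 t (1 - t^2)) W with t = tanh(W x).
t = np.tanh(W @ x)
H = W.T @ np.diag(-2 * t * (1 - t ** 2)) @ W
exact = np.max(np.abs(np.linalg.eigvalsh(H)))
```

Each power-iteration step costs only one Hessian-vector product (roughly two gradient evaluations), which is what makes an operator-norm penalty tractable during training, in contrast to materializing the full input Hessian.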




Generalizing and Improving Jacobian and Hessian Regularization

Jacobian and Hessian regularization aim to reduce the magnitude of the f...

A Hessian Based Complexity Measure for Deep Networks

Deep (neural) networks have been applied productively in a wide range of...

How Does Sharpness-Aware Minimization Minimize Sharpness?

Sharpness-Aware Minimization (SAM) is a highly effective regularization ...

Accelerating Hessian-free optimization for deep neural networks by implicit preconditioning and sampling

Hessian-free training has become a popular parallel second-order optim...

Delaunay-Triangulation-Based Learning with Hessian Total-Variation Regularization

Regression is one of the core problems tackled in supervised learning. R...

Regularizing Deep Neural Networks with Stochastic Estimators of Hessian Trace

In this paper we develop a novel regularization method for deep neural n...

Exact Spectral Norm Regularization for Neural Networks

We pursue a line of research that seeks to regularize the spectral norm ...