Global Convergence of Sobolev Training for Overparametrized Neural Networks

06/14/2020
by Jorio Cocola, et al.

The Sobolev loss is used when training a network to approximate both the values and the derivatives of a target function at a prescribed set of input points. Recent works have demonstrated its successful application to tasks such as distillation and synthetic gradient prediction. In this work we prove that an overparametrized two-layer ReLU neural network trained on the Sobolev loss with gradient flow from random initialization can fit any given function values and any given directional derivatives, under a separation condition on the input data.
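The sketch below illustrates the kind of objective the abstract describes: a squared error on function values plus a squared error on prescribed directional derivatives, minimized by gradient descent (a discrete-time proxy for gradient flow) on a two-layer ReLU network from random initialization. The network form, initialization scale, fixed random outer weights, and step size are illustrative assumptions, not the paper's exact setting.

```python
import jax
import jax.numpy as jnp

# Assumed two-layer ReLU network f(x) = (1/sqrt(m)) * sum_r a_r * relu(<w_r, x>),
# with outer weights a_r fixed at random signs and only W trained.
def net(W, a, x):
    return jnp.dot(a, jax.nn.relu(W @ x)) / jnp.sqrt(W.shape[0])

# Sobolev-type loss: squared error on function values plus squared error on the
# prescribed directional derivatives v_i . grad_x f(x_i).
# (JAX uses the convention relu'(0) = 0 where ReLU is not differentiable.)
def sobolev_loss(W, a, xs, ys, vs, zs):
    def per_example(x, v):
        f = lambda x_: net(W, a, x_)
        return f(x), jnp.dot(jax.grad(f)(x), v)  # value, directional derivative
    vals, dds = jax.vmap(per_example)(xs, vs)
    return 0.5 * jnp.sum((vals - ys) ** 2) + 0.5 * jnp.sum((dds - zs) ** 2)

# Plain gradient descent on W from Gaussian random initialization.
def train(key, xs, ys, vs, zs, m=1024, lr=1e-2, steps=2000):
    n, d = xs.shape
    kW, ka = jax.random.split(key)
    W = jax.random.normal(kW, (m, d))
    a = jax.random.choice(ka, jnp.array([-1.0, 1.0]), (m,))
    grad_W = jax.jit(jax.grad(sobolev_loss, argnums=0))
    for _ in range(steps):
        W = W - lr * grad_W(W, a, xs, ys, vs, zs)
    return W, a
```

Here `xs` are the training inputs, `ys` the target values, `vs` the prescribed directions, and `zs` the target directional derivatives; all of these names are hypothetical. Overparametrization in the paper's sense corresponds to taking the width `m` large relative to the number of training points.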
