Learning the gravitational force law and other analytic functions

by Atish Agarwala et al.

Large neural network models have been successful in learning functions of importance in many branches of science, including physics, chemistry, and biology. Recent theoretical work has established explicit learning bounds for wide networks and kernel methods on some simple classes of functions, but not on the more complex functions that arise in practice. We extend these techniques to provide learning bounds for analytic functions on the sphere, for any kernel method or equivalent infinitely wide network with the corresponding activation function trained with SGD. We show that a wide, one-hidden-layer ReLU network can learn analytic functions with a number of samples proportional to the derivative of a related function. Many functions important in the sciences are therefore efficiently learnable. As an example, we prove explicit bounds on learning the many-body gravitational force function given by Newton's law of gravitation. Our theoretical bounds suggest that very wide ReLU networks (and the corresponding NTK kernel) are better at learning analytic functions than kernel learning with Gaussian kernels. We present experimental evidence that the many-body gravitational force function is easier to learn with ReLU networks than with networks with exponential activations.
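The setting described above can be illustrated with a toy numpy sketch (hypothetical, not the paper's actual experiment): a wide one-hidden-layer ReLU network, in an NTK-style 1/sqrt(width) parameterization, trained with plain SGD to learn the magnitude of the two-body Newtonian force F(r) = 1/r^2 on points at radii between 0.5 and 2. All names, dimensions, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_force(x):
    # Attraction toward a unit point mass at the origin (G = m1 = m2 = 1),
    # so the force magnitude is 1 / r^2.
    r = np.linalg.norm(x, axis=1)
    return 1.0 / r**2

# Toy data: directions on the sphere, rescaled to radii in [0.5, 2].
d, width, n = 3, 2048, 4000
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
X *= rng.uniform(0.5, 2.0, size=(n, 1))
y = target_force(X)

# NTK-style parameterization: f(x) = a . relu(W x) / sqrt(width)
W = rng.normal(size=(width, d))
a = rng.normal(size=width)

def predict(X, W, a):
    return np.maximum(X @ W.T, 0.0) @ a / np.sqrt(width)

mse_init = np.mean((predict(X, W, a) - y) ** 2)

lr, batch = 1e-2, 64
for step in range(3000):
    idx = rng.integers(0, n, size=batch)
    xb, yb = X[idx], y[idx]
    pre = xb @ W.T                      # pre-activations
    h = np.maximum(pre, 0.0)            # ReLU features
    err = h @ a / np.sqrt(width) - yb
    # Mini-batch SGD on the squared loss, both layers trained.
    grad_a = h.T @ err / (batch * np.sqrt(width))
    grad_W = (err[:, None] * (pre > 0) * a[None, :]).T @ xb / (batch * np.sqrt(width))
    a -= lr * grad_a
    W -= lr * grad_W

mse_final = np.mean((predict(X, W, a) - y) ** 2)
print(f"MSE before SGD: {mse_init:.3f}, after: {mse_final:.3f}")
```

With a fixed seed the training-set MSE drops substantially from its initialization value, consistent with the claim that such force functions are learnable by wide ReLU networks; the many-body case in the paper sums contributions of this form over all source masses.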


