Approximation of functions with one-bit neural networks

12/16/2021
by C. Sinan Güntürk, et al.

This paper examines the approximation capabilities of coarsely quantized neural networks, i.e., networks whose parameters are selected from a small set of allowable values. We show that any smooth multivariate function can be approximated arbitrarily well by an appropriate coarsely quantized neural network, and we provide a quantitative approximation rate. For the quadratic activation this can be done with only a one-bit alphabet; for the ReLU activation we use a three-bit alphabet. The main theorems rely on key properties of Bernstein polynomials. We prove new results on the approximation of functions by Bernstein polynomials, on noise-shaping quantization in the Bernstein basis, and on the implementation of Bernstein polynomials by coarsely quantized neural networks.
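The role of the Bernstein basis is easiest to see in one dimension: sampling f at the nodes k/n gives the coefficients of its degree-n Bernstein polynomial, and a noise-shaping (Sigma-Delta) scheme can replace those coefficients with one-bit values while the basis averages the quantization error away. The Python sketch below is a minimal illustration of that idea under simplifying assumptions (a univariate f with range in [0,1], a first-order greedy quantizer); the function names are hypothetical, and this is not the paper's construction or code.

```python
import numpy as np
from scipy.stats import binom  # binom.pmf gives the Bernstein basis values

def bernstein_approx(f, n, x):
    """Degree-n Bernstein polynomial of f on [0,1], evaluated at points x."""
    k = np.arange(n + 1)
    coeffs = f(k / n)                    # samples f(k/n) are the coefficients
    basis = binom.pmf(k[:, None], n, x)  # B_{k,n}(x) = C(n,k) x^k (1-x)^(n-k)
    return coeffs @ basis

def sigma_delta_quantize(coeffs):
    """First-order Sigma-Delta: replace coefficients in [0,1] by bits in
    {0,1}, keeping the running error state u bounded by 1/2 so the
    Bernstein basis can average the error out."""
    bits = np.zeros_like(coeffs)
    u = 0.0
    for i, c in enumerate(coeffs):
        bits[i] = 1.0 if u + c >= 0.5 else 0.0
        u = u + c - bits[i]
    return bits

# Example: one-bit Bernstein approximation of a smooth f with range in [0,1]
f = lambda t: 0.5 + 0.4 * np.sin(2 * np.pi * t)
n = 200
x = np.linspace(0.0, 1.0, 500)
k = np.arange(n + 1)
bits = sigma_delta_quantize(f(k / n))    # one-bit coefficients
approx = bits @ binom.pmf(k[:, None], n, x)
print(np.max(np.abs(approx - f(x))))     # sup-norm error shrinks as n grows
```

Running the example with increasing n shows the sup-norm error of the one-bit approximant decreasing, which is the qualitative behavior the paper's quantitative rates make precise.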
