Approximation in shift-invariant spaces with deep ReLU neural networks
We construct deep ReLU neural networks to approximate functions in dilated shift-invariant spaces generated by a continuous function with compact support, and we study the approximation rates with respect to the number of neurons. The network construction is based on the bit-extraction and data-fitting capacity of deep neural networks. Combining this with existing approximation results for shift-invariant spaces, we estimate the approximation rates for classical function spaces such as Sobolev spaces and Besov spaces. We also give lower bounds on the L^p([0,1]^d) approximation error for Sobolev spaces, which show that our construction is asymptotically optimal up to a logarithmic factor.
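As a toy illustration of the setting (not the paper's bit-extraction construction, which attains far better neuron counts), the sketch below shows that when the generator is the piecewise-linear hat function, every element of a dilated shift-invariant space is exactly the output of a one-hidden-layer ReLU network. All names (phi, j, shifts, coeffs) are illustrative assumptions, not notation from the paper.

```python
# Minimal sketch: the hat function phi(x) = ReLU(x) - 2 ReLU(x-1) + ReLU(x-2)
# is continuous with compact support, so any f(x) = sum_k c_k phi(2**j x - k)
# in the dilated shift-invariant space it generates is exactly a ReLU network
# with 3 * (number of shifts) hidden neurons. (Hypothetical example, not the
# paper's construction.)
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def phi(x):
    """Hat function supported on [0, 2], built from three ReLU units."""
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

def shift_invariant_fn(x, shifts, coeffs, j):
    """f(x) = sum_k coeffs[k] * phi(2**j * x - shifts[k])."""
    return sum(c * phi(2.0**j * x - k) for k, c in zip(shifts, coeffs))

# Example: least-squares fit of sin(2*pi*x) on [0, 1] at dilation level j = 4.
x = np.linspace(0.0, 1.0, 512)
j = 4
shifts = np.arange(-1, 2**j)  # shifts whose supports meet [0, 1]
basis = np.stack([phi(2.0**j * x - k) for k in shifts], axis=1)
target = np.sin(2.0 * np.pi * x)
coeffs, *_ = np.linalg.lstsq(basis, target, rcond=None)
f = shift_invariant_fn(x, shifts, coeffs, j)
print("max error:", np.abs(f - target).max())
```

Here the error decays like 2^{-2j} as the dilation level j grows, at the cost of O(2^j) neurons; the point of the paper's deeper construction is to achieve comparable accuracy with far fewer neurons by exploiting depth.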