Differentiable Neural Networks with RePU Activation: with Applications to Score Estimation and Isotonic Regression

05/01/2023
by Guohao Shen, et al.

We study the properties of differentiable neural networks activated by rectified power unit (RePU) functions. We show that the partial derivatives of a RePU network can themselves be represented by networks with mixed RePU activations, and we derive upper bounds on the complexity of the function class of these derivatives. We establish error bounds for simultaneously approximating C^s smooth functions and their derivatives using RePU-activated deep neural networks. Furthermore, we derive improved approximation error bounds when the data have approximate low-dimensional support, demonstrating the ability of RePU networks to mitigate the curse of dimensionality. To illustrate the usefulness of our results, we consider a deep score matching estimator (DSME) and propose a penalized deep isotonic regression (PDIR) method using RePU networks. We establish non-asymptotic excess risk bounds for DSME and PDIR under the assumption that the target functions belong to a class of C^s smooth functions. We also show that PDIR is robust in the sense that it remains consistent with vanishing penalty parameters even when the monotonicity assumption does not hold. Finally, if the data distribution is supported on an approximate low-dimensional manifold, we show that DSME and PDIR can mitigate the curse of dimensionality.
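The differentiability result rests on an elementary fact about the activation itself: the RePU of power p, σ_p(x) = max(0, x)^p, has derivative p·max(0, x)^(p-1) = p·σ_(p-1)(x), a scaled RePU of one power lower. So for p ≥ 2 the activation is (p-1)-times continuously differentiable, and differentiating a RePU network yields a network built from mixed RePU activations. A minimal numpy sketch of this identity (function names are illustrative, not code from the paper):

```python
import numpy as np

def repu(x, p):
    """RePU activation: sigma_p(x) = max(0, x)^p."""
    return np.maximum(x, 0.0) ** p

def repu_grad(x, p):
    """Derivative of RePU: p * max(0, x)^(p-1),
    i.e. p times a RePU of one power lower."""
    return p * np.maximum(x, 0.0) ** (p - 1)

# For p >= 2 the derivative is continuous, so RePU networks are
# C^{p-1} smooth, and their gradients can be written with
# lower-power RePU units (the mixed-activation representation).
x = np.linspace(-2.0, 2.0, 9)
p = 3  # rectified cubic unit; p = 2 gives ReQU
h = 1e-6
numeric = (repu(x + h, p) - repu(x - h, p)) / (2 * h)
assert np.allclose(numeric, repu_grad(x, p), atol=1e-5)
print(repu_grad(x, p))
```

For p = 2 (ReQU) the derivative is 2·σ_1, i.e. twice a ReLU. This closure of RePU networks under differentiation is what makes derivative-dependent objectives, such as the score matching loss behind DSME, amenable to the analysis in the paper.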


