
Deep Quantile Regression: Mitigating the Curse of Dimensionality Through Composition

by Guohao Shen et al.

This paper considers the problem of nonparametric quantile regression under the assumption that the target conditional quantile function is a composition of a sequence of low-dimensional functions. We study the nonparametric quantile regression estimator that uses deep neural networks to approximate the target conditional quantile function. For convenience, we refer to such an estimator as a deep quantile regression (DQR) estimator. We show that the DQR estimator achieves the nonparametric optimal convergence rate up to a logarithmic factor, where the rate is determined by the intrinsic dimension of the underlying compositional structure of the conditional quantile function, not by the ambient dimension of the predictor. Therefore, DQR is able to mitigate the curse of dimensionality under the assumption that the conditional quantile function has a compositional structure. To establish these results, we analyze the approximation error of a composite function by neural networks and show that the error rate depends only on the dimensions of the component functions. We apply our general results to several important statistical models often used to mitigate the curse of dimensionality, including the single index, additive, projection pursuit, univariate composite, and generalized hierarchical interaction models. We explicitly describe the prefactors in the error bounds in terms of the dimensionality of the data and show that the prefactors depend on the dimensionality linearly or quadratically in these models. We also conduct extensive numerical experiments to evaluate the effectiveness of DQR and demonstrate that it outperforms a kernel-based method for nonparametric quantile regression.
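The estimator described above is obtained by minimizing the empirical quantile (check) loss over a class of deep neural networks. As a minimal, hedged illustration of that loss, the numpy sketch below fits a constant (the simplest "network") to data by subgradient descent on the check loss; the minimizer recovers the empirical τ-quantile. This is only a toy demonstration of the loss function, not the paper's DQR implementation, and the function names (`pinball_loss`, `fit_constant_quantile`) are illustrative, not from the paper.

```python
import numpy as np

def pinball_loss(u, tau):
    # Check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0}).
    # Minimizing its expectation over a constant yields the tau-quantile.
    return u * (tau - (u < 0).astype(float))

def fit_constant_quantile(y, tau, lr=0.01, steps=5000):
    # Minimize the average pinball loss over a scalar q by subgradient descent.
    q = float(np.mean(y))
    for _ in range(steps):
        u = y - q
        # Subgradient of the mean pinball loss with respect to q.
        grad = -np.mean(tau - (u < 0).astype(float))
        q -= lr * grad
    return q

rng = np.random.default_rng(0)
y = rng.normal(size=2000)
q_hat = fit_constant_quantile(y, tau=0.9)
# q_hat is close to the empirical 0.9-quantile of y
```

In the paper's setting, the constant `q` is replaced by a deep neural network of the predictor, and the same loss is minimized over the network parameters.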



