Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators

by Gil Kur, et al.

The least squares estimator (LSE) is shown to be suboptimal in squared error loss in the usual nonparametric regression model with Gaussian errors for d ≥ 5 for each of the following families of functions: (i) convex functions supported on a polytope (in fixed design), (ii) bounded convex functions supported on a polytope (in random design), and (iii) convex Lipschitz functions supported on any convex domain (in random design). For each of these families, the risk of the LSE is proved to be of the order n^-2/d (up to logarithmic factors) while the minimax risk is n^-4/(d+4), for d ≥ 5. In addition, the first rate of convergence results (worst case and adaptive) for the full convex LSE are established for polytopal domains for all d ≥ 1. Some new metric entropy results for convex functions are also proved which are of independent interest.
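To make the gap between the two rates concrete, here is a small illustrative sketch (not from the paper) that evaluates the stated LSE rate n^(-2/d) against the minimax rate n^(-4/(d+4)); logarithmic factors are ignored, and the function names are my own.

```python
# Illustrative sketch of the rates stated in the abstract.
# Logarithmic factors in the LSE rate are ignored.

def lse_rate(n: int, d: int) -> float:
    # Worst-case risk rate of the least squares estimator: n^(-2/d)
    return n ** (-2 / d)

def minimax_rate(n: int, d: int) -> float:
    # Minimax risk rate for the same convex function classes: n^(-4/(d+4))
    return n ** (-4 / (d + 4))

if __name__ == "__main__":
    n = 10**6
    for d in (5, 8, 12):
        # For d >= 5, the exponent 2/d is smaller than 4/(d+4),
        # so the LSE risk decays strictly more slowly than the minimax risk.
        print(d, lse_rate(n, d), minimax_rate(n, d))
```

For d ≥ 5 the exponent 2/d falls below 4/(d+4), so the ratio lse_rate/minimax_rate grows with n; this is exactly the sense in which the abstract calls the LSE suboptimal in these dimensions.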


