Multivariate extensions of isotonic regression and total variation denoising via entire monotonicity and Hardy-Krause variation

03/04/2019
by Billy Fang, et al.

We consider the problem of nonparametric regression when the covariate is d-dimensional, where d ≥ 1. In this paper we introduce and study two nonparametric least squares estimators (LSEs) in this setting: the entirely monotonic LSE and the constrained Hardy-Krause variation LSE. We show that these two LSEs are natural generalizations of univariate isotonic regression and univariate total variation denoising, respectively, to multiple dimensions. We discuss the characterization and computation of these two LSEs obtained from n data points. We provide a detailed study of their risk properties under the squared error loss and a fixed uniform lattice design. We show that the finite-sample risk of these LSEs is always bounded from above by n^{-2/3}, modulo logarithmic factors depending on d; thus these nonparametric LSEs avoid the curse of dimensionality to some extent. For the constrained Hardy-Krause variation LSE, we also show that logarithmic factors which increase with d are necessary in the risk upper bound, by proving a minimax lower bound. Further, we illustrate that these LSEs are particularly useful for fitting rectangular piecewise constant functions. Specifically, we show that the risk of the entirely monotonic LSE is almost parametric (at most 1/n up to logarithmic factors) when the true function is well approximable by a rectangular piecewise constant entirely monotone function with not too many constant pieces. A similar result is shown to hold for the constrained Hardy-Krause variation LSE for a simple subclass of rectangular piecewise constant functions. We believe that the proposed LSEs yield a novel approach to estimating multivariate functions using convex optimization that avoids the curse of dimensionality to some extent.
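To make the computation concrete, below is a minimal sketch of the entirely monotonic LSE on a two-dimensional uniform lattice, posed as a convex quadratic program in cvxpy. The lattice characterization of entire monotonicity used here (nonnegative first differences along each axis together with nonnegative mixed second differences) and all names (n1, n2, Y, Theta) are illustrative assumptions of this sketch, not code or notation from the paper.

```python
import numpy as np
import cvxpy as cp

# A minimal sketch (not the paper's code): entirely monotone LSE on an
# n1 x n2 uniform lattice design, solved as a convex quadratic program.
n1, n2 = 20, 20
rng = np.random.default_rng(0)

# An entirely monotone "truth" plus Gaussian noise, for illustration only.
truth = np.add.outer(np.linspace(0.0, 1.0, n1), np.linspace(0.0, 1.0, n2))
Y = truth + 0.3 * rng.standard_normal((n1, n2))

# Fitted values at the lattice points.
Theta = cp.Variable((n1, n2))

# Assumed d = 2 lattice characterization of entire monotonicity:
# nonnegative first differences along each axis and nonnegative
# mixed second differences.
constraints = [
    cp.diff(Theta, axis=0) >= 0,
    cp.diff(Theta, axis=1) >= 0,
    cp.diff(cp.diff(Theta, axis=0), axis=1) >= 0,
]

# The LSE minimizes the sum of squared errors over this constraint cone.
cp.Problem(cp.Minimize(cp.sum_squares(Y - Theta)), constraints).solve()
fit = Theta.value
```

A constrained Hardy-Krause variation LSE admits a similar convex formulation; roughly, the sign constraints above are replaced by a bound on sums of absolute mixed differences, a multivariate analogue of bounding the total variation in univariate TV denoising.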


