A New Perspective on Robust M-Estimation: Finite Sample Theory and Applications to Dependence-Adjusted Multiple Testing

11/15/2017
by   Wen-Xin Zhou, et al.

Heavy-tailed errors impair the accuracy of the least squares estimator, which can be spoiled by a single grossly outlying observation. As argued in Peter Huber's seminal 1973 work [Ann. Statist. 1 (1973) 799–821], robust alternatives to the method of least squares are sorely needed. To achieve robustness against heavy-tailed sampling distributions, we revisit the Huber estimator from a new perspective by letting its tuning parameter diverge with the sample size. In this paper, we develop nonasymptotic concentration results for such an adaptive Huber estimator, namely, the Huber estimator with the tuning parameter adapted to the sample size, the dimension, and the variance of the noise. Specifically, we obtain a sub-Gaussian-type deviation inequality and a nonasymptotic Bahadur representation when the noise variables have only finite second moments. These nonasymptotic results further yield two conventional normal approximation results of independent interest: a Berry–Esseen inequality and a Cramér-type moderate deviation theorem. As an important application to large-scale simultaneous inference, we use these robust normal approximation results to analyze a dependence-adjusted multiple testing procedure for moderately heavy-tailed data. We show that the robust dependence-adjusted procedure asymptotically controls the overall false discovery proportion at the nominal level under mild moment conditions. Thorough numerical results on both simulated and real datasets are provided to support our theory.
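The abstract's key idea, a Huber estimator whose truncation level diverges with the sample size, can be sketched for the simplest case of location (mean) estimation. The scaling tau ~ sigma * sqrt(n / log n) and the MAD-based scale estimate below are illustrative assumptions for this sketch, not the authors' exact prescription:

```python
import numpy as np

def huber_psi(r, tau):
    """Huber score function: identity on [-tau, tau], clipped outside."""
    return np.clip(r, -tau, tau)

def adaptive_huber_mean(x, max_iter=100, tol=1e-10):
    """Huber M-estimate of location with a tuning parameter that grows
    with the sample size, in the spirit of the adaptive Huber estimator.

    The choice tau = sigma * sqrt(n / log n) is one diverging schedule
    (constants are illustrative); the scale sigma is estimated robustly
    via the median absolute deviation (MAD).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    med = np.median(x)
    sigma = 1.4826 * np.median(np.abs(x - med))  # MAD-based scale estimate
    tau = sigma * np.sqrt(n / np.log(n))         # diverging tuning parameter
    mu = med                                     # robust starting point
    for _ in range(max_iter):
        # Fixed-point step solving  sum_i psi_tau(x_i - mu) = 0.
        step = huber_psi(x - mu, tau).mean()
        mu += step
        if abs(step) < tol:
            break
    return mu
```

Because the score is 1-Lipschitz, the fixed-point iteration contracts whenever a positive fraction of observations falls inside the truncation band, so a handful of iterations typically suffices; a single gross outlier contributes at most tau/n to the estimate.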


research · 11/15/2017
FARM-Test: Factor-Adjusted Robust Multiple Testing with False Discovery Control
Large-scale multiple testing with correlated and heavy-tailed data arise...

research · 11/22/2022
Robust High-dimensional Tuning Free Multiple Testing
A stylized feature of high-dimensional data is that many variables have ...

research · 03/18/2019
Robust Inference via Multiplier Bootstrap
This paper investigates the theoretical underpinnings of two fundamental...

research · 03/14/2023
Robust Multiple Testing under High-dimensional Dynamic Factor Model
Large-scale multiple testing under static factor models is commonly used...

research · 11/05/2018
User-Friendly Covariance Estimation for Heavy-Tailed Distributions: A Survey and Recent Results
We offer a survey of selected recent results on covariance estimation fo...

research · 10/15/2022
On Catoni's M-Estimation
Catoni proposed a robust M-estimator and gave the deviation inequality f...

research · 06/30/2021
Whiteout: when do fixed-X knockoffs fail?
A core strength of knockoff methods is their virtually limitless customi...
