Fast Mean Estimation with Sub-Gaussian Rates

02/06/2019
by Yeshwanth Cherapanamjeri, et al.

We propose an estimator for the mean of a random vector in ℝ^d that can be computed in time O(n^4 + n^2 d) for n i.i.d. samples and that has error bounds matching the sub-Gaussian case. The only assumptions we make about the data distribution are that it has finite mean and covariance; in particular, we make no assumptions about higher-order moments. Like the polynomial-time estimator introduced by Hopkins (2018), which is based on the sum-of-squares hierarchy, our estimator achieves optimal statistical efficiency in this challenging setting, but it has a significantly faster runtime and a simpler analysis.
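
The abstract does not spell out the construction, but estimators in this line of work typically start from means of disjoint sample buckets (the median-of-means framework), and the sub-Gaussian benchmark they target is an error of order sqrt(tr(Σ)/n) + sqrt(‖Σ‖ log(1/δ)/n) with probability 1 − δ. As a rough illustration only, the sketch below implements a much simpler baseline, geometric median-of-means computed with Weiszfeld iterations, not the paper's O(n^4 + n^2 d) estimator; the bucket count, iteration budget, and tolerance are arbitrary assumptions.

```python
# Illustrative baseline only (not the paper's estimator): geometric
# median-of-means. Split the samples into disjoint buckets, average each
# bucket, and return the geometric median of the bucket means.
import numpy as np


def median_of_means(samples: np.ndarray, n_buckets: int = 10,
                    n_iter: int = 100, tol: float = 1e-8) -> np.ndarray:
    """Geometric median of bucket means for an (n, d) sample array."""
    n, _ = samples.shape
    rng = np.random.default_rng(0)
    buckets = np.array_split(samples[rng.permutation(n)], n_buckets)
    means = np.stack([b.mean(axis=0) for b in buckets])  # shape (k, d)

    # Weiszfeld's algorithm: iteratively re-weighted averages converging to
    # the point minimizing the sum of distances to the bucket means.
    x = means.mean(axis=0)
    for _ in range(n_iter):
        dists = np.maximum(np.linalg.norm(means - x, axis=1), 1e-12)
        weights = 1.0 / dists
        x_new = (weights[:, None] * means).sum(axis=0) / weights.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x


if __name__ == "__main__":
    # Heavy-tailed example: Student-t data with 3 degrees of freedom (true mean 0).
    rng = np.random.default_rng(1)
    data = rng.standard_t(df=3, size=(2000, 20))
    print(np.linalg.norm(median_of_means(data)))  # robust estimate
    print(np.linalg.norm(data.mean(axis=0)))      # plain empirical mean
```

This baseline is robust to heavy tails but is known not to attain the full sub-Gaussian rate in high dimensions; closing that gap while keeping the stated runtime is the contribution claimed in the abstract.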

Related research

09/19/2018  Mean Estimation with Sub-Gaussian Rates in Polynomial Time
We study polynomial time algorithms for estimating the mean of a heavy-t...

08/13/2019  A Fast Spectral Algorithm for Mean Estimation with Sub-Gaussian Rates
We study the algorithmic problem of estimating the mean of heavy-tailed ...

11/17/2020  Optimal Sub-Gaussian Mean Estimation in ℝ
We revisit the problem of estimating the mean of a real-valued distribut...

07/12/2020  A spectral algorithm for robust regression with subgaussian rates
We study a new linear up to quadratic time algorithm for linear regressi...

09/19/2018  Sub-Gaussian Mean Estimation in Polynomial Time
We study polynomial time algorithms for estimating the mean of a random ...

05/10/2019  Selectivity Estimation with Deep Likelihood Models
Selectivity estimation has long been grounded in statistical tools for d...

12/23/2019  Algorithms for Heavy-Tailed Statistics: Regression, Covariance Estimation, and Beyond
We study efficient algorithms for linear regression and covariance estim...
