
Mean Estimation with Sub-Gaussian Rates in Polynomial Time
We study polynomial time algorithms for estimating the mean of a heavy-t...

A spectral algorithm for robust regression with subgaussian rates
We study a new linear up to quadratic time algorithm for linear regressi...

Sub-Gaussian Mean Estimation in Polynomial Time
We study polynomial time algorithms for estimating the mean of a random ...

A Fast Spectral Algorithm for Mean Estimation with Sub-Gaussian Rates
We study the algorithmic problem of estimating the mean of heavy-tailed ...

How Hard Is Robust Mean Estimation?
Robust mean estimation is the problem of estimating the mean μ∈R^d of a ...

Complete Classification of Generalized Santha-Vazirani Sources
Let F be a finite alphabet and D be a finite set of distributions over F...

Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
We focus on the high-dimensional linear regression problem, where the al...
Algorithms for Heavy-Tailed Statistics: Regression, Covariance Estimation, and Beyond
We study efficient algorithms for linear regression and covariance estimation in the absence of Gaussian assumptions on the underlying distributions of samples, making assumptions instead about only finitely many moments. We focus on how many samples are needed to do estimation and regression with high accuracy and exponentially good success probability. For covariance estimation, linear regression, and several other problems, estimators have recently been constructed with sample complexities and rates of error matching what is possible when the underlying distribution is Gaussian, but algorithms for these estimators require exponential time. We narrow the gap between the Gaussian and heavy-tailed settings for polynomial-time estimators with:

1. A polynomial-time estimator which takes n samples from a random vector X ∈ R^d with covariance Σ and produces Σ̂ such that in spectral norm ‖Σ̂ − Σ‖_2 ≤ Õ(d^{3/4}/√n) w.p. 1 − 2^{−d}. The information-theoretically optimal error bound is Õ(√(d/n)); previous approaches to polynomial-time algorithms were stuck at Õ(d/√n).

2. A polynomial-time algorithm which takes n samples (X_i, Y_i) where Y_i = 〈u, X_i〉 + ε_i and produces û such that the loss ‖u − û‖^2 ≤ O(d/n) w.p. 1 − 2^{−d} for any n ≥ d^{3/2} log(d)^{O(1)}. This (information-theoretically optimal) error is achieved by inefficient algorithms for any n ≫ d; previous polynomial-time algorithms suffer loss Ω(d^2/n) and require n ≫ d^2.

Our algorithms use degree-8 sum-of-squares semidefinite programs. We offer preliminary evidence that improving these rates of error in polynomial time is not possible in the median-of-means framework our algorithms employ.
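The median-of-means framework the abstract refers to can be sketched in one dimension: split the samples into buckets, average each bucket, and take the median of the bucket averages, which tolerates a few corrupted or outlying buckets. The sketch below is a toy illustration of that bucketing idea only (the function name `median_of_means` and the bucket count are illustrative choices), not the paper's degree-8 sum-of-squares estimator.

```python
import numpy as np

def median_of_means(samples, k):
    """Toy 1-D median-of-means mean estimate.

    Split the samples into k roughly equal buckets (in order, assuming
    the samples are i.i.d.), average each bucket, and return the median
    of the bucket means. On heavy-tailed data this is far less sensitive
    to extreme samples than the plain empirical mean.
    """
    buckets = np.array_split(np.asarray(samples, dtype=float), k)
    bucket_means = np.array([b.mean() for b in buckets])
    return float(np.median(bucket_means))

# Example on heavy-tailed data: Student-t with 3 degrees of freedom
# (finite variance, infinite higher moments), shifted to have mean 5.
rng = np.random.default_rng(0)
samples = rng.standard_t(df=3, size=20000) + 5.0
estimate = median_of_means(samples, k=100)
```

A single extreme sample can move the empirical mean arbitrarily far, but it can spoil at most one bucket here, and the median over 100 bucket means ignores it.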