Minimax Rates of Estimation for Sparse PCA in High Dimensions

by Vincent Q. Vu, et al.

We study sparse principal components analysis in the high-dimensional setting, where p (the number of variables) can be much larger than n (the number of observations). We prove optimal, non-asymptotic lower and upper bounds on the minimax estimation error for the leading eigenvector when it belongs to an ℓ_q ball for q ∈ [0,1]. Our bounds are sharp in p and n for all q ∈ [0, 1] over a wide class of distributions. The upper bound is obtained by analyzing the performance of ℓ_q-constrained PCA. In particular, our results provide convergence rates for ℓ_1-constrained PCA.
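As a concrete illustration of sparse leading-eigenvector estimation in the p ≫ n regime, the sketch below uses a truncated power iteration (hard-thresholding to the k largest coordinates, a simple surrogate for ℓ_0-constrained PCA — not the ℓ_q-constrained estimator analyzed in the paper) on a spiked covariance model whose leading eigenvector is k-sparse and hence lies in every ℓ_q ball. All names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spiked covariance Sigma = lam * v v^T + I with a k-sparse leading eigenvector v.
p, n, k = 200, 100, 5            # dimension p much larger than usual for n samples
v = np.zeros(p)
v[:k] = 1 / np.sqrt(k)           # k-sparse unit vector (in every l_q ball, q in [0,1])
Sigma = 4.0 * np.outer(v, v) + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = X.T @ X / n                  # sample covariance matrix

def truncated_power(S, k, iters=100):
    """Power iteration with hard thresholding to the k largest-magnitude
    entries -- a simple surrogate for sparsity-constrained PCA."""
    x = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(iters):
        y = S @ x
        idx = np.argsort(np.abs(y))[:-k]   # zero out all but the top-k coordinates
        y[idx] = 0.0
        x = y / np.linalg.norm(y)
    return x

v_sparse = truncated_power(S, k)
v_plain = np.linalg.eigh(S)[1][:, -1]      # ordinary PCA, no sparsity constraint

# Estimation error: sine of the angle between the estimate and the truth.
err = lambda u: np.sqrt(max(0.0, 1 - (u @ v) ** 2))
print(f"sparse PCA error: {err(v_sparse):.3f}, plain PCA error: {err(v_plain):.3f}")
```

In this regime ordinary PCA is known to be inconsistent, while exploiting sparsity recovers the signal — which is the phenomenon the paper's minimax rates quantify.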

