
Minimax Rates of Estimation for Sparse PCA in High Dimensions

02/03/2012
by Vincent Q. Vu, et al.

We study sparse principal components analysis in the high-dimensional setting, where p (the number of variables) can be much larger than n (the number of observations). We prove optimal, non-asymptotic lower and upper bounds on the minimax estimation error for the leading eigenvector when it belongs to an ℓ_q ball for q ∈ [0,1]. Our bounds are sharp in p and n for all q ∈ [0, 1] over a wide class of distributions. The upper bound is obtained by analyzing the performance of ℓ_q-constrained PCA. In particular, our results provide convergence rates for ℓ_1-constrained PCA.
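The ℓ_1-constrained PCA estimator analyzed in the abstract maximizes the quadratic form v'Σ̂v over unit vectors v with ℓ_1 norm at most some radius R. The paper studies the exact constrained maximizer; the snippet below is only a minimal illustrative sketch that approximates it by projected power iteration (repeatedly multiplying by the sample covariance, then projecting onto the ℓ_1 ball and renormalizing). The function names, the heuristic itself, and the choice of radius are our own assumptions, not the paper's method.

```python
import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of v onto the l1 ball {x : ||x||_1 <= radius},
    # via the standard sort-and-threshold algorithm.
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]          # |v| sorted in decreasing order
    css = np.cumsum(u)
    # Largest index k with u_k * k > (sum of top k) - radius.
    k = np.nonzero(u * np.arange(1, len(u) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def l1_constrained_pca(S, radius, n_iter=200, seed=0):
    # Heuristic for a sparse leading eigenvector of the sample covariance S:
    # power iteration interleaved with projection onto the l1 ball.
    # This approximates, but is not identical to, the exact l1-constrained
    # maximizer whose rates the paper establishes.
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(S.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = S @ v
        v = project_l1_ball(v, radius)
        nrm = np.linalg.norm(v)
        if nrm == 0.0:
            break
        v /= nrm
    return v

# Usage sketch: a planted s-sparse spike in dimension p.
p, s, n = 30, 5, 500
v_true = np.zeros(p)
v_true[:s] = 1.0 / np.sqrt(s)
Sigma = np.eye(p) + 10.0 * np.outer(v_true, v_true)
rng = np.random.default_rng(1)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = X.T @ X / n
v_hat = l1_constrained_pca(S, radius=np.sqrt(s))
```

Here the radius √s corresponds to the ℓ_1 norm of an s-sparse unit vector with equal-magnitude entries, matching the q = 1 case of the ℓ_q balls in the abstract.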


Related research

11/02/2012 · Minimax sparse principal subspace estimation in high dimensions
  We study sparse principal components analysis in high dimensions, where ...

02/18/2020 · Optimal Structured Principal Subspace Estimation: Metric Entropy and Minimax Rates
  Driven by a wide range of applications, many principal subspace estimati...

06/13/2020 · Optimal Rates for Estimation of Two-Dimensional Totally Positive Distributions
  We study minimax estimation of two-dimensional totally positive distribu...

02/22/2018 · Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification
  We prove that the ordinary least-squares (OLS) estimator attains nearly ...

03/01/2022 · Adaptive nonparametric estimation in the functional linear model with functional output
  In this paper, we consider a functional linear regression model, where b...

10/29/2020 · Convergence of Constrained Anderson Acceleration
  We prove non asymptotic linear convergence rates for the constrained And...

05/30/2019 · Global empirical risk minimizers with "shape constraints" are rate optimal in general dimensions
  Entropy integrals are widely used as a powerful tool to obtain upper bou...