Stability and Risk Bounds of Iterative Hard Thresholding

03/17/2022
by   Xiao-Tong Yuan, et al.

In this paper, we analyze the generalization performance of the Iterative Hard Thresholding (IHT) algorithm widely used for sparse recovery problems. The parameter estimation and sparsity recovery consistency of IHT have long been known in compressed sensing. From the perspective of statistical learning, another fundamental question is how well the IHT estimate predicts on unseen data. This paper makes progress towards answering this open question by introducing a novel sparse generalization theory for IHT under the notion of algorithmic stability. Our theory reveals that: 1) under natural conditions on the empirical risk function over n samples of dimension p, IHT with sparsity level k enjoys an Õ(n^-1/2√(klog(n)log(p))) rate of convergence in sparse excess risk; 2) a tighter Õ(n^-1/2√(log(n))) bound can be established by imposing an additional iteration stability condition on a hypothetical IHT procedure applied to the population risk; and 3) a fast rate of order Õ(n^-1k(log^3(n)+log(p))) can be derived for strongly convex risk functions under proper strong-signal conditions. The results are instantiated for sparse linear regression and sparse logistic regression models to demonstrate the applicability of our theory. Preliminary numerical evidence is provided to confirm our theoretical predictions.
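The IHT iteration analyzed here alternates a gradient step on the empirical risk with a hard-thresholding projection onto the set of k-sparse vectors. The following is a minimal sketch for the sparse least-squares case; the step-size rule and iteration count are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def hard_threshold(x, k):
    # Keep the k largest-magnitude entries of x; zero out the rest.
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

def iht(X, y, k, step=None, n_iter=200):
    """IHT for min_w (1/2n)||Xw - y||^2 subject to ||w||_0 <= k."""
    n, p = X.shape
    if step is None:
        # Inverse of the smoothness constant ||X||_2^2 / n of the
        # empirical risk (a common, conservative choice).
        step = n / (np.linalg.norm(X, 2) ** 2)
    w = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n   # gradient of the empirical risk
        w = hard_threshold(w - step * grad, k)
    return w
```

On well-conditioned noiseless data, the iterates typically lock onto the true support within a few steps and then converge linearly on the restricted problem.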


