Between hard and soft thresholding: optimal iterative thresholding algorithms

04/24/2018
by Haoyang Liu et al.

Iterative thresholding algorithms seek to optimize a differentiable objective function over a sparsity or rank constraint by alternating between gradient steps that reduce the objective and thresholding steps that enforce the constraint. This work examines the choice of the thresholding operator and asks whether it is possible to achieve stronger guarantees than what is possible with hard thresholding. We develop the notion of relative concavity of a thresholding operator, a quantity that characterizes the convergence performance of any thresholding operator on the target optimization problem. Surprisingly, we find that commonly used thresholding operators, such as hard thresholding and soft thresholding, are suboptimal in terms of convergence guarantees. Instead, a general class of thresholding operators, lying between hard thresholding and soft thresholding, is shown to be optimal with the strongest possible convergence guarantee among all thresholding operators. Examples of this general class include ℓ_q thresholding with appropriate choices of q, and a newly defined reciprocal thresholding operator. As a byproduct of the improved convergence guarantee, these new thresholding operators improve on the best known upper bound for prediction error of both iterative hard thresholding and Lasso in terms of the dependence on condition number in the setting of sparse linear regression.
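To make the alternating scheme concrete, here is a minimal sketch of iterative thresholding for sparse least-squares regression, using hard thresholding as the operator. This is an illustration of the generic gradient-step/threshold-step template the abstract describes, not the paper's proposed operators; swapping `hard_threshold` for soft, ℓ_q, or reciprocal thresholding changes only the projection step. The function names, step-size choice, and iteration count are assumptions for the sketch.

```python
import numpy as np

def hard_threshold(v, s):
    # Keep the s largest-magnitude entries of v, zero out the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def iterative_thresholding(X, y, s, step=None, n_iter=200):
    # Alternate a gradient step on the least-squares objective
    # with a thresholding step enforcing the sparsity constraint.
    n, p = X.shape
    if step is None:
        # 1 / ||X||_2^2 is a safe step size for the quadratic loss.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)          # gradient of (1/2)||y - X beta||^2
        beta = hard_threshold(beta - step * grad, s)
    return beta
```

On a well-conditioned noiseless problem, this recovers the true sparse coefficients; the paper's point is that the contraction rate of such iterations depends on the relative concavity of the thresholding operator used in the projection step.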


Related research

06/25/2020
Sparse Convex Optimization via Adaptively Regularized Hard Thresholding
The goal of Sparse Convex Optimization is to optimize a convex function ...

09/12/2023
Optimization Guarantees of Unfolded ISTA and ADMM Networks With Smooth Soft-Thresholding
Solving linear inverse problems plays a crucial role in numerous applica...

04/11/2022
Iterative Hard Thresholding with Adaptive Regularization: Sparser Solutions Without Sacrificing Runtime
We propose a simple modification to the iterative hard thresholding (IHT...

04/20/2021
Bridging between soft and hard thresholding by scaling
In this article, we developed and analyzed a thresholding method in whic...

11/21/2021
A Pseudo-Inverse for Nonlinear Operators
The Moore-Penrose inverse is widely used in physics, statistics and vari...

04/11/2012
Least Absolute Gradient Selector: Statistical Regression via Pseudo-Hard Thresholding
Variable selection in linear models plays a pivotal role in modern stati...

08/08/2020
Representation Learning via Cauchy Convolutional Sparse Coding
In representation learning, Convolutional Sparse Coding (CSC) enables un...
