Dual Iterative Hard Thresholding: From Non-convex Sparse Minimization to Non-smooth Concave Maximization

03/01/2017
by Bo Liu, et al.

Iterative Hard Thresholding (IHT) is a class of projected gradient descent methods for sparsity-constrained minimization, offering the best known efficiency and scalability in practice. To the best of our knowledge, existing IHT-style methods are designed for sparse minimization in its primal form; it remains open to explore duality theory and algorithms for this non-convex, NP-hard problem setting. In this paper, we bridge that gap by establishing a duality theory for sparsity-constrained minimization with an ℓ_2-regularized loss function and by proposing an IHT-style algorithm for dual maximization. Our sparse duality theory provides a set of necessary and sufficient conditions under which the original NP-hard, non-convex problem can be equivalently solved in a dual formulation. The proposed dual IHT algorithm is a super-gradient method for maximizing the non-smooth dual objective. An interesting finding is that the sparse recovery performance of dual IHT is invariant to the Restricted Isometry Property (RIP), which virtually all existing primal IHT algorithms require unless sparsity is relaxed. We also propose a stochastic variant of dual IHT for large-scale stochastic optimization. Numerical results demonstrate that dual IHT algorithms outperform state-of-the-art primal IHT-style algorithms in both model estimation accuracy and computational efficiency.
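For readers unfamiliar with the primal baseline the paper improves on, the sketch below shows a standard IHT iteration on an ℓ_2-regularized least-squares loss: a gradient step followed by hard thresholding, i.e. projection onto the ℓ_0 ball. This is a minimal illustrative sketch of generic primal IHT, not the paper's dual algorithm; the function names, step-size choice, and iteration count are our own assumptions.

```python
import numpy as np

def hard_threshold(w, k):
    # Keep the k largest-magnitude entries of w; zero out the rest.
    out = np.zeros_like(w)
    keep = np.argsort(np.abs(w))[-k:]
    out[keep] = w[keep]
    return out

def primal_iht(A, y, k, lam=0.1, n_iters=200):
    # Illustrative primal IHT for
    #   min_w 0.5*||A w - y||^2 + (lam/2)*||w||^2  s.t.  ||w||_0 <= k.
    # Step size 1/L, with L the Lipschitz constant of the gradient
    # (an assumption; the paper's guarantees concern the dual problem).
    L = np.linalg.norm(A, 2) ** 2 + lam
    w = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ w - y) + lam * w    # gradient of the smooth loss
        w = hard_threshold(w - grad / L, k)   # projected gradient step
    return w

# Toy usage: recover a 5-sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
w_hat = primal_iht(A, y=A @ w_true, k=5, lam=1e-3)
```

The paper's dual IHT instead takes super-gradient ascent steps on a non-smooth dual objective; its RIP-free recovery guarantee is the key contrast with primal iterations like the one above, whose analyses typically rest on RIP-type conditions.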


