MIST: L0 Sparse Linear Regression with Momentum

09/25/2014
by   Goran Marjanovic, et al.

Significant attention has been given to minimizing a penalized least squares criterion for estimating sparse solutions to large linear systems of equations. The penalty is responsible for inducing sparsity, and the natural choice is the so-called l_0 norm. In this paper we develop a Momentumized Iterative Shrinkage Thresholding (MIST) algorithm for minimizing the resulting non-convex criterion and prove its convergence to a local minimizer. Simulations on large data sets show that the proposed method outperforms competing methods.
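The abstract describes an iterative shrinkage-thresholding scheme for the l_0-penalized least squares criterion, accelerated with a momentum term. Below is a minimal NumPy sketch of that general idea: a hard-thresholding proximal step (the proximal operator of the l_0 penalty) combined with a heavy-ball style extrapolation. The function name momentum_ista_l0, the fixed momentum weight beta, and the constant step size are illustrative assumptions, not the exact MIST update analyzed in the paper.

import numpy as np

def hard_threshold(z, lam, step):
    # Proximal operator of step * lam * ||x||_0: keep entries whose
    # magnitude exceeds sqrt(2 * step * lam), set the rest to zero.
    t = np.sqrt(2.0 * step * lam)
    return np.where(np.abs(z) > t, z, 0.0)

def momentum_ista_l0(A, y, lam, n_iter=500, beta=0.9):
    # Illustrative momentum-accelerated hard-thresholding iterations for
    #   min_x 0.5 * ||A x - y||^2 + lam * ||x||_0
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(n)
    x_prev = np.zeros(n)
    for _ in range(n_iter):
        v = x + beta * (x - x_prev)          # momentum (extrapolation) step
        grad = A.T @ (A @ v - y)             # gradient of the least squares term at v
        x_prev, x = x, hard_threshold(v - step * grad, lam, step)
    return x

# Toy usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[rng.choice(400, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = momentum_ista_l0(A, y, lam=0.05)
print("nonzeros recovered:", np.count_nonzero(x_hat))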
