Coordinate Descent for MCP/SCAD Penalized Least Squares Converges Linearly

09/18/2021
by   Yuling Jiao, et al.

Recovering sparse signals from observed data is an important topic in signal/image processing, statistics, and machine learning. Nonconvex penalized least squares has attracted a lot of attention since it enjoys nice statistical properties. Computationally, coordinate descent (CD) is a workhorse for minimizing the nonconvex penalized least squares criterion due to its simplicity and scalability. In this work, we prove a linear convergence rate for CD applied to MCP/SCAD penalized least squares problems.
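The algorithm the abstract refers to cycles through coordinates, solving each one-dimensional MCP penalized problem in closed form via firm thresholding. Below is a minimal sketch of this scheme for the MCP penalty, assuming the columns of the design matrix are standardized so that X[:, j].T @ X[:, j] / n = 1 and that gamma > 1; the function names and default parameter values are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator S(z, lam) = sign(z) * max(|z| - lam, 0)."""
    return np.sign(z) * max(abs(z) - lam, 0.0)

def mcp_threshold(z, lam, gamma):
    """Closed-form univariate MCP solution (firm thresholding).

    Valid for standardized columns and gamma > 1: shrinks like the
    lasso near zero but leaves large coefficients unbiased.
    """
    if abs(z) <= gamma * lam:
        return soft_threshold(z, lam) / (1.0 - 1.0 / gamma)
    return z  # beyond gamma*lam the MCP penalty is flat: no shrinkage

def cd_mcp(X, y, lam=0.1, gamma=3.0, n_iter=200):
    """Cyclic coordinate descent for MCP penalized least squares.

    Assumes each column satisfies X[:, j].T @ X[:, j] / n = 1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta  # running residual, updated in place
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual correlation plus current coefficient.
            z = X[:, j] @ r / n + beta[j]
            b_new = mcp_threshold(z, lam, gamma)
            if b_new != beta[j]:
                r -= (b_new - beta[j]) * X[:, j]
                beta[j] = b_new
    return beta
```

On a toy sparse-recovery problem (one strong signal coordinate, low noise), this sketch recovers the support after a few sweeps; the paper's contribution is showing that such sweeps contract at a linear rate.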


Related research

03/04/2022
Improved Pathwise Coordinate Descent for Power Penalties
Pathwise coordinate descent algorithms have been used to compute entire ...

04/08/2020
A novel greedy Gauss-Seidel method for solving large linear least squares problem
We present a novel greedy Gauss-Seidel method for solving large linear l...

11/05/2019
Penalized least squares and sign constraints with modified Newton-Raphson algorithms: application to EEG source imaging
We propose a modified Newton-Raphson (MNR) algorithm to estimate multipl...

12/12/2014
Expanded Alternating Optimization of Nonconvex Functions with Applications to Matrix Factorization and Penalized Regression
We propose a general technique for improving alternating optimization (A...

08/22/2022
Local Geometry of Nonconvex Spike Deconvolution from Low-Pass Measurements
Spike deconvolution is the problem of recovering the point sources from ...

09/25/2014
MIST: L0 Sparse Linear Regression with Momentum
Significant attention has been given to minimizing a penalized least squ...

05/16/2018
On the Convergence of the SINDy Algorithm
One way to understand time-series data is to identify the underlying dyn...
