A Homotopic Method to Solve the Lasso Problems with an Improved Upper Bound of Convergence Rate

10/26/2020
by Yujie Zhao, et al.

In optimization, it is well known that when the objective function is strictly convex and well-conditioned, gradient-based approaches can be extremely effective, e.g., achieving an exponential rate of convergence. On the other hand, existing algorithms for Lasso-type estimators in general cannot achieve the optimal rate, due to the undesirable behavior of the absolute-value function at the origin. A homotopic method uses a sequence of surrogate functions to approximate the ℓ_1 penalty that appears in Lasso-type estimators. The surrogate functions converge to the ℓ_1 penalty of the Lasso estimator, and each surrogate function is strictly convex, which enables a provably faster numerical rate of convergence. In this paper, we demonstrate that by meticulously defining the surrogate functions, one can prove a faster numerical convergence rate than any existing method for computing Lasso-type estimators. Specifically, the state-of-the-art algorithms can only guarantee O(1/ϵ) or O(1/√(ϵ)) convergence rates, while we prove an O([log(1/ϵ)]^2) rate for the newly proposed algorithm. Our numerical simulations show that the new algorithm also performs better empirically.
