An Extension of Averaged-Operator-Based Algorithms

06/12/2018
by Miguel Simões et al.

Many of the algorithms used to solve minimization problems with sparsity-inducing regularizers are generic, in the sense that they do not exploit the sparsity of the solution in any particular way. Semismooth Newton methods, however, are able to take advantage of this sparsity to accelerate their convergence. We show how to extend these methods in several directions, and study the convergence of the resulting algorithms by showing that they are a particular case of an extension of the well-known Krasnosel'skiĭ–Mann scheme.
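To make the Krasnosel'skiĭ–Mann scheme concrete, the following is a minimal sketch (not the paper's extended algorithm) of the standard iteration x_{k+1} = (1 − λ) x_k + λ T(x_k), applied to the forward-backward operator T of a lasso-type problem min_x ½‖Ax − b‖² + μ‖x‖₁. This operator is averaged whenever the step size γ < 2/‖AᵀA‖, so the KM iterates converge to one of its fixed points. All problem data and parameter values below are illustrative assumptions.

```python
import numpy as np

# Krasnosel'skii–Mann iteration applied to the forward-backward
# (proximal-gradient) operator of the lasso problem
#     min_x 0.5 * ||A x - b||^2 + mu * ||x||_1.
# T is averaged when gamma < 2 / ||A^T A||, so KM converges.

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 3.0]            # sparse ground truth (assumed)
b = A @ x_true + 0.1 * rng.standard_normal(50)
mu = 0.5                                  # sparsity-inducing weight
gamma = 1.0 / np.linalg.norm(A.T @ A, 2)  # step size, safely below 2/L

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1; this is what induces sparsity."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def T(x):
    """One forward (gradient) step followed by one backward (prox) step."""
    return soft_threshold(x - gamma * A.T @ (A @ x - b), gamma * mu)

x = np.zeros(20)
lam = 0.5                                 # KM relaxation parameter in (0, 1)
for _ in range(2000):
    x = (1 - lam) * x + lam * T(x)        # Krasnosel'skii–Mann update

print("fixed-point residual:", np.linalg.norm(x - T(x)))
print("nonzero coefficients:", np.count_nonzero(x))
```

The fixed points of T are exactly the lasso minimizers, and the computed solution is sparse: most of the 20 coefficients are driven exactly to zero by the soft-thresholding step. Semismooth Newton methods exploit precisely this structure (the identified zero set) to converge faster than the fixed relaxation used here.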


