Continuation Path with Linear Convergence Rate

12/09/2021
by Eugene Ndiaye, et al.

Path-following (continuation) algorithms are frequently used in composite optimization problems where a sequence of subproblems, with varying regularization hyperparameters, is solved one after another. Reusing each solution as the initialization of the next subproblem (warm starting) has been observed numerically to improve convergence speed, making continuation a useful heuristic for accelerating optimization algorithms in machine learning. We present a primal-dual analysis of the path-following algorithm and show how to design its hyperparameters, as well as how accurately each subproblem should be solved, to guarantee a linear convergence rate on a target problem. Furthermore, for optimization with a sparsity-inducing penalty, we analyze how the active sets change with respect to the regularization parameter. The latter can then be adaptively calibrated to finely control the number of features selected along the solution path. This yields simple heuristics for calibrating the hyperparameters of active-set approaches, reducing their complexity and improving their execution time.
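As a rough illustration of the warm-start heuristic the abstract describes, here is a minimal sketch in Python for the standard Lasso problem 0.5*||y - X beta||^2 + lam*||beta||_1. The geometric grid, the inner solver (plain ISTA), and the per-subproblem duality-gap tolerance `eps` are illustrative choices of ours, not the calibrated schedules derived in the paper.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_duality_gap(X, y, beta, lam):
    # Primal value minus dual value at a feasible dual point obtained by
    # rescaling the residual so that ||X^T theta||_inf <= lam.
    r = y - X @ beta
    primal = 0.5 * r @ r + lam * np.abs(beta).sum()
    theta = r / max(1.0, np.max(np.abs(X.T @ r)) / lam)
    dual = 0.5 * (y @ y) - 0.5 * np.sum((y - theta) ** 2)
    return primal - dual

def continuation_path(X, y, lam_min, n_grid=10, eps=1e-6, max_iter=10_000):
    """Solve Lasso subproblems on a geometric grid from lam_max down to
    lam_min, warm-starting each subproblem at the previous solution."""
    lam_max = np.max(np.abs(X.T @ y))      # smallest lam for which beta = 0 is optimal
    grid = np.geomspace(lam_max, lam_min, n_grid)
    L = np.linalg.norm(X, ord=2) ** 2      # Lipschitz constant of the smooth part
    beta = np.zeros(X.shape[1])            # exact solution at lam_max
    path = []
    for lam in grid:
        for _ in range(max_iter):          # plain ISTA on the current subproblem
            beta = soft_threshold(beta + X.T @ (y - X @ beta) / L, lam / L)
            if lasso_duality_gap(X, y, beta, lam) <= eps:
                break                      # subproblem solved to tolerance eps
        path.append((lam, beta.copy(), int(np.count_nonzero(beta))))
    return path
```

The third element of each path entry records the active set size (number of nonzero coefficients), which grows as the regularization parameter decreases; this is the quantity the paper proposes to control by adaptively calibrating the grid.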
