A semismooth Newton-proximal method of multipliers for ℓ_1-regularized convex quadratic programming

01/25/2022
by Spyridon Pougkakiotis, et al.

In this paper we present a method for the solution of ℓ_1-regularized convex quadratic optimization problems. It is derived by suitably combining a proximal method of multipliers strategy with a semismooth Newton method. The resulting linear systems are solved with a Krylov-subspace method, accelerated by general-purpose preconditioners that are shown to be optimal with respect to the proximal parameters. Practical efficiency is further improved by warm-starting the algorithm with a proximal alternating direction method of multipliers. We show that the method achieves global convergence under feasibility assumptions. Furthermore, under additional standard assumptions, the method can achieve global linear and local superlinear convergence. The effectiveness of the approach is demonstrated numerically on L^1-regularized PDE-constrained optimization problems.
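
As a minimal sketch of the problem class and of the outer proximal method of multipliers (PMM) iteration the abstract refers to, consider the following; the notation (regularization weight τ, penalty β_k, proximal parameter ρ_k) is assumed here for illustration and is not necessarily that of the paper:

\[
  \min_{x \in \mathbb{R}^n} \; c^{\top}x + \tfrac{1}{2}x^{\top}Qx + \tau\|Dx\|_1
  \quad \text{s.t.} \quad Ax = b, \qquad Q \succeq 0,
\]

and, given an iterate (x_k, y_k), one PMM step approximately solves a regularized augmented Lagrangian subproblem and then updates the multipliers:

\[
  x_{k+1} \approx \arg\min_{x} \Big\{ c^{\top}x + \tfrac{1}{2}x^{\top}Qx + \tau\|Dx\|_1
    - y_k^{\top}(Ax - b) + \tfrac{\beta_k}{2}\|Ax - b\|_2^2
    + \tfrac{1}{2\rho_k}\|x - x_k\|_2^2 \Big\},
  \qquad
  y_{k+1} = y_k - \beta_k\,(Ax_{k+1} - b).
\]

Each subproblem is nonsmooth only through the ℓ_1 term, whose optimality conditions are piecewise linear; this is what makes a semismooth Newton method, with preconditioned Krylov-subspace solves for the resulting linear systems, a natural choice of inner solver.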

Related research

07/11/2016 · Proximal Quasi-Newton Methods for Regularized Convex Optimization with Linear and Accelerated Sublinear Convergence Rates
In [19], a general, inexact, efficient proximal quasi-Newton algorithm f...

12/20/2019 · A New Preconditioning Approach for an Interior Point-Proximal Method of Multipliers for Linear and Convex Quadratic Programming
In this paper, we address the efficient numerical solution of linear and...

05/03/2022 · Proximal stabilized Interior Point Methods for quadratic programming and low-frequency-updates preconditioning techniques
In this work, in the context of Linear and Quadratic Programming, we int...

03/09/2018 · Local Kernels that Approximate Bayesian Regularization and Proximal Operators
In this work, we broadly connect kernel-based filtering (e.g. approaches...

04/25/2011 · Optimal impact strategies for asteroid deflection
This paper presents an analysis of optimal impact strategies to deflect ...

07/06/2021 · SGN: Sparse Gauss-Newton for Accelerated Sensitivity Analysis
We present a sparse Gauss-Newton solver for accelerated sensitivity anal...

02/26/2021 · Sparse Approximations with Interior Point Methods
Large-scale optimization problems that seek sparse solutions have become...
