The Proximal Method of Multipliers for a Class of Nonsmooth Convex Optimization

01/12/2020
by Tomoya Takeuchi, et al.

This paper develops the proximal method of multipliers for a class of nonsmooth convex optimization problems. The method generates a sequence of minimization problems (subproblems), and we show that the sequence of approximate solutions to these subproblems converges to a saddle point of the Lagrangian even when the original optimization problem possesses multiple solutions. The subproblem involves the augmented Lagrangian due to Fortin. A remarkable property of this augmented Lagrangian, in contrast to the standard Lagrangian, is that it is always differentiable, and often semismoothly differentiable; this allows a nonsmooth Newton method to be employed for computing an approximate solution to the subproblem. The proximal term regularizes the objective function and guarantees the solvability of the Newton system without assuming strong convexity of the objective. We exploit the theory of the nonsmooth Newton method to give a rigorous proof of the global convergence of the proposed algorithm.
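To illustrate the iteration structure the abstract describes, here is a minimal sketch of a proximal method of multipliers on a toy smooth problem, min ½‖x − c‖² subject to aᵀx = b. This is not the paper's algorithm: the parameter names (`rho`, `tau`), the single linear constraint, and the closed-form subproblem solve (the toy subproblem is quadratic, so the augmented-Lagrangian-plus-proximal-term minimization reduces to one linear system rather than a nonsmooth Newton iteration) are all illustrative assumptions.

```python
import numpy as np

def pmm(c, a, b, rho=1.0, tau=1.0, iters=500):
    """Proximal method of multipliers for min 0.5*||x-c||^2 s.t. a^T x = b.

    Each outer step minimizes the augmented Lagrangian plus a proximal
    term (1/(2*tau))*||x - x_k||^2, then updates the multiplier.
    """
    n = len(c)
    x = np.zeros(n)   # primal iterate
    lam = 0.0         # multiplier for the constraint a^T x = b
    # Hessian of the prox-regularized augmented Lagrangian in x:
    # (1 + 1/tau) I + rho * a a^T  (constant here because the toy
    # objective is quadratic).
    M = (1.0 + 1.0 / tau) * np.eye(n) + rho * np.outer(a, a)
    for _ in range(iters):
        # Subproblem: argmin_x  L_rho(x, lam) + (1/(2*tau))*||x - x_k||^2;
        # setting the gradient to zero gives one linear system.
        rhs = c + x / tau - lam * a + rho * b * a
        x = np.linalg.solve(M, rhs)
        # Multiplier update (ascent step on the dual).
        lam += rho * (a @ x - b)
    return x, lam

c = np.array([1.0, 2.0, 3.0])
a = np.ones(3)
x, lam = pmm(c, a, b=1.0)
# x approximates the projection of c onto {x : sum(x) = 1},
# i.e. [-2/3, 1/3, 4/3]
```

The proximal term makes the subproblem's linear system uniformly positive definite regardless of the curvature of the original objective, which mirrors the solvability guarantee the paper obtains without assuming strong convexity.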


