Spectral Projected Subgradient Method for Nonsmooth Convex Optimization Problems

03/23/2022
by   Natasa Krejic, et al.

We consider constrained optimization problems with a nonsmooth objective function given in the form of a mathematical expectation. The Sample Average Approximation (SAA) approach is used to estimate the objective function, and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with a spectral coefficient in order to provide a suitable direction, which improves the performance of the first-order method, as shown by the numerical results. The step sizes are chosen from a predefined interval, and almost sure convergence of the method is proved under standard assumptions in the stochastic setting. To further enhance the performance of the proposed algorithm, we specify the choice of step size by introducing an Armijo-like procedure adapted to this framework. Taking the computational cost on machine learning problems into account, we conclude that the line search improves the performance significantly. Numerical experiments conducted on finite-sum problems also reveal that the variable sample size strategy outperforms the full-sample approach.
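Below is a minimal Python sketch of the kind of iteration the abstract describes, under stated assumptions: the finite-sum objective (a least-absolute-deviations problem), the box constraints, the safeguard interval [sigma_min, sigma_max], the Armijo parameters eta and beta, and the sample-size growth rule are illustrative placeholders, not details taken from the paper. The spectral coefficient is computed with a standard Barzilai-Borwein-type formula safeguarded to a fixed interval, the direction is a projected SAA subgradient step, and the sample size grows across iterations.

```python
import numpy as np

# Hedged sketch, not the paper's algorithm: SAA subgradient + spectral
# (Barzilai-Borwein-type) coefficient + projection + Armijo-like backtracking,
# with a growing sample size. Problem data and parameters are illustrative.
rng = np.random.default_rng(0)
N, n = 500, 20
A, b = rng.standard_normal((N, n)), rng.standard_normal(N)
lo, hi = -1.0, 1.0                      # assumed box constraints

def f_sample(x, idx):                   # SAA estimate of f(x) = E|a^T x - b|
    return np.mean(np.abs(A[idx] @ x - b[idx]))

def subgrad_sample(x, idx):             # SAA subgradient of the sampled objective
    return A[idx].T @ np.sign(A[idx] @ x - b[idx]) / len(idx)

def project(x):                         # projection onto the box [lo, hi]^n
    return np.clip(x, lo, hi)

sigma_min, sigma_max = 1e-4, 1e4        # safeguard interval (placeholder values)
eta, beta = 1e-4, 0.5                   # Armijo-like parameters (placeholders)
x, x_old, g_old = project(rng.standard_normal(n)), None, None
sample_size = 50                        # variable sample size: grows each iteration

for k in range(100):
    idx = rng.choice(N, size=min(sample_size, N), replace=False)
    g = subgrad_sample(x, idx)

    # spectral coefficient, safeguarded to [sigma_min, sigma_max]
    if x_old is not None:
        s, y = x - x_old, g - g_old
        sigma = s @ s / (s @ y) if s @ y > 0 else 1.0
        sigma = min(max(sigma, sigma_min), sigma_max)
    else:
        sigma = 1.0

    d = project(x - sigma * g) - x      # projected spectral subgradient direction

    # Armijo-like backtracking on the sampled objective
    t, fx = 1.0, f_sample(x, idx)
    while f_sample(x + t * d, idx) > fx + eta * t * (g @ d) and t > 1e-8:
        t *= beta

    x_old, g_old = x, g
    x = x + t * d
    sample_size = int(1.1 * sample_size) + 1   # increase the sample size

print("final full-sample objective:", f_sample(x, np.arange(N)))
```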
