Linearization Algorithms for Fully Composite Optimization

02/24/2023
by Maria-Luiza Vladarean, et al.

In this paper, we study first-order algorithms for solving fully composite optimization problems over bounded sets. We treat the differentiable and non-differentiable parts of the objective separately, linearizing only the smooth components. This yields new generalizations of the classical and accelerated Frank-Wolfe methods that are applicable to non-differentiable problems whenever the structure of the objective is accessible. We prove global complexity bounds for our algorithms that are optimal in several settings.
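To make the idea of "linearizing only the smooth components" concrete, here is a minimal sketch of a generalized Frank-Wolfe step: the smooth term f is replaced by its linearization at the current iterate, while the non-smooth term g enters the subproblem exactly. This is only an illustration of the general principle, not the paper's exact algorithms or problem template; the box constraint, the least-squares f, the l1 term g, the step-size rule, and all function names below are assumptions made for the example.

```python
# Sketch (assumed illustration): generalized Frank-Wolfe for min_{x in C} f(x) + g(x),
# with f smooth (linearized) and g non-smooth (kept exact in the subproblem).
import numpy as np


def generalized_frank_wolfe(grad_f, lmo, x0, num_iters=200):
    """At step k, solve min_{s in C} <grad_f(x_k), s> + g(s) and move toward s."""
    x = x0.copy()
    for k in range(num_iters):
        s = lmo(grad_f(x))            # subproblem sees the true g, not a linearization
        gamma = 2.0 / (k + 2.0)       # standard open-loop Frank-Wolfe step size
        x = x + gamma * (s - x)
    return x


# --- Toy instance (all choices below are assumptions for illustration) ------
rng = np.random.default_rng(0)
m, n, lam = 30, 20, 0.5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def f(x):                             # smooth part: least squares
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def grad_f(x):
    return A.T @ (A @ x - b)

def g(x):                             # non-smooth part, never linearized
    return lam * np.abs(x).sum()

def lmo(grad):
    """Exact solution of min_{s in [-1,1]^n} <grad, s> + lam * ||s||_1.
    The problem separates per coordinate: s_i = -sign(grad_i) if |grad_i| > lam,
    and s_i = 0 otherwise."""
    s = np.zeros_like(grad)
    active = np.abs(grad) > lam
    s[active] = -np.sign(grad[active])
    return s

x_hat = generalized_frank_wolfe(grad_f, lmo, np.zeros(n))
print("f + g at the final iterate:", f(x_hat) + g(x_hat))
```

Because g appears exactly in the subproblem, no smoothing or subgradient of g is needed; when g is identically zero the scheme reduces to the classical Frank-Wolfe method over the bounded set.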

