Composite Self-Concordant Minimization

08/13/2013
by Quoc Tran-Dinh, et al.

We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with an easily computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size selection and correction procedures based on the structure of the problem. We describe concrete algorithmic instances of our framework for several interesting applications and demonstrate them numerically on both synthetic and real data.
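
To make the model concrete, here is a minimal numerical sketch of the composite setup min_x f(x) + g(x), with f a self-concordant (here quadratic) smooth term and g = lam * ||x||_1, whose proximal operator is componentwise soft-thresholding. The diagonal metric, the damped step alpha = 1/(1 + delta) with delta the local norm of the search direction, and every name in the code are illustrative assumptions in the spirit of the abstract, not the authors' exact algorithm.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def damped_prox_minimize(Q, b, lam, x0, iters=200, tol=1e-8):
        # Minimize 0.5 x'Qx - b'x + lam * ||x||_1 with a diagonal variable
        # metric H = diag(Q) and a self-concordance-style damped step
        # alpha = 1 / (1 + delta), where delta = ||d||_H is the local norm
        # of the proximal search direction d. Sketch only: the correction
        # procedures mentioned in the abstract are omitted here.
        x = x0.astype(float).copy()
        h = np.diag(Q)                    # positive when Q is positive definite
        for _ in range(iters):
            grad = Q @ x - b
            z = soft_threshold(x - grad / h, lam / h)  # prox step in metric H
            d = z - x                                  # search direction
            delta = np.sqrt(np.sum(h * d * d))         # local norm ||d||_H
            if delta < tol:
                break
            x = x + d / (1.0 + delta)                  # analytic damped step
        return x

    # Toy usage: a small strongly convex quadratic plus an l1 term.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    Q = A.T @ A + np.eye(5)               # positive definite Hessian
    b = rng.standard_normal(5)
    print(damped_prox_minimize(Q, b, lam=0.1, x0=np.zeros(5)))

The analytic damped step is the ingredient that, in the framework described above, takes the place of step rules based on a global Lipschitz constant of the gradient.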

Related research

02/04/2015
Composite convex minimization involving self-concordant-like cost functions
The self-concordant-like property of a smooth convex function is a new a...

01/08/2013
A proximal Newton framework for composite minimization: Graph learning without Cholesky decompositions and matrix inversions
We propose an algorithmic framework for convex minimization problems of ...

09/04/2023
Self-concordant Smoothing for Convex Composite Optimization
We introduce the notion of self-concordant smoothing for minimizing the ...

02/12/2021
Proximal and Federated Random Reshuffling
Random Reshuffling (RR), also known as Stochastic Gradient Descent (SGD)...

05/13/2014
Scalable sparse covariance estimation via self-concordance
We consider the class of convex minimization problems, composed of a sel...

06/20/2017
First Order Methods beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
We focus on nonconvex and nonsmooth minimization problems with a composi...

11/03/2022
Proximal Subgradient Norm Minimization of ISTA and FISTA
For first-order smooth optimization, the research on the acceleration ph...
