Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity

08/07/2022
by Sally Dong et al.

Many fundamental problems in machine learning can be formulated as the convex program min_{θ ∈ ℝ^d} ∑_{i=1}^n f_i(θ), where each f_i is a convex, Lipschitz function supported on a subset of d_i coordinates of θ. One common approach to this problem, exemplified by stochastic gradient descent, involves sampling one f_i term at every iteration to make progress. This approach crucially relies on a notion of uniformity across the f_i's, formally captured by their condition number. In this work, we give an algorithm that minimizes the above convex formulation to ϵ-accuracy in O(∑_{i=1}^n d_i log(1/ϵ)) gradient computations, with no assumptions on the condition number. The previous best algorithm independent of the condition number is the standard cutting-plane method, which requires O(nd log(1/ϵ)) gradient computations. As a corollary, we improve upon the evaluation oracle complexity for decomposable submodular minimization by Axiotis et al. (ICML 2021). Our main technical contribution is an adaptive procedure that selects an f_i term at every iteration via a novel combination of cutting-plane and interior-point methods.
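To make the setup concrete, here is a minimal Python sketch of the decomposable formulation and of the SGD-style baseline the abstract contrasts against: each f_i is convex, Lipschitz, and supported on a small block of d_i coordinates, and one term is sampled per iteration. This is not the paper's cutting-plane/interior-point algorithm, only the baseline; the ℓ1 losses, step sizes, and all names below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

d, n = 50, 20                                # ambient dimension, number of terms
# S_i: the d_i coordinates that f_i touches (hypothetical random blocks)
supports = [rng.choice(d, size=int(rng.integers(2, 6)), replace=False)
            for _ in range(n)]
targets = [rng.normal(size=len(S)) for S in supports]

def f(theta, i):
    # f_i(theta) = ||theta[S_i] - b_i||_1: convex, Lipschitz, non-smooth
    return np.abs(theta[supports[i]] - targets[i]).sum()

def subgrad(theta, i):
    # a subgradient of f_i, zero outside its support S_i
    g = np.zeros(d)
    g[supports[i]] = np.sign(theta[supports[i]] - targets[i])
    return g

theta = np.zeros(d)
for t in range(1, 5001):
    i = int(rng.integers(n))                 # sample one f_i term per iteration
    # stepping along a single term's subgradient minimizes (1/n) * sum_i f_i,
    # which has the same minimizers as the full sum
    theta -= subgrad(theta, i) / np.sqrt(t)

print("final objective:", sum(f(theta, i) for i in range(n)))

The convergence of this baseline degrades when the f_i's are very non-uniform (large condition number); the paper's contribution is an adaptive rule for choosing which f_i to query that removes that dependence entirely.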


Related research:

04/07/2023 · Convex Minimization with Integer Minima in O(n^4) Time
Given a convex function f on ℝ^n with an integer minimizer, we show how ...

06/16/2023 · Memory-Constrained Algorithms for Convex Optimization via Recursive Cutting-Planes
We propose a family of recursive cutting-plane algorithms to solve feasi...

05/12/2020 · Gradient-Free Methods for Saddle-Point Problem
In the paper, we generalize the approach of Gasnikov et al., 2017, which al...

12/24/2015 · The Lovász Hinge: A Novel Convex Surrogate for Submodular Losses
Learning with non-modular losses is an important problem when sets of pr...

05/20/2014 · Convex Optimization: Algorithms and Complexity
This monograph presents the main complexity theorems in convex optimizat...

03/29/2022 · Efficient Convex Optimization Requires Superlinear Memory
We show that any memory-constrained, first-order algorithm which minimiz...

01/30/2023 · Polynomial Preconditioning for Gradient Methods
We study first-order methods with preconditioning for solving structured...
