Projection-Free Online Optimization with Stochastic Gradient: From Convexity to Submodularity

02/22/2018
by Lin Chen, et al.

Online optimization has been a successful framework for solving large-scale problems under computational constraints and partial information. Current methods for online convex optimization require either a projection or an exact gradient computation at each step, both of which can be prohibitively expensive for large-scale applications. At the same time, there is a growing trend toward non-convex optimization in the machine learning community and a corresponding need for online methods. Continuous submodular functions, which exhibit a natural diminishing-returns condition, have recently been proposed as a broad class of non-convex functions that can be efficiently optimized. Although online methods for them have been introduced, they suffer from the same computational bottlenecks. In this work, we propose Meta-Frank-Wolfe, the first online projection-free algorithm that uses stochastic gradient estimates. The algorithm relies on a careful sampling of gradients in each round and achieves the optimal O(√T) adversarial regret bound for both convex and continuous submodular optimization. We also propose One-Shot Frank-Wolfe, a simpler algorithm that requires only a single stochastic gradient estimate per round and achieves an O(T^(2/3)) stochastic regret bound for both convex and continuous submodular optimization. We apply our methods to develop a novel "lifting" framework for online discrete submodular maximization, and show that they outperform current state-of-the-art techniques in an extensive set of experiments.
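To make the one-shot idea concrete, here is a minimal Python sketch of a projection-free online round that uses a single stochastic gradient per step. This is an illustration, not the paper's exact algorithm: the probability-simplex constraint set, the t^(-2/3) step sizes, and the names linear_oracle and one_shot_fw are all assumptions chosen for the example. The step sizes are illustrative choices consistent with an O(T^(2/3)) stochastic regret rate.

```python
import numpy as np

def linear_oracle(d):
    """Linear optimization oracle over the probability simplex (an assumed,
    illustrative constraint set): argmin over v in K of <d, v> is attained
    at the vertex with the smallest coordinate of d."""
    v = np.zeros_like(d)
    v[np.argmin(d)] = 1.0
    return v

def one_shot_fw(stoch_grad, T, dim):
    """Hypothetical sketch of a one-shot projection-free online update:
    each round folds a single stochastic gradient into a running average
    (variance reduction), then moves the iterate toward the linear-oracle
    solution instead of projecting."""
    x = np.full(dim, 1.0 / dim)  # start at the simplex center
    d = np.zeros(dim)            # running average of stochastic gradients
    iterates = []
    for t in range(1, T + 1):
        iterates.append(x.copy())
        g = stoch_grad(x)                 # one stochastic gradient per round
        rho = t ** (-2.0 / 3.0)           # averaging weight (assumed schedule)
        eta = t ** (-2.0 / 3.0)           # step size (assumed schedule)
        d = (1.0 - rho) * d + rho * g     # variance-reduced gradient estimate
        v = linear_oracle(d)              # projection-free descent direction
        x = (1.0 - eta) * x + eta * v     # convex combination stays feasible
    return iterates

# Toy usage: noisy gradients of f(x) = ||x - x_star||^2 on the simplex.
rng = np.random.default_rng(0)
x_star = np.array([0.7, 0.2, 0.1])
noisy_grad = lambda x: 2.0 * (x - x_star) + 0.1 * rng.standard_normal(3)
xs = one_shot_fw(noisy_grad, T=2000, dim=3)
print(xs[-1])  # the averaged iterates should approach x_star
```

The design point the sketch captures is that each round costs one stochastic gradient plus one call to a linear optimization oracle, with no projection onto the constraint set; the gradient averaging is what tames the noise of using a single sample per round.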

Related research

- Online Continuous Submodular Maximization (02/16/2018): In this paper, we consider an online optimization process, where the obj...
- Online Learning for Non-monotone Submodular Maximization: From Full Information to Bandit Feedback (08/16/2022): In this paper, we revisit the online non-monotone continuous DR-submodul...
- Online Submodular Maximization via Online Convex Optimization (09/08/2023): We study monotone submodular maximization under general matroid constrai...
- Online Dynamic Submodular Optimization (06/19/2023): We propose new algorithms with provable performance for online binary op...
- Stochastic Submodular Maximization: The Case of Coverage Functions (11/05/2017): Stochastic optimization of continuous objectives is at the heart of mode...
- Efficient Projection-Free Online Methods with Stochastic Recursive Gradient (10/21/2019): This paper focuses on projection-free methods for solving smooth Online ...
- Stochastic Recursive Gradient-Based Methods for Projection-Free Online Learning (10/21/2019): This paper focuses on projection-free methods for solving smooth Online ...
