Projection-free Online Exp-concave Optimization

02/09/2023
by Dan Garber, et al.

We consider the setting of online convex optimization (OCO) with exp-concave losses. The best regret bound known for this setting is O(n log T), where n is the dimension and T is the number of prediction rounds (treating all other quantities as constants and assuming T is sufficiently large), and it is attained by the well-known Online Newton Step algorithm (ONS). However, on each iteration ONS must compute a projection (in a matrix-induced norm) onto the feasible convex set, which is often computationally prohibitive in high-dimensional settings and when the feasible set has non-trivial structure. In this work we consider projection-free online algorithms for exp-concave and smooth losses, where by projection-free we mean algorithms that rely only on the availability of a linear optimization oracle (LOO) for the feasible set, which in many applications of interest admits a much more efficient implementation than a projection oracle. We present an LOO-based ONS-style algorithm which, using O(T) calls to the LOO overall, guarantees worst-case regret bounded by O(n^{2/3} T^{2/3}) (ignoring all quantities except n and T). Our algorithm is most interesting, however, in an important and plausible low-dimensional-data scenario: if the gradients (approximately) span a subspace of dimension at most ρ, with ρ << n, the regret bound improves to O(ρ^{2/3} T^{2/3}), and by applying standard deterministic sketching techniques, both the space and the average additional per-iteration runtime requirements are only O(ρn) (instead of O(n^2)). This improves upon recently proposed LOO-based algorithms for OCO which, while matching the state-of-the-art dependence on the horizon T, suffer from regret or oracle complexity that scales with √n or worse.
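To illustrate the computational gap the abstract refers to, the sketch below (not from the paper; the l1-ball feasible set and all function names are illustrative assumptions) contrasts a linear optimization oracle, which for the l1 ball runs in O(n) time, with the unprojected ONS direction A_t^{-1} g_t, after which classical ONS would still need an expensive projection in the norm induced by A_t:

```python
import numpy as np

def linear_opt_oracle_l1(g, radius=1.0):
    """LOO for the l1 ball: argmin over ||x||_1 <= radius of <g, x>.
    The minimizer is a signed vertex of the ball, found in O(n) time --
    no projection is ever computed."""
    i = np.argmax(np.abs(g))
    x = np.zeros_like(g)
    x[i] = -radius * np.sign(g[i])
    return x

def ons_direction(past_grads, g, eps=1.0):
    """Unprojected Online Newton Step direction A_t^{-1} g_t, where
    A_t = eps*I + sum of past gradient outer products.
    Classical ONS must then project the updated iterate onto the feasible
    set in the A_t-induced norm -- the costly step that LOO-based
    (projection-free) methods avoid."""
    n = g.shape[0]
    A = eps * np.eye(n)
    for gi in past_grads:
        A += np.outer(gi, gi)
    return np.linalg.solve(A, g)

# Usage: the LOO picks the vertex most aligned against the gradient.
v = linear_opt_oracle_l1(np.array([3.0, -1.0]))
print(v)  # vertex of the unit l1 ball minimizing <g, x>
```

The LOO call touches each coordinate once, whereas the matrix A_t is n x n; the paper's sketching argument is what brings the latter cost down from O(n^2) to O(ρn) when gradients lie near a ρ-dimensional subspace.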

