Exploiting the Curvature of Feasible Sets for Faster Projection-Free Online Learning
In this paper, we develop new efficient projection-free algorithms for Online Convex Optimization (OCO). Online Gradient Descent (OGD) is an example of a classical OCO algorithm that guarantees the optimal O(√T) regret bound. However, OGD and other projection-based OCO algorithms need to perform a Euclidean projection onto the feasible set 𝒞⊂ℝ^d whenever their iterates step outside 𝒞. For various feasible sets of interest, this projection step can be computationally costly, especially when the ambient dimension is large. This has motivated the development of projection-free OCO algorithms that swap Euclidean projections for often much cheaper operations such as Linear Optimization (LO). However, state-of-the-art LO-based algorithms only achieve a suboptimal O(T^3/4) regret for general OCO. In this paper, we leverage recent results in parameter-free Online Learning and develop an OCO algorithm that makes two calls to an LO Oracle per round and achieves the near-optimal O(√T) regret whenever the feasible set is strongly convex. We also present an algorithm for general convex sets that makes an expected O(d) calls to an LO Oracle per round and guarantees an O(T^2/3) regret, improving on the previous best O(T^3/4). We achieve the latter by approximating any convex set 𝒞 by a strongly convex one, over which LO can be performed using an expected O(d) calls to an LO Oracle for 𝒞.
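To make the computational trade-off concrete, here is a minimal, hypothetical sketch (not the paper's algorithm) contrasting the Euclidean projection step used by OGD with the LO Oracle call used by projection-free methods such as Frank-Wolfe. The feasible set is the unit ℓ2 ball, chosen only because both operations have closed forms there; the quadratic loss and all function names are illustrative assumptions.

```python
import numpy as np

def project_l2_ball(x):
    """Euclidean projection onto the unit L2 ball: argmin_{||y||<=1} ||y - x||."""
    norm = np.linalg.norm(x)
    return x if norm <= 1.0 else x / norm

def lo_oracle_l2_ball(g):
    """LO Oracle for the unit L2 ball: argmin_{||y||<=1} <g, y> = -g / ||g||."""
    norm = np.linalg.norm(g)
    return -g / norm if norm > 0 else np.zeros_like(g)

rng = np.random.default_rng(0)
d, T = 5, 100
x_ogd = np.zeros(d)  # projection-based iterate
x_fw = np.zeros(d)   # projection-free (Frank-Wolfe-style) iterate

for t in range(1, T + 1):
    target = rng.normal(size=d)  # round-t loss: f_t(x) = 0.5 * ||x - target||^2

    # OGD: take a gradient step, then project back onto the feasible set.
    g = x_ogd - target
    x_ogd = project_l2_ball(x_ogd - g / np.sqrt(t))

    # Projection-free: one LO call returns a feasible point minimizing <g, .>;
    # a convex combination with the current iterate keeps x_fw inside the set.
    g = x_fw - target
    v = lo_oracle_l2_ball(g)
    x_fw = x_fw + (2.0 / (t + 2)) * (v - x_fw)

print("final OGD iterate norm:", np.linalg.norm(x_ogd))
print("final FW  iterate norm:", np.linalg.norm(x_fw))
```

On the ℓ2 ball both steps are trivial, but for sets such as polytopes defined by many constraints, the projection in `project_l2_ball`'s role becomes an expensive quadratic program while the LO step remains a linear program, which is the gap the paper's algorithms exploit.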