Dynamic Online Gradient Descent with Improved Query Complexity: A Theoretical Revisit

12/26/2018
by Yawei Zhao, et al.

We provide a new theoretical analysis framework for online gradient descent in dynamic environments. Compared with previous work, the new framework recovers the state-of-the-art dynamic regret without requiring extra gradient queries at every iteration. Specifically, when the functions are α-strongly convex and β-smooth, previous work needs O(κ) gradient queries per iteration, where κ = β/α is the condition number, to achieve the state-of-the-art dynamic regret. Our framework shows that the query complexity can be improved to O(1), independent of κ. The improvement is significant for ill-conditioned problems, whose objective functions typically have a large κ.
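To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of online gradient descent in a dynamic environment with a single gradient query per round. The toy losses f_t(x) = ½‖x − c_t‖² with a drifting minimizer c_t, the step size, and the drift model are all illustrative assumptions; dynamic regret is measured against the per-round minimizers.

```python
import numpy as np

def dynamic_ogd(grad, x0, T, eta):
    """Online gradient descent with one gradient query per round.

    grad(t, x): gradient oracle of the round-t loss at x (called once per round).
    Returns the sequence of iterates x_0, ..., x_{T-1}.
    """
    x = np.asarray(x0, dtype=float)
    iterates = []
    for t in range(T):
        iterates.append(x.copy())
        g = grad(t, x)       # single gradient query this round (O(1) per iteration)
        x = x - eta * g      # standard OGD update
    return iterates

def make_env(T, dim=2, drift=0.01, seed=0):
    """Toy dynamic environment: f_t(x) = 0.5 * ||x - c_t||^2, c_t a random walk."""
    rng = np.random.default_rng(seed)
    centers = np.cumsum(drift * rng.standard_normal((T, dim)), axis=0)
    grad = lambda t, x: x - centers[t]   # gradient of 0.5 * ||x - c_t||^2
    return grad, centers

T = 500
grad, centers = make_env(T)
xs = dynamic_ogd(grad, np.zeros(2), T, eta=0.5)

# Dynamic regret: cumulative loss against the per-round minimizers c_t.
regret = sum(0.5 * np.sum((x - c) ** 2) for x, c in zip(xs, centers))
```

Because the per-round minimizer drifts slowly, the single-query iterates track it and the dynamic regret stays bounded by the cumulative drift (the "path length"), which is the kind of guarantee the abstract refers to.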


