Dynamic Online Gradient Descent with Improved Query Complexity: A Theoretical Revisit

12/26/2018
by Yawei Zhao, et al.

We provide a new theoretical analysis framework for online gradient descent in the dynamic environment. Compared with previous work, the new framework recovers the state-of-the-art dynamic regret but does not require extra gradient queries at every iteration. Specifically, when the functions are α-strongly convex and β-smooth, previous work requires O(κ) gradient queries at every iteration, where κ = β/α is the condition number, to achieve the state-of-the-art dynamic regret. Our framework shows that the query complexity can be improved to O(1), independent of κ. The improvement is significant for ill-conditioned problems, whose objective functions typically have a large κ.
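As context for the query-counting claim (this is an illustration, not part of the original abstract): in dynamic online gradient descent, the learner plays an iterate x_t, suffers the loss f_t(x_t), and updates using one or more gradient queries; dynamic regret is typically measured against the per-round minimizers, ∑_t f_t(x_t) − ∑_t min_x f_t(x). The Python sketch below contrasts the two query budgets under these assumptions: queries_per_round=1 corresponds to the O(1) complexity claimed here, while queries_per_round ≈ κ mimics the multiple-query scheme attributed to prior work. The function names and the quadratic test losses are hypothetical, not the paper's construction.

    import numpy as np

    def dynamic_ogd(grad, x0, T, eta, queries_per_round=1):
        # Hypothetical sketch, not the paper's algorithm.
        # grad(t, x): gradient oracle for the round-t loss f_t at x.
        # queries_per_round sets the per-iteration query budget:
        # 1 matches the O(1) claim; ~kappa matches prior work's O(kappa).
        x = np.asarray(x0, dtype=float)
        played = []
        for t in range(T):
            played.append(x.copy())       # iterate played in round t
            for _ in range(queries_per_round):
                x = x - eta * grad(t, x)  # each call is one gradient query
        return played

    # Toy usage: strongly convex quadratics f_t(x) = 0.5 * ||x - c_t||^2
    # with a slowly moving minimizer c_t (illustrative only).
    if __name__ == "__main__":
        def grad(t, x):
            c_t = np.array([np.sin(0.01 * t), np.cos(0.01 * t)])
            return x - c_t
        iterates = dynamic_ogd(grad, x0=[0.0, 0.0], T=1000, eta=0.5)
        print(iterates[-1])

In this toy setting each inner step costs one gradient query, so the total query count over T rounds is T * queries_per_round, which is the quantity the abstract's O(1)-versus-O(κ) comparison concerns.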
