Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Streaming Data

Motivated by the continuous generation of high-frequency data streams, real-time learning is becoming increasingly important. Such streams must be processed sequentially, and their characteristics may change over time. In this streaming setting, we propose techniques for minimizing a convex objective through unbiased estimates of its gradients, commonly referred to as stochastic approximation problems. Our methods rely on stochastic approximation algorithms because of their computational advantage: they use only the previous iterate as a parameter estimate. Our approach includes iterate averaging, which guarantees optimal statistical efficiency under classical conditions. Our non-asymptotic analysis shows accelerated convergence when the learning rate is chosen according to the expected data streams. We show that the averaged estimate converges optimally and robustly for any data stream rate. In addition, noise reduction can be achieved by processing the data in a specific pattern, which is advantageous for large-scale machine learning. These theoretical results are illustrated for various data streams, demonstrating the effectiveness of the proposed algorithms.
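The streaming procedure described above can be sketched as stochastic gradient descent with Polyak-Ruppert iterate averaging, run over mini-batches of the stream with a learning rate tied to the number of samples processed. Below is a minimal illustrative sketch in NumPy; the function names (streaming_averaged_sgd, lsq_grad), the step-size schedule C / N^alpha, and the toy least-squares stream are assumptions made for illustration, not the paper's exact algorithm or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)


def streaming_averaged_sgd(grad, theta0, batches, C=1.0, alpha=0.5):
    """Streaming SGD with Polyak-Ruppert iterate averaging (illustrative sketch).

    grad(theta, batch) must return an unbiased estimate of the gradient of the
    convex objective, computed from one mini-batch (X, y) of the stream.
    The step size decays with the total number of samples processed so far;
    the schedule C / N**alpha is an assumed choice, not the paper's tuning.
    """
    theta = np.asarray(theta0, dtype=float).copy()
    theta_bar = theta.copy()          # running average of the iterates
    n_seen = 0                        # samples processed so far
    for t, batch in enumerate(batches, start=1):
        _, y = batch
        n_seen += len(y)
        gamma = C / n_seen ** alpha              # learning rate tied to the stream size
        theta -= gamma * grad(theta, batch)      # stochastic gradient step
        theta_bar += (theta - theta_bar) / t     # Polyak-Ruppert averaging
    return theta, theta_bar


def lsq_grad(theta, batch):
    """Unbiased least-squares gradient estimate from one mini-batch (X, y)."""
    X, y = batch
    return X.T @ (X @ theta - y) / len(y)


# Toy usage: streaming linear regression with mini-batches that grow over time.
d = 5
theta_star = rng.normal(size=d)

def make_batch(n):
    X = rng.normal(size=(n, d))
    return X, X @ theta_star + 0.1 * rng.normal(size=n)

stream = (make_batch(8 + t) for t in range(500))
_, theta_bar = streaming_averaged_sgd(lsq_grad, np.zeros(d), stream)
print(np.linalg.norm(theta_bar - theta_star))   # distance to the target; should be small
```

The averaged iterate theta_bar, rather than the last iterate, is what the abstract refers to as the average estimate whose convergence is robust to the rate at which the stream delivers data.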
