Dimensionality Reduction for Stationary Time Series via Stochastic Nonconvex Optimization

03/06/2018, by Minshuo Chen, et al.

Stochastic optimization arises naturally in machine learning. However, efficient algorithms with provable guarantees are still largely missing when the objective function is nonconvex and the data points are dependent. This paper studies this fundamental challenge through a streaming PCA problem for stationary time series data. Specifically, our goal is to estimate the principal component of time series data with respect to the covariance matrix of the stationary distribution. Computationally, we propose a variant of Oja's algorithm combined with downsampling to control the bias of the stochastic gradient caused by data dependency. Theoretically, we quantify the uncertainty of the proposed stochastic algorithm using diffusion approximations. This allows us to prove global convergence in terms of the continuous-time limiting solution trajectory, which further implies near-optimal sample complexity. Numerical experiments are provided to support our analysis.
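To make the computational idea concrete, here is a minimal sketch (not the paper's exact procedure) of an Oja-style update with downsampling: only every `gap`-th sample is used, so successive stochastic gradients from the dependent stream are nearly independent. The function names, the AR(1) test process, the step size, and the downsampling gap are all assumptions chosen for illustration:

```python
import numpy as np

def oja_downsampled(stream, d, eta=0.01, gap=10, seed=0):
    """Oja's algorithm with downsampling for streaming PCA.

    Uses only every `gap`-th sample from the stream to reduce the
    bias induced by temporal dependence in the data.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)          # random unit-norm initialization
    for t, x in enumerate(stream):
        if t % gap:                 # downsample: skip correlated samples
            continue
        w += eta * (x @ w) * x      # Oja's stochastic update
        w /= np.linalg.norm(w)      # project back to the unit sphere
    return w

def ar1_stream(n, d, rho=0.9, seed=1):
    """Stationary AR(1) time series in R^d: x_{t+1} = rho*x_t + noise.

    The noise covariance is anisotropic, so the stationary covariance
    has a well-separated top eigenvector (the first coordinate axis).
    """
    rng = np.random.default_rng(seed)
    scales = np.linspace(1.0, 0.1, d)   # decaying noise scales per coordinate
    x = np.zeros(d)
    for _ in range(n):
        x = rho * x + scales * rng.standard_normal(d)
        yield x
```

Because the stationary covariance of this AR(1) process is diagonal with decreasing entries, the estimated direction should align with the first coordinate axis; running `oja_downsampled(ar1_stream(200_000, 5), 5)` returns a unit vector whose first coordinate dominates.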


Related research

- 06/04/2018: Towards Understanding Acceleration Tradeoff between Momentum and Asynchrony in Nonconvex Stochastic Optimization. "Asynchronous momentum stochastic gradient descent algorithms (Async-MSGD..."
- 01/16/2023: Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization. "We consider the optimization problem of the form min_x ∈ ℝ^d f(x) ≜ 𝔼_ξ[F..."
- 06/12/2021: Distributionally Robust Optimization with Markovian Data. "We study a stochastic program where the probability distribution of the ..."
- 08/11/2018: A Consistent Method for Learning OOMs from Asymptotically Stationary Time Series Data Containing Missing Values. "In the traditional framework of spectral learning of stochastic time ser..."
- 02/14/2018: Toward Deeper Understanding of Nonconvex Stochastic Optimization with Momentum using Diffusion Approximations. "Momentum Stochastic Gradient Descent (MSGD) algorithm has been widely ap..."
- 05/03/2023: Streaming PCA for Markovian Data. "Since its inception in Erikki Oja's seminal paper in 1982, Oja's algorit..."
- 06/09/2020: Sparse Dynamic Distribution Decomposition: Efficient Integration of Trajectory and Snapshot Time Series Data. "Dynamic Distribution Decomposition (DDD) was introduced in Taylor-King e..."
