Machine Learning of Time Series Using Time-delay Embedding and Precision Annealing

02/12/2019
by Alexander J. A. Ty, et al.

Tasking machine learning with predicting segments of a time series requires estimating the parameters of an ML model from input/output pairs drawn from that time series. Using the equivalence between statistical data assimilation and supervised machine learning, we revisit this task. The training method for the machine uses a precision annealing approach to identify the global minimum of the action (-log[P]). In this way we are able to identify the number of training pairs required to produce good generalizations (predictions) for the time series. We proceed from a scalar time series s(t_n), with t_n = t_0 + n Δt, and use methods of nonlinear time series analysis to construct a time-delay embedding space of dimension D_E > 1 in which the time series has no false neighbors, unlike the observed scalar series s(t_n). In that D_E-dimensional space we explore the use of feed-forward multi-layer perceptrons as network models that map D_E-dimensional inputs to D_E-dimensional outputs.
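As a rough illustration of the setup described in the abstract, the sketch below builds D_E-dimensional time-delay vectors from a scalar series and forms the input/output pairs a feed-forward network would be trained on. The embedding dimension D_E, the delay tau, the synthetic sine data, and the final MLPRegressor fit are all illustrative assumptions; the paper selects the embedding with nonlinear time series methods (e.g. false-nearest-neighbor tests) and trains with precision annealing rather than a standard off-the-shelf optimizer.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor


def delay_embed(s, d_e, tau):
    """Build time-delay vectors y(n) = [s(n), s(n+tau), ..., s(n+(d_e-1)*tau)]
    from a scalar time series s; returns an array of shape (N, d_e)."""
    n = len(s) - (d_e - 1) * tau
    return np.column_stack([s[k * tau: k * tau + n] for k in range(d_e)])


# Stand-in scalar series (a noisy sine), in place of real observed data s(t_n).
t = np.arange(2000) * 0.05
s = np.sin(t) + 0.01 * np.random.randn(len(t))

# Illustrative choices; the paper determines D_E and tau from the data
# so that the embedded series has no false neighbors.
D_E, tau = 3, 4
Y = delay_embed(s, D_E, tau)

# Input/output pairs: map the embedding vector at time n to the one at n+1,
# i.e. a D_E-dimensional input producing a D_E-dimensional output.
X_in, X_out = Y[:-1], Y[1:]

# Plain feed-forward MLP fit as a stand-in for the paper's
# precision-annealing training of the network parameters.
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
mlp.fit(X_in, X_out)
print("one-step prediction of the last embedding vector:", mlp.predict(Y[-1:]))
```

Iterating the trained map on its own output in the embedding space would then generate the predicted segments of the time series referred to above.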
