Machine Learning of Time Series Using Time-Delay Embedding and Precision Annealing

Neural Comput. 2019 Oct;31(10):2004-2024. doi: 10.1162/neco_a_01224. Epub 2019 Aug 8.

Abstract

Tasking machine learning to predict segments of a time series requires estimating the parameters of an ML model with input/output pairs from the time series. We borrow two techniques used in statistical data assimilation to accomplish this task: time-delay embedding to prepare our input data and precision annealing as a training method. The precision annealing approach identifies the global minimum of the action (-log[P]). In this way, we are able to identify the number of training pairs required to produce good generalizations (predictions) for the time series. We proceed from a scalar time series s(t_n); t_n = t_0 + nΔt and, using methods of nonlinear time series analysis, show how to produce a D_E > 1 dimensional time-delay embedding space in which the time series has no false neighbors, unlike the observed scalar time series s(t_n). In that D_E-dimensional space, we explore the use of feedforward multilayer perceptrons as network models operating on D_E-dimensional inputs and producing D_E-dimensional outputs.
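To illustrate the time-delay embedding step described above, the following is a minimal sketch of building D_E-dimensional delay vectors and input/output training pairs from a scalar series. The function name delay_embed, the delay tau, and the toy two-frequency signal are illustrative assumptions, not the authors' code; in practice D_E and tau would be selected with nonlinear time series tools such as false-nearest-neighbor and average mutual information analysis.

```python
import numpy as np

def delay_embed(s, dim, tau):
    """Map a scalar series s(t_n) to dim-dimensional delay vectors.

    Row j of the result is [s[j], s[j + tau], ..., s[j + (dim-1)*tau]],
    where dim plays the role of D_E and tau is the delay in samples.
    (Hypothetical helper for illustration only.)
    """
    s = np.asarray(s)
    n_vectors = len(s) - (dim - 1) * tau
    return np.column_stack(
        [s[i * tau : i * tau + n_vectors] for i in range(dim)]
    )

# Toy scalar time series standing in for the observed s(t_n).
s = np.sin(0.05 * np.arange(2000)) + 0.5 * np.sin(0.013 * np.arange(2000))

# D_E-dimensional embedding; dim and tau here are placeholder choices.
X = delay_embed(s, dim=4, tau=10)

# Training pairs: the network maps the delay vector at t_n
# to the delay vector one step later, both D_E-dimensional.
inputs, outputs = X[:-1], X[1:]
```

A feedforward multilayer perceptron trained on such (inputs, outputs) pairs would then realize the D_E-dimensional-to-D_E-dimensional map the abstract refers to.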

Publication types

  • Research Support, Non-U.S. Gov't