PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting

09/20/2022
by Hao Xue, et al.

This paper studies the time series forecasting problem from a new perspective. In existing state-of-the-art (SOTA) time series representation learning methods, the forecasting models take a sequence of numerical values as input and yield numerical values as output. These models are largely based on the Transformer architecture, modified with multiple encoding mechanisms to incorporate the context and semantics around the historical data. In this paper, we approach time series representation learning through the paradigm of prompt-based natural language modeling. Inspired by the success of pre-trained language foundation models, we ask whether these models can also be adapted to time series forecasting. Thus, we propose a new forecasting paradigm: prompt-based time series forecasting (PromptCast). In this novel task, the numerical input and output are transformed into prompts, and the forecasting task is framed in a sentence-to-sentence manner, which makes it possible to directly apply language models for forecasting. To support and facilitate research on this task, we also present a large-scale dataset (PISA) that covers three real-world forecasting scenarios. We evaluate several SOTA numerical forecasting methods and language generation models such as BART. The benchmark results under single- and multi-step forecasting settings demonstrate that prompt-based time series forecasting with language generation models is a promising research direction. In addition, compared to conventional numerical forecasting, PromptCast shows much better generalization ability under the zero-shot setting. We believe that the proposed PromptCast task and our PISA dataset can provide novel insights and lead to new research directions in time series representation learning and forecasting.
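The sentence-to-sentence framing is easy to picture with a small sketch. The following is a minimal, hypothetical Python illustration of the idea: a numerical history is verbalized into an input prompt, the ground-truth target is verbalized into an output sentence, and the generated sentence is parsed back into a number. The template wording, the `unit` argument, and the helper names here are illustrative assumptions, not the exact prompts used in the PISA dataset.

```python
import re

def series_to_prompt(values, unit="visitors"):
    """Verbalize a numerical history into a natural-language input prompt."""
    history = ", ".join(str(v) for v in values)
    return (f"From day 1 to day {len(values)}, the number of {unit} was "
            f"{history}. What will the number of {unit} be on day "
            f"{len(values) + 1}?")

def target_to_prompt(value, day, unit="visitors"):
    """Verbalize the ground-truth target as the output sentence."""
    return f"The number of {unit} on day {day} will be {value}."

def prompt_to_value(sentence):
    """Recover the numerical forecast from a generated sentence."""
    match = re.search(r"will be (-?\d+(?:\.\d+)?)", sentence)
    return float(match.group(1)) if match else None

# Example: a 7-step history, single-step forecasting.
history = [120, 135, 128, 140, 150, 149, 160]
src = series_to_prompt(history)       # model input (sentence)
tgt = target_to_prompt(165, day=8)    # training target (sentence)
print(src)
print(tgt)
print(prompt_to_value("The number of visitors on day 8 will be 165."))  # 165.0
```

With data in this form, any sequence-to-sequence language model (e.g., BART) can be fine-tuned on (input sentence, output sentence) pairs, and forecasting reduces to conditional text generation followed by number extraction.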
