Tsformer: Time series Transformer for tourism demand forecasting

07/22/2021
by Siyuan Yi, et al.

AI-based methods have been widely applied to tourism demand forecasting. However, current AI-based methods struggle to model long-term dependencies, and most of them lack interpretability. The Transformer, originally developed for machine translation, shows a remarkable ability to capture long-term dependencies. Building on the Transformer, we propose a time series Transformer (Tsformer) with an encoder-decoder architecture for tourism demand forecasting. The proposed Tsformer encodes long-term dependencies with the encoder, captures short-term dependencies with the decoder, and simplifies attention interactions while highlighting dominant attention through a series of attention masking mechanisms. These improvements make the multi-head attention mechanism process the input sequence according to its temporal order, contributing to better interpretability. Moreover, the context-processing ability of the encoder-decoder architecture allows the calendar of the days to be forecast to be used as an additional input, enhancing forecasting performance. Experiments conducted on the Jiuzhaigou valley and Siguniang mountain tourism demand datasets against nine baseline methods indicate that the proposed Tsformer outperforms all baselines in both short-term and long-term tourism demand forecasting tasks. Moreover, ablation studies demonstrate that adopting the calendar of the days to be forecast contributes to the forecasting performance of the proposed Tsformer. For better interpretability, we visualize the attention weight matrices; the visualization indicates that, in short-term forecasting, the Tsformer concentrates on seasonal features and on days close to the days being forecast.
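The masking idea described above — restricting attention so each position only attends to positions allowed by the time order — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the authors' implementation: it shows a single-head scaled dot-product attention with a causal (lower-triangular) mask of the kind a decoder uses, so no day attends to future days.

```python
import numpy as np

def masked_attention(q, k, v, mask=None):
    """Scaled dot-product attention.

    mask is a boolean (T_q, T_k) array; False entries are blocked
    by setting their scores to -inf before the softmax.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)              # (T_q, T_k) similarities
    if mask is not None:
        scores = np.where(mask, scores, -np.inf)  # block disallowed positions
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Causal mask: step t may attend only to steps <= t, i.e. attention
# follows the time relationship of the sequence.
T, d = 4, 8
causal = np.tril(np.ones((T, T), dtype=bool))

rng = np.random.default_rng(0)
x = rng.normal(size=(T, d))                       # toy daily features
out, w = masked_attention(x, x, x, mask=causal)

# The upper triangle of the weight matrix is zero: no attention to the future.
assert np.allclose(np.triu(w, k=1), 0.0)
```

Inspecting `w` row by row is exactly the kind of attention-weight visualization the abstract refers to: each row shows which earlier days a given day draws on.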


