A Transformer-based Framework For Multi-variate Time Series: A Remaining Useful Life Prediction Use Case

08/19/2023
by   Oluwaseyi Ogunfowora, et al.

In recent times, Large Language Models (LLMs) have captured a global spotlight and revolutionized the field of Natural Language Processing. One of the factors attributed to the effectiveness of LLMs is the model architecture used for training: the transformer. Transformer models excel at capturing contextual features in sequential data. Since time series data are also sequential, transformer models can be leveraged for more effective time series prediction. The field of prognostics is vital to system health management and proper maintenance planning. A reliable estimate of the remaining useful life (RUL) of machines holds the potential for substantial cost savings: it helps avoid abrupt machine failures, maximizes equipment usage, and can serve as a decision support system (DSS). This work proposes an encoder-transformer architecture-based framework for multivariate time series prediction for a prognostics use case. We validated the effectiveness of the proposed framework on all four sets of the C-MAPSS benchmark dataset for the remaining useful life prediction task. To effectively transfer the knowledge and application of transformers from the natural language domain to time series, three model-specific experiments were conducted. In addition, to make the model aware of the initial stages of the machine's life and its degradation path, a novel expanding window method is proposed for the first time in this work; compared with the sliding window method, it led to a large improvement in the performance of the encoder-transformer model. Finally, the performance of the proposed encoder-transformer model was evaluated on the test dataset and compared with the results of 13 other state-of-the-art (SOTA) models in the literature; it outperformed them all, with an average performance increase of 137.65% across all datasets.
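The contrast between the two windowing strategies mentioned above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the function names and the toy signal are assumptions. A sliding window extracts fixed-length subsequences that move along the trajectory, while an expanding window always starts at the beginning of the machine's life and grows one step at a time, so every sample retains the full degradation history up to that point.

```python
import numpy as np

def sliding_windows(series, window):
    """Fixed-length windows that slide one time step at a time."""
    return [series[i:i + window] for i in range(len(series) - window + 1)]

def expanding_windows(series, min_len=1):
    """Windows anchored at the start of the run-to-failure trajectory,
    growing one step at a time so each sample contains the machine's
    entire history from the beginning of its life."""
    return [series[:i] for i in range(min_len, len(series) + 1)]

# Toy univariate degradation signal of 6 cycles.
signal = np.arange(6)
print(len(sliding_windows(signal, 3)))   # 4 windows, each of length 3
print(len(expanding_windows(signal)))    # 6 windows, of lengths 1 through 6
```

Note that expanding windows produce variable-length sequences, which suits a transformer encoder (attention handles arbitrary sequence lengths), whereas fixed-size sliding windows discard the early-life context once the window has moved past it.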

Related research

- W-Transformers: A Wavelet-based Transformer Framework for Univariate Time Series Forecasting (09/08/2022)
  Deep learning utilizing transformers has recently achieved a lot of succ...

- Soft Sensing Transformer: Hundreds of Sensors are Worth a Single Word (11/10/2021)
  With the rapid development of AI technology in recent years, there have ...

- Remaining Useful Life Estimation Under Uncertainty with Causal GraphNets (11/23/2020)
  In this work, a novel approach for the construction and training of time...

- An empirical evaluation of attention-based multi-head models for improved turbofan engine remaining useful life prediction (09/04/2021)
  A single unit (head) is the conventional input feature extractor in deep...

- Paying Attention to Astronomical Transients: Photometric Classification with the Time-Series Transformer (05/13/2021)
  Future surveys such as the Legacy Survey of Space and Time (LSST) of the...

- A Neural Network-Evolutionary Computational Framework for Remaining Useful Life Estimation of Mechanical Systems (05/15/2019)
  This paper presents a framework for estimating the remaining useful life...
