Examining the Effect of Pre-training on Time Series Classification

09/11/2023
by Jiashu Pu, et al.

Although the pre-training followed by fine-tuning paradigm is used extensively in many fields, the impact of pre-training on the fine-tuning process remains contested, and existing experimental findings, based mostly on text and image data, have not reached consensus. To examine the unsupervised pre-training followed by fine-tuning paradigm more deeply, we extend previous research to a new modality: time series. We conduct a thorough study of 150 classification datasets drawn from the Univariate Time Series (UTS) and Multivariate Time Series (MTS) benchmarks. Our analysis yields several key conclusions. (i) Pre-training improves optimization only for models that fit the data poorly; it does not help models that already fit the data well. (ii) Pre-training does not act as a regularizer when sufficient training time is given. (iii) Pre-training speeds up convergence only if the model has sufficient capacity to fit the data. (iv) Adding more pre-training data does not improve generalization, but it strengthens the advantages pre-training already confers at the original data volume, such as faster convergence. (v) Both the pre-training task and the model structure determine the effectiveness of the paradigm on a given dataset, but the model structure plays the more significant role.
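To make the paradigm under study concrete, below is a minimal sketch of unsupervised pre-training followed by supervised fine-tuning on time series. It assumes a masked-reconstruction pretext task and a small 1D-CNN encoder; the names (`TSEncoder`, `pretrain`, `finetune`) and all hyperparameters are illustrative and not taken from the paper.

```python
# Sketch only: masked-reconstruction pre-training, then classification
# fine-tuning. Architecture and pretext task are assumptions, not the
# authors' exact setup.
import torch
import torch.nn as nn

class TSEncoder(nn.Module):
    """1D-CNN encoder mapping (batch, channels, length) -> (batch, dim)."""
    def __init__(self, in_channels: int, dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, dim, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(dim, dim, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average pooling over time
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def pretrain(encoder, series, epochs=10, mask_ratio=0.3, dim=64):
    """Unsupervised stage: reconstruct randomly masked time steps."""
    in_ch, length = series.shape[1], series.shape[2]
    decoder = nn.Linear(dim, in_ch * length)  # dim must match the encoder
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    for _ in range(epochs):
        # 1 = keep the value, 0 = mask it out
        mask = (torch.rand_like(series) > mask_ratio).float()
        recon = decoder(encoder(series * mask)).view_as(series)
        # MSE computed only on the masked positions
        loss = (((recon - series) * (1 - mask)) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()

def finetune(encoder, series, labels, num_classes, epochs=10, dim=64):
    """Supervised stage: encoder + linear head, cross-entropy loss."""
    head = nn.Linear(dim, num_classes)
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
    for _ in range(epochs):
        loss = nn.functional.cross_entropy(head(encoder(series)), labels)
        opt.zero_grad(); loss.backward(); opt.step()
    return head

if __name__ == "__main__":
    x = torch.randn(32, 3, 128)        # 32 multivariate series, 3 channels, length 128
    y = torch.randint(0, 5, (32,))     # 5 classes
    enc = TSEncoder(in_channels=3)
    pretrain(enc, x)                   # unsupervised pre-training
    finetune(enc, x, y, num_classes=5) # supervised fine-tuning
```

Under a setup like this, the questions the paper asks amount to comparing a fine-tuned run initialized from `pretrain` against one trained from random initialization, at matched training budgets and across encoders of varying capacity.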

Related research:

- The Role of Pre-training Data in Transfer Learning (02/27/2023): The transfer learning paradigm of model pre-training and subsequent fine...
- Text-to-Text Pre-Training for Data-to-Text Tasks (05/21/2020): We study the pre-train + fine-tune strategy for data-to-text tasks. Fine...
- SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling (02/02/2023): Time series analysis is widely used in extensive areas. Recently, to red...
- A Comprehensive Evaluation of Multi-task Learning and Multi-task Pre-training on EHR Time-series Data (07/20/2020): Multi-task learning (MTL) is a machine learning technique aiming to impr...
- Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency (06/17/2022): Pre-training on time series poses a unique challenge due to the potentia...
- Discovering the Effectiveness of Pre-Training in a Large-scale Car-sharing Platform (05/02/2023): Recent progress of deep learning has empowered various intelligent trans...
- Revisiting the Critical Factors of Augmentation-Invariant Representation Learning (07/30/2022): We focus on better understanding the critical factors of augmentation-in...
