PAITS: Pretraining and Augmentation for Irregularly-Sampled Time Series

08/25/2023
by Nicasia Beebe-Wang, et al.

Real-world time series data, such as those reflecting sequential human behavior, are often irregularly sampled and sparse, with highly nonuniform sampling across both time and entities. Yet commonly used pretraining and augmentation methods for time series are not designed for such scenarios. In this paper, we present PAITS (Pretraining and Augmentation for Irregularly-sampled Time Series), a framework for identifying suitable pretraining strategies for sparse and irregularly sampled time series datasets. PAITS combines NLP-inspired pretraining tasks and augmentations with a random search to identify an effective strategy for a given dataset. We demonstrate that different datasets benefit from different pretraining choices, and that compared with prior methods, our approach more consistently improves pretraining across multiple datasets and domains. Our code is available at <https://github.com/google-research/google-research/tree/master/irregular_timeseries_pretraining>.
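The random-search component described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the task names, augmentation names, and hyperparameter choices below are hypothetical placeholders for whatever search space a given dataset calls for, and `evaluate` stands in for a full pretrain-then-finetune validation run.

```python
import random

# Hypothetical search space (illustrative; not PAITS's exact choices).
PRETRAIN_TASKS = ["masked_reconstruction", "forecasting"]
AUGMENTATIONS = ["random_masking", "gaussian_noise", "none"]
LEARNING_RATES = [1e-3, 1e-4]

def sample_strategy(rng):
    """Sample one pretraining strategy (task + augmentation + lr)."""
    return {
        "task": rng.choice(PRETRAIN_TASKS),
        "augmentation": rng.choice(AUGMENTATIONS),
        "lr": rng.choice(LEARNING_RATES),
    }

def random_search(evaluate, n_trials=10, seed=0):
    """Try n_trials randomly sampled strategies; return the best.

    `evaluate` maps a strategy dict to a validation score
    (higher is better), e.g. downstream accuracy after
    pretraining with that strategy and fine-tuning.
    """
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_trials):
        strategy = sample_strategy(rng)
        score = evaluate(strategy)
        if score > best_score:
            best, best_score = strategy, score
    return best, best_score
```

In practice each `evaluate` call is expensive (it involves pretraining and fine-tuning a model), so the number of trials trades off search cost against the chance of finding the strategy best suited to the dataset.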

