Towards a universal neural network encoder for time series

05/10/2018

by Joan Serrà, et al.

We study the use of a time series encoder to learn representations that are useful on data set types on which it has not been trained. The encoder is a convolutional neural network whose temporal output is summarized by a convolutional attention mechanism. This way, we obtain a compact, fixed-length representation from longer, variable-length time series. We evaluate the performance of the proposed approach on a well-known time series classification benchmark, considering full adaptation, partial adaptation, and no adaptation of the encoder to the new data type. Results show that such strategies are competitive with the state of the art, often outperforming conceptually matching approaches. Besides accuracy scores, the ease of adaptation and the efficiency of pre-trained encoders make them an appealing option for processing scarcely labeled or unlabeled time series.
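The core idea of the abstract — a convolutional feature extractor followed by an attention mechanism that collapses a variable-length temporal axis into one fixed-length vector — can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's actual architecture; the layer sizes, the single conv layer, and the scoring vector `v` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w):
    """Valid 1-D convolution with ReLU: x is (T, C_in), w is (K, C_in, C_out).
    Stands in for the paper's deeper convolutional encoder."""
    K, C_in, C_out = w.shape
    T_out = x.shape[0] - K + 1
    out = np.empty((T_out, C_out))
    for t in range(T_out):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0)

def attention_pool(h, v):
    """Summarize (T, C) temporal features into a fixed-length (C,) vector.
    v is a learned (C,) scoring vector (an assumption for this sketch);
    a softmax over time yields the attention weights."""
    scores = h @ v
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ h  # weighted average over time

# Hypothetical sizes, not taken from the paper.
C_in, C_out, K = 1, 8, 5
w = rng.normal(size=(K, C_in, C_out))
v = rng.normal(size=C_out)

for T in (50, 120):  # two inputs of different lengths
    x = rng.normal(size=(T, C_in))
    z = attention_pool(conv1d_relu(x, w), v)
    print(T, z.shape)  # both lengths map to the same 8-dim embedding
```

The key property shown is that the embedding dimensionality depends only on the encoder's channel width, not on the input length, which is what makes the representation reusable across data set types.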


