TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series

08/16/2023
by Chenxi Sun, et al.

This work summarizes two strategies for completing time-series (TS) tasks with today's large language models (LLMs): LLM-for-TS, which designs and trains a fundamental large model for TS data from scratch, and TS-for-LLM, which enables a pre-trained LLM to handle TS data. Considering insufficient data accumulation, limited resources, and the need for semantic context, this work focuses on TS-for-LLM, aiming to activate the LLM's ability for TS data by designing a TS embedding method suitable for the LLM. The proposed method is named TEST. It first tokenizes TS, builds an encoder to embed the tokens by instance-wise, feature-wise, and text-prototype-aligned contrast, then creates prompts to make the LLM more receptive to these embeddings, and finally implements TS tasks. Experiments are carried out on TS classification and forecasting tasks using eight LLMs with different structures and sizes. Although the results do not significantly outperform current SOTA models customized for TS tasks, treating the LLM as a pattern machine in this way endows it with the ability to process TS data without compromising its language ability. This paper is intended as a foundational work that will inspire further research.
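
To make the pipeline above concrete, the sketch below illustrates only the text-prototype-aligned contrast step. It is not the authors' code: TSEncoder, prototype_alignment_loss, and the choice of a frozen LLM's input token embeddings as text prototypes are illustrative assumptions, and the paper combines this term with instance-wise and feature-wise contrast.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TSEncoder(nn.Module):
        # Toy encoder: maps a tokenized TS segment to the LLM's embedding dimension.
        def __init__(self, seg_len, d_model):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(seg_len, d_model),
                nn.GELU(),
                nn.Linear(d_model, d_model),
            )

        def forward(self, segments):            # segments: (batch, seg_len)
            return F.normalize(self.net(segments), dim=-1)

    def prototype_alignment_loss(ts_emb, text_prototypes, temperature=0.1):
        # InfoNCE-style term: pull each TS embedding toward its most similar
        # text prototype and push it away from the remaining prototypes.
        protos = F.normalize(text_prototypes, dim=-1)   # (K, d_model)
        logits = ts_emb @ protos.T / temperature        # (batch, K)
        targets = logits.argmax(dim=-1)                 # nearest prototype as pseudo-label
        return F.cross_entropy(logits, targets)

    # Usage sketch: prototypes stand in for a subset of the frozen LLM's
    # input token embeddings (random here, for illustration only).
    encoder = TSEncoder(seg_len=16, d_model=768)
    segments = torch.randn(32, 16)                      # 32 tokenized TS segments
    prototypes = torch.randn(100, 768)
    loss = prototype_alignment_loss(encoder(segments), prototypes)
    loss.backward()

In TEST, the embeddings produced this way are then fed to the pre-trained LLM together with the created prompts, which is how the LLM can process TS data while its language ability is left untouched.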


research
03/07/2019

GRATIS: GeneRAting TIme Series with diverse and controllable characteristics

The explosion of time series data in recent years has brought a flourish...
research
08/16/2023

LLM4TS: Two-Stage Fine-Tuning for Time-Series Forecasting with Pre-Trained LLMs

In this work, we leverage pre-trained Large Language Models (LLMs) to en...
research
04/07/2022

Few-Shot Forecasting of Time-Series with Heterogeneous Channels

Learning complex time series forecasting models usually requires a large...
research
08/17/2022

Class-Aware Visual Prompt Tuning for Vision-Language Pre-Trained Model

With the emergence of large pre-trained vision-language models like CLIP, ...
research
05/10/2018

Towards a universal neural network encoder for time series

We study the use of a time series encoder to learn representations that ...
research
07/07/2016

Predicting and Understanding Law-Making with Word Vectors and an Ensemble Model

Out of nearly 70,000 bills introduced in the U.S. Congress from 2001 to ...
research
09/04/2023

Recognition of Heat-Induced Food State Changes by Time-Series Use of Vision-Language Model for Cooking Robot

Cooking tasks are characterized by large changes in the state of the foo...
