The Tiny Time-series Transformer: Low-latency High-throughput Classification of Astronomical Transients using Deep Model Compression

03/15/2023
by   Tarek Allam Jr., et al.

A new golden age in astronomy is upon us, dominated by data. Large astronomical surveys are broadcasting unprecedented rates of information, making machine learning a critical component of modern scientific pipelines for handling the deluge of data. The upcoming Legacy Survey of Space and Time (LSST) of the Vera C. Rubin Observatory will raise the big-data bar for time-domain astronomy, with an expected 10 million alerts per night and many petabytes of data generated over the lifetime of the survey. Fast and efficient classification algorithms that can operate in real time, yet robustly and accurately, are needed for time-critical events where additional resources can be sought for follow-up analyses. Handling such data requires state-of-the-art deep learning architectures coupled with tools that leverage modern hardware accelerators. We showcase how modern deep compression methods can achieve an 18× reduction in model size whilst preserving classification performance. We also show that, in addition to the deep compression techniques, careful choice of file formats can improve inference latency, and thereby alert throughput, by roughly 8× for local processing and 5× in a live production setting. To test this in a live setting, we deploy the optimised version of the original time-series transformer, t2, into the FINK community alert broker on real Zwicky Transient Facility (ZTF) alert data, and compare its throughput with that of the other science modules in FINK. These results emphasise the time-series transformer's suitability for real-time classification at LSST scale and beyond, and establish deep model compression as a fundamental tool for improving the deployability and scalable inference of deep learning models for transient classification.
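The abstract does not spell out which compression techniques contribute to the 18× reduction, but post-training weight quantization is a standard ingredient of deep model compression: storing float32 weights as int8 shrinks the weight payload by roughly 4× while keeping values close to the originals. The sketch below is an illustrative, stdlib-only toy (the function names and the symmetric int8 scheme are assumptions for illustration, not the paper's actual pipeline):

```python
def quantize_int8(weights):
    """Symmetric post-training quantization (illustrative toy, not the
    paper's method): map floats in [-max|w|, max|w|] onto [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return [v * scale for v in q]

weights = [0.02, -1.5, 0.75, 3.0, -0.33]   # stand-in for a weight tensor
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# float32 stores 4 bytes per weight, int8 stores 1 -> ~4x smaller payload
orig_bytes = 4 * len(weights)
quant_bytes = 1 * len(weights)
ratio = orig_bytes // quant_bytes            # 4

# quantization error is bounded by the step size
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

In practice such quantization is combined with techniques like weight pruning and efficient serialization formats, which is consistent with the abstract's point that file-format choices, on top of compression itself, drive the reported latency and throughput gains.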


research
05/13/2021

Paying Attention to Astronomical Transients: Photometric Classification with the Time-Series Transformer

Future surveys such as the Legacy Survey of Space and Time (LSST) of the...
research
06/30/2022

DeepSpeed Inference: Enabling Efficient Inference of Transformer Models at Unprecedented Scale

The past several years have witnessed the success of transformer-based m...
research
03/26/2021

Gated Transformer Networks for Multivariate Time Series Classification

Deep learning model (primarily convolutional networks and LSTM) for time...
research
03/29/2019

RAPID: Early Classification of Explosive Transients using Deep Learning

We present RAPID (Real-time Automated Photometric IDentification), a nov...
research
02/27/2018

A High GOPs/Slice Time Series Classifier for Portable and Embedded Biomedical Applications

Modern wearable rehabilitation devices and health support systems operat...
research
02/26/2021

Tails: Chasing Comets with the Zwicky Transient Facility and Deep Learning

We present Tails, an open-source deep-learning framework for the identif...
research
02/10/2020

A Computational Approach to Packet Classification

Multi-field packet classification is a crucial component in modern softw...
