Paying Attention to Astronomical Transients: Photometric Classification with the Time-Series Transformer

05/13/2021
by   Tarek Allam Jr., et al.

Future surveys such as the Legacy Survey of Space and Time (LSST) of the Vera C. Rubin Observatory will observe an order of magnitude more astrophysical transient events than any previous survey. With this deluge of photometric data, it will be impossible for all such events to be classified by humans alone. Recent efforts have sought to leverage machine learning methods to tackle the challenge of astronomical transient classification, with ever-improving success. Transformers are a recently developed deep learning architecture, first proposed for natural language processing, that have since shown a great deal of success. In this work we develop a new transformer architecture, which uses multi-head self-attention at its core, for general multivariate time-series data. Furthermore, the proposed time-series transformer architecture supports the inclusion of an arbitrary number of additional features, while also offering interpretability. We apply the time-series transformer to the task of photometric classification, minimising the reliance on expert domain knowledge for feature selection, while achieving results comparable to state-of-the-art photometric classification methods. We achieve a weighted logarithmic loss of 0.507 on imbalanced data in a representative setting using data from the Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC). Moreover, we achieve a micro-averaged receiver operating characteristic area under the curve of 0.98 and a micro-averaged precision-recall area under the curve of 0.87.
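
To make the general idea concrete, the sketch below shows how multi-head self-attention can be applied to a multivariate time series (e.g. a multi-band light curve), with additional static features appended before classification. This is a minimal illustration, not the authors' implementation: the class name, hyperparameters, mean-pooling, and feature-fusion choices are assumptions, and positional/time encoding is omitted for brevity.

```python
# Minimal sketch (not the paper's code): a transformer encoder that
# classifies multivariate time series such as multi-band light curves
# using multi-head self-attention. All hyperparameters are illustrative,
# and positional/time encoding is omitted for brevity.
import torch
import torch.nn as nn


class TimeSeriesTransformerSketch(nn.Module):
    def __init__(self, n_channels=6, d_model=32, n_heads=4,
                 n_layers=2, n_extra_features=2, n_classes=14):
        super().__init__()
        # Project each time step's flux measurements (one per passband)
        # into the model dimension.
        self.input_proj = nn.Linear(n_channels, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Additional non-time-series features (e.g. host-galaxy redshift)
        # are concatenated to the pooled sequence representation.
        self.classifier = nn.Linear(d_model + n_extra_features, n_classes)

    def forward(self, x, extra, padding_mask=None):
        # x: (batch, time, n_channels); extra: (batch, n_extra_features)
        h = self.input_proj(x)
        h = self.encoder(h, src_key_padding_mask=padding_mask)
        pooled = h.mean(dim=1)  # average over the time dimension
        return self.classifier(torch.cat([pooled, extra], dim=-1))


# Usage: a batch of 8 light curves, 100 observations each, in 6 passbands.
model = TimeSeriesTransformerSketch()
x = torch.randn(8, 100, 6)
extra = torch.randn(8, 2)
print(model(x, extra).shape)  # torch.Size([8, 14])
```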

