A Transformer Framework for Data Fusion and Multi-Task Learning in Smart Cities

11/18/2022
by   Alexander C. DeRieux, et al.

Rapid global urbanization is a double-edged sword, heralding promises of economic prosperity and public health while also posing unique environmental and humanitarian challenges. Smart and connected communities (SCCs) apply data-centric solutions to these problems by integrating artificial intelligence (AI) and the Internet of Things (IoT). This coupling of intelligent technologies also poses interesting system design challenges regarding heterogeneous data fusion and task diversity. Transformers are of particular interest for addressing these problems, given their success across diverse fields including natural language processing (NLP), computer vision, time-series regression, and multi-modal data fusion. This raises the question of whether Transformers can be further diversified to leverage fusions of IoT data sources for heterogeneous multi-task learning in SCC trade spaces. In this paper, a Transformer-based AI system for emerging smart cities is proposed. Designed with a pure encoder backbone and further customized through interchangeable input-embedding and output task heads, the system supports virtually any input data and output task types present in SCCs. This generalizability is demonstrated by learning diverse task sets representative of SCC environments, including multivariate time-series regression, visual plant disease classification, and image-time-series fusion tasks, using a combination of the Beijing PM2.5 and Plant Village datasets. Simulation results show that the proposed Transformer-based system can handle various input data types via custom sequence-embedding techniques and is naturally suited to learning a diverse set of tasks. The results also show that multi-task learners improve both memory and computational efficiency while maintaining performance comparable to both single-task variants and non-Transformer baselines.
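To make the described architecture more concrete, the following is a minimal PyTorch sketch of a shared-encoder, multi-task design of the kind the abstract outlines: per-modality embedding modules map multivariate time-series windows and image patches into a common token space, a single Transformer encoder fuses the tokens, and interchangeable task heads produce regression or classification outputs. All class names, dimensions, and the fusion and pooling choices below are illustrative assumptions rather than the authors' exact implementation, and positional encodings are omitted for brevity.

```python
# Hypothetical sketch of a shared-encoder multi-task Transformer:
# per-modality embeddings produce token sequences, one encoder is shared
# across tasks, and interchangeable heads map pooled features to outputs.
import torch
import torch.nn as nn


class TimeSeriesEmbedding(nn.Module):
    """Projects each multivariate time step to the model dimension."""
    def __init__(self, n_features: int, d_model: int):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)

    def forward(self, x):              # x: (batch, seq_len, n_features)
        return self.proj(x)            # -> (batch, seq_len, d_model)


class PatchEmbedding(nn.Module):
    """Splits an image into non-overlapping patches and embeds each patch."""
    def __init__(self, d_model: int, patch: int = 16, channels: int = 3):
        super().__init__()
        self.proj = nn.Conv2d(channels, d_model, kernel_size=patch, stride=patch)

    def forward(self, x):              # x: (batch, channels, H, W)
        return self.proj(x).flatten(2).transpose(1, 2)   # -> (batch, n_patches, d_model)


class MultiTaskTransformer(nn.Module):
    """Shared encoder backbone with interchangeable embeddings and task heads."""
    def __init__(self, d_model=128, n_heads=4, n_layers=4,
                 n_features=8, n_classes=38):
        super().__init__()
        self.ts_embed = TimeSeriesEmbedding(n_features, d_model)
        self.img_embed = PatchEmbedding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)      # shared backbone
        self.regression_head = nn.Linear(d_model, 1)                # e.g. PM2.5 forecasting
        self.classification_head = nn.Linear(d_model, n_classes)    # e.g. plant disease classes

    def forward(self, ts=None, img=None, task="regression"):
        tokens = []
        if ts is not None:
            tokens.append(self.ts_embed(ts))
        if img is not None:
            tokens.append(self.img_embed(img))
        assert tokens, "provide at least one input modality"
        z = self.encoder(torch.cat(tokens, dim=1)).mean(dim=1)      # fuse tokens, then pool
        if task == "regression":
            return self.regression_head(z)
        return self.classification_head(z)
```

Sharing one encoder across tasks is what amortizes parameters and computation across the task set, which is consistent with the memory and efficiency gains the abstract attributes to the multi-task learners relative to single-task variants.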


