A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks

06/11/2023
by   Saidul Islam, et al.

The transformer is a deep neural network that employs a self-attention mechanism to comprehend the contextual relationships within sequential data. Unlike conventional neural networks or improved variants of Recurrent Neural Networks (RNNs) such as Long Short-Term Memory (LSTM), transformer models excel at handling long-range dependencies between input sequence elements and enable parallel processing. As a result, transformer-based models have attracted substantial interest among researchers in the field of artificial intelligence. This can be attributed to their immense potential and remarkable achievements, not only in Natural Language Processing (NLP) tasks but also in a wide range of domains, including computer vision, audio and speech processing, healthcare, and the Internet of Things (IoT). Although several survey papers have been published highlighting the transformer's contributions in specific fields, architectural differences, or performance evaluations, there is still a significant absence of a comprehensive survey encompassing its major applications across various domains. We therefore fill this gap by conducting an extensive survey of transformer models proposed from 2017 to 2022. Our survey identifies the top five application domains for transformer-based models, namely: NLP, Computer Vision, Multi-Modality, Audio and Speech Processing, and Signal Processing. We analyze the impact of highly influential transformer-based models in these domains and classify them by task using a proposed taxonomy. Our aim is to shed light on the existing potential and future possibilities of transformers for interested researchers, thus contributing to the broader understanding of this groundbreaking technology.
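The self-attention mechanism the abstract refers to can be illustrated with a minimal NumPy sketch of scaled dot-product attention, softmax(QKᵀ/√d_k)V. This is a generic textbook illustration, not code from the survey; the array shapes and the toy input are assumptions for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity of each query with each key
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted mix of the value rows

# Toy self-attention: a sequence of 3 tokens with 4-dimensional embeddings,
# where queries, keys, and values all come from the same input X.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

Because every output row attends to every input row in a single matrix product, all positions are processed in parallel, which is the property the abstract contrasts with the sequential computation of RNNs.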
