LightSeq2: Accelerated Training for Transformer-based Models on GPUs

10/12/2021
by   Xiaohui Wang, et al.

Transformer-based neural models are used in many AI applications. Training these models is expensive, requiring substantial GPU resources and long running time. It is also challenging: typical data such as sentences have variable lengths, and the Transformer's computation patterns are more complex than those of convolutional neural networks. Existing systems focus either on model inference or on optimizing only BERT-like encoder models. In this paper, we present LightSeq2, a system that accelerates the training of a general family of Transformer models on GPUs. We propose a series of GPU optimization techniques tailored to the specific computation flow and memory access patterns of Transformer models. LightSeq2 supports many model architectures, including BERT (encoder-only), GPT (decoder-only), Transformer (encoder-decoder), and vision Transformer. Our experiments on a variety of models and benchmarks show that LightSeq2 is consistently faster (1.4-3.5x) than previous systems on different GPUs. In particular, it gains a 308% training speedup on a large public machine translation benchmark (WMT14 English-German).


