fairseq: A Fast, Extensible Toolkit for Sequence Modeling

04/01/2019
by Myle Ott, et al.

fairseq is an open-source sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling, and other text generation tasks. The toolkit is based on PyTorch and supports distributed training across multiple GPUs and machines. We also support fast mixed-precision training and inference on modern GPUs. A demo video can be found at https://www.youtube.com/watch?v=OtgDdWtHvto.
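The workflow the abstract describes is typically driven through fairseq's command-line tools. A minimal sketch of the preprocess/train/generate pipeline is below; the dataset paths, language pair, and hyperparameter values are illustrative assumptions, not taken from the paper:

```shell
# Sketch of the fairseq CLI workflow (assumes `pip install fairseq`;
# paths and hyperparameters below are illustrative, not from the paper).

# 1. Binarize a parallel corpus (expects files like train.de / train.en).
fairseq-preprocess \
    --source-lang de --target-lang en \
    --trainpref data/train --validpref data/valid --testpref data/test \
    --destdir data-bin/iwslt14.de-en

# 2. Train a Transformer; --fp16 enables the mixed-precision training
#    mentioned in the abstract.
fairseq-train data-bin/iwslt14.de-en \
    --arch transformer_iwslt_de_en \
    --optimizer adam --lr 5e-4 --lr-scheduler inverse_sqrt \
    --warmup-updates 4000 --max-tokens 4096 \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --fp16

# 3. Translate the test set with beam search.
fairseq-generate data-bin/iwslt14.de-en \
    --path checkpoints/checkpoint_best.pt \
    --beam 5 --remove-bpe
```

Multi-GPU and multi-machine training use the same `fairseq-train` entry point, with the distributed world size configured via additional flags or a launcher.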


