BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

10/11/2018
by Jacob Devlin, et al.

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT representations can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications. BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE benchmark to 80.4% (7.6% absolute improvement), MultiNLI accuracy to 86.7% (5.6% absolute improvement), and the SQuAD v1.1 question answering Test F1 to 93.2 (1.5 absolute improvement), outperforming human performance by 2.0%.
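
As a concrete illustration of the "one additional output layer" claim, the sketch below wraps a pre-trained BERT encoder with a single linear classification head and fine-tunes everything jointly. It is a minimal PyTorch example written against the Hugging Face transformers package, which is an assumption made here for illustration only; the official release is the TensorFlow repository listed under Code Repositories below.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertClassifier(nn.Module):
    """Pre-trained bidirectional encoder plus one task-specific output layer."""
    def __init__(self, num_labels: int = 2, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)        # pre-trained BERT encoder
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)  # the single added layer

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls_repr = outputs.last_hidden_state[:, 0]               # representation of the [CLS] token
        return self.classifier(cls_repr)                         # task logits

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertClassifier(num_labels=2)

batch = tokenizer(["BERT is conceptually simple and empirically powerful."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])

# During fine-tuning, the encoder and the new output layer are updated jointly
# with an ordinary cross-entropy loss on the downstream task's labels.
loss = nn.functional.cross_entropy(logits, torch.tensor([1]))
loss.backward()

The same pattern covers the tasks named in the abstract: question answering replaces the classification head with span start/end output layers, and sentence-pair tasks such as language inference feed both sentences to the encoder as a single packed sequence.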

Related Research

05/08/2019

Unified Language Model Pre-training for Natural Language Understanding and Generation

This paper presents a new Unified pre-trained Language Model (UniLM) tha...
09/09/2019

Span Selection Pre-training for Question Answering

BERT (Bidirectional Encoder Representations from Transformers) and relat...
10/19/2021

Ensemble ALBERT on SQuAD 2.0

Machine question answering is an essential yet challenging task in natur...
03/16/2020

TRANS-BLSTM: Transformer with Bidirectional LSTM for Language Understanding

Bidirectional Encoder Representations from Transformers (BERT) has recen...
07/29/2021

Adapting GPT, GPT-2 and BERT Language Models for Speech Recognition

Language models (LMs) pre-trained on massive amounts of text, in particu...
04/15/2020

lamBERT: Language and Action Learning Using Multimodal BERT

Recently, the bidirectional encoder representations from transformers (B...
03/30/2021

Kaleido-BERT: Vision-Language Pre-training on Fashion Domain

We present a new vision-language (VL) pre-training model dubbed Kaleido-...

Code Repositories

bert

TensorFlow code and pre-trained models for BERT

BERT-pytorch

PyTorch implementation of Google AI's 2018 BERT

keras-bert

Bidirectional Encoder Representations from Transformers

NLP-BERT--ChineseVersion

A copy of Google's BERT model code

bert-chainer

Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"