Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees

03/07/2021
by Jiangang Bai, et al.

Pre-trained language models like BERT achieve superior performance on various NLP tasks without explicitly considering syntactic information. Meanwhile, syntactic information has proven crucial to the success of many NLP applications. However, how to incorporate syntax trees effectively and efficiently into pre-trained Transformers remains an open problem. In this paper, we address it by proposing a novel framework named Syntax-BERT. The framework works in a plug-and-play mode and is applicable to any pre-trained checkpoint based on the Transformer architecture. Experiments on various natural language understanding datasets verify the effectiveness of syntax trees and show consistent improvements over multiple pre-trained models, including BERT, RoBERTa, and T5.
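The core idea described in the abstract, restricting self-attention according to a syntax tree, can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes a single mask in which each token attends to itself, its dependency head, and its children, whereas Syntax-BERT derives several tree-based masks and aggregates the resulting sub-networks.

```python
import numpy as np

def tree_to_mask(parents):
    """Build a boolean self-attention mask from a dependency tree.

    parents[i] is the index of token i's head (-1 for the root).
    Each token may attend to itself, its head, and its children --
    a simplified stand-in for the richer tree-based masks used in
    Syntax-BERT (assumption: this single-mask variant is illustrative).
    """
    n = len(parents)
    mask = np.eye(n, dtype=bool)
    for child, head in enumerate(parents):
        if head >= 0:
            mask[child, head] = True   # attend to the head
            mask[head, child] = True   # attend to the children
    return mask

def masked_self_attention(x, mask):
    """Single-head self-attention restricted to the syntax mask."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)          # queries = keys = x for brevity
    scores = np.where(mask, scores, -1e9)  # block non-tree token pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

# Toy example: three tokens with dependency heads [1, 2, -1]
x = np.random.default_rng(0).normal(size=(3, 4))
out = masked_self_attention(x, tree_to_mask([1, 2, -1]))
```

In a plug-and-play setting, such a mask is applied on top of an existing pre-trained checkpoint's attention scores, so no weights need to be retrained from scratch.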


