
BERT-GT: Cross-sentence n-ary relation extraction with BERT and Graph Transformer

01/11/2021
by Po-Ting Lai, et al.

A biomedical relation statement is commonly expressed across multiple sentences and involves many concepts, including genes, diseases, chemicals, and mutations. To automatically extract information from biomedical literature, existing biomedical text-mining approaches typically formulate the problem as a cross-sentence n-ary relation-extraction task that detects relations among n entities across multiple sentences, and they use either a graph neural network (GNN) with long short-term memory (LSTM) or an attention mechanism. Recently, the Transformer has been shown to outperform LSTM on many natural language processing (NLP) tasks. In this work, we propose a novel architecture that combines Bidirectional Encoder Representations from Transformers with a Graph Transformer (BERT-GT) by integrating a neighbor-attention mechanism into the BERT architecture. Unlike the original Transformer architecture, which uses the whole sentence(s) to calculate the attention of the current token, the neighbor-attention mechanism in our method calculates a token's attention using only its neighboring tokens. Thus, each token can attend to its neighbor information with little noise, which we show is critically important when the text is very long, as in cross-sentence or abstract-level relation-extraction tasks. Our benchmarking results show improvements of 5.44 and 3.89 in accuracy and F1-measure on the cross-sentence n-ary and chemical-protein relation datasets, respectively, suggesting that BERT-GT is a robust approach that is applicable to other biomedical relation-extraction tasks or datasets.
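To illustrate the neighbor-attention idea described in the abstract, below is a minimal PyTorch-style sketch of self-attention restricted to neighboring tokens. The `neighbor_attention` function, the window-based `neighbor_mask`, and all dimensions are illustrative assumptions, not the authors' BERT-GT implementation, where neighbors are derived from the input graph and the mechanism is integrated into BERT's layers.

```python
import torch
import torch.nn.functional as F

def neighbor_attention(x, neighbor_mask, w_q, w_k, w_v):
    """Self-attention restricted to neighboring tokens (illustrative sketch).

    x:             (seq_len, d_model) token representations
    neighbor_mask: (seq_len, seq_len) boolean; True where token j is a
                   neighbor of token i, False elsewhere
    w_q, w_k, w_v: (d_model, d_model) projection matrices
    """
    d_model = x.size(-1)
    q, k, v = x @ w_q, x @ w_k, x @ w_v                 # linear projections
    scores = (q @ k.T) / d_model ** 0.5                 # scaled dot-product scores
    scores = scores.masked_fill(~neighbor_mask, float("-inf"))  # drop non-neighbors
    weights = F.softmax(scores, dim=-1)                 # attention only over neighbors
    return weights @ v                                  # neighbor-aggregated representations

# Illustrative usage: 6 tokens, each attending to itself and its adjacent tokens.
seq_len, d_model = 6, 16
x = torch.randn(seq_len, d_model)
idx = torch.arange(seq_len)
neighbor_mask = (idx[:, None] - idx[None, :]).abs() <= 1   # self + left/right neighbor
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
out = neighbor_attention(x, neighbor_mask, w_q, w_k, w_v)
print(out.shape)  # torch.Size([6, 16])
```

Masking non-neighbor positions to negative infinity before the softmax distributes each token's attention weights only over its neighbors, which is the property the abstract highlights as reducing noise on long, cross-sentence inputs.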

04/22/2021

Enriched Attention for Robust Relation Extraction

The performance of relation extraction models has increased considerably...
04/08/2022

BioRED: A Comprehensive Biomedical Relation Extraction Dataset

Automated relation extraction (RE) from biomedical literature is critica...
11/01/2020

Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism

With the explosive growth of biomedical literature, designing automatic ...
01/01/2021

MrGCN: Mirror Graph Convolution Network for Relation Extraction with Long-Term Dependencies

The ability to capture complex linguistic structures and long-term depen...
06/18/2019

Transfer Learning for Causal Sentence Detection

We consider the task of detecting sentences that express causality, as a...
12/23/2019

Combining Context and Knowledge Representations for Chemical-Disease Relation Extraction

Automatically extracting the relationships between chemicals and disease...
08/28/2018

N-ary Relation Extraction using Graph State LSTM

Cross-sentence n-ary relation extraction detects relations among n entit...