Neural Contract Element Extraction Revisited

01/12/2021
by Ilias Chalkidis, et al.

We investigate contract element extraction. We show that LSTM-based encoders perform better than dilated CNNs, Transformers, and BERT on this task. We also find that domain-specific Word2Vec embeddings outperform generic pre-trained GloVe embeddings. Morpho-syntactic features in the form of POS-tag and token-shape embeddings, as well as context-aware ELMo embeddings, do not improve performance. Several of these observations contradict choices or findings of previous work on contract element extraction and generic sequence-labeling tasks, indicating that contract element extraction requires careful task-specific choices.
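The winning setup described above — an LSTM-based encoder over word embeddings producing a label per token — is a standard sequence-labeling architecture. A minimal sketch of such a tagger in PyTorch follows; all dimensions, the vocabulary size, and the label count are illustrative placeholders, not the authors' actual configuration, and the embedding layer stands in for pre-trained domain-specific Word2Vec vectors.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal BiLSTM sequence tagger: embeddings -> BiLSTM -> per-token logits."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, num_labels):
        super().__init__()
        # In the paper's setting this layer would be initialized from
        # domain-specific Word2Vec vectors rather than trained from scratch.
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Bidirectional LSTM doubles the hidden size fed to the classifier.
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (batch, seq_len, emb_dim)
        h, _ = self.lstm(x)            # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(h)      # (batch, seq_len, num_labels)

# Hypothetical usage: 2 sentences of 5 tokens, 7 BIO-style element labels.
model = BiLSTMTagger(vocab_size=1000, emb_dim=50, hidden_dim=64, num_labels=7)
tokens = torch.randint(0, 1000, (2, 5))
logits = model(tokens)
print(logits.shape)  # torch.Size([2, 5, 7])
```

Training would pair these per-token logits with a cross-entropy loss over the gold element labels; the paper's comparison swaps this BiLSTM encoder for dilated CNNs, Transformers, or BERT while keeping the tagging framing fixed.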


