Legal Transformer Models May Not Always Help

09/14/2021
by   Saibo Geng, et al.

Deep learning-based Natural Language Processing methods, especially transformers, have achieved impressive performance in recent years. Applying these state-of-the-art NLP methods to legal tasks, to automate or simplify routine work, is of great value. This work investigates the value of domain-adaptive pre-training and language adapters for legal NLP tasks. By comparing the performance of language models with domain-adaptive pre-training across different tasks and different dataset splits, we show that domain-adaptive pre-training helps only on low-resource downstream tasks, and is thus far from a panacea. We also benchmark the performance of adapters on a typical legal NLP task and show that they can match full model fine-tuning at a much smaller training cost. As an additional result, we release LegalRoBERTa, a RoBERTa model further pre-trained on legal corpora.
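The cost advantage of adapters that the abstract mentions comes from training only small bottleneck modules while the base model stays frozen. As a rough illustration (not the paper's exact setup), the sketch below estimates the trainable-parameter count for Houlsby-style bottleneck adapters on a RoBERTa-base-sized model; all sizes (hidden width 768, 12 layers, bottleneck 64, two adapters per layer, ~125M total parameters) are assumed, typical values, not figures from the paper.

```python
# Back-of-the-envelope comparison: full fine-tuning vs. bottleneck
# adapters on a RoBERTa-base-sized encoder. All sizes are assumptions
# chosen to be typical, not values reported in the paper.

def adapter_params(hidden_size: int, bottleneck: int) -> int:
    """Parameters of one bottleneck adapter: a down-projection and an
    up-projection, each with a bias (the residual connection adds none)."""
    down = hidden_size * bottleneck + bottleneck
    up = bottleneck * hidden_size + hidden_size
    return down + up

HIDDEN, LAYERS, BOTTLENECK = 768, 12, 64  # RoBERTa-base-like dimensions
ADAPTERS_PER_LAYER = 2                    # one after attention, one after the FFN

full_model = 125_000_000                  # approximate RoBERTa-base size
adapters_total = LAYERS * ADAPTERS_PER_LAYER * adapter_params(HIDDEN, BOTTLENECK)

print(adapters_total)               # 2,379,264 trainable parameters
print(adapters_total / full_model)  # under 2% of full fine-tuning
```

With these assumed sizes, adapter tuning updates roughly 2% of the parameters that full fine-tuning would, which is the kind of gap that makes the "similar performance at much smaller training cost" finding practically interesting.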


