Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain

08/25/2021
by Reto Gubelmann, et al.

In this article, we explore the potential of transformer-based language models (LMs) to correctly represent normative statements in the legal domain, taking tax law as our use case. In our experiment, we use a variety of LMs as bases for both word- and sentence-based clusterers, which are then evaluated on a small, expert-compiled test set consisting of real-world samples from the tax law research literature that can be clearly assigned to one of four normative theories. The results of the experiment show that clusterers based on sentence-BERT embeddings deliver the most promising results. Building on this main experiment, we make a first attempt at using the best-performing models in a bootstrapping loop to build classifiers that map normative claims onto one of these four normative theories.
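To make the experimental setup more concrete, the sketch below illustrates the kind of pipeline the abstract describes: embedding normative claims with a sentence-BERT-style model and clustering them into four groups, one per normative theory. This is not the authors' code; the model name, the sample claims, the gold labels, and the evaluation metric are illustrative assumptions.

```python
# Minimal sketch of a sentence-embedding clustering pipeline (not the authors' code).
# Assumes the sentence-transformers and scikit-learn packages; the model name,
# sample sentences, and gold labels below are illustrative placeholders.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Hypothetical test set: normative claims, each associated with one of four theories.
claims = [
    "Taxes are justified insofar as they maximize overall welfare.",
    "Each taxpayer is entitled to the full fruits of their labour.",
    "Tax burdens should be distributed according to ability to pay.",
    "A just tax system secures fair equality of opportunity.",
]
gold_labels = [0, 1, 2, 3]  # placeholder expert assignments to the four theories

# Sentence-BERT-style embeddings (the best-performing basis in the experiment).
model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model choice
embeddings = model.encode(claims)

# Unsupervised clustering into four groups, one per normative theory.
clusterer = KMeans(n_clusters=4, random_state=0, n_init=10)
predicted = clusterer.fit_predict(embeddings)

# Compare cluster assignments against the expert labels (permutation-invariant score).
print("Adjusted Rand Index:", adjusted_rand_score(gold_labels, predicted))
```

In a bootstrapping loop of the kind mentioned above, the highest-confidence cluster assignments could then be fed back as pseudo-labels to train a supervised classifier, though the specifics of the authors' procedure are not detailed in the abstract.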
