Detecting Spells in Fantasy Literature with a Transformer Based Artificial Intelligence

08/07/2023
by Marcel Moravek, et al.

Transformer architectures and models have made significant progress in language-based tasks. In this area, BERT is one of the most widely used and freely available transformer architectures. In our work, we use BERT for context-based phrase recognition of magic spells in the Harry Potter novel series. Spells are a common element of active magic in fantasy novels and are typically used in a specific context to achieve a supernatural effect. We conducted a series of investigations to determine whether a transformer architecture can recognise such phrases based on their context in the Harry Potter saga. For our studies, a pre-trained BERT model was fine-tuned with different datasets and training methods to identify the searched context. By considering different approaches for sequence classification as well as token classification, we show that the context of spells can be recognised. According to our investigations, the sequence length used for fine-tuning and validation of the model plays a significant role in context recognition. Building on this, we investigated whether spells have overarching properties that would allow our neural network models to transfer to other fantasy universes as well. The application of our model showed promising results that are worth deepening in subsequent studies.
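The token-classification approach mentioned in the abstract requires per-token labels for fine-tuning. As a minimal sketch of how such labels could be derived, the snippet below tags tokens of known spell phrases in BIO format; the `SPELLS` set and the `bio_labels` helper are illustrative assumptions, not taken from the paper:

```python
# Hypothetical BIO-style labelling for token classification:
# the first token of a known spell phrase gets B-SPELL, subsequent
# tokens get I-SPELL, and all other tokens get O.
SPELLS = {("Expecto", "Patronum"), ("Alohomora",)}  # toy spell lexicon

def bio_labels(tokens):
    """Return one BIO label per input token."""
    labels = ["O"] * len(tokens)
    for spell in SPELLS:
        n = len(spell)
        # Slide over the token list looking for an exact phrase match.
        for i in range(len(tokens) - n + 1):
            if tuple(tokens[i:i + n]) == spell:
                labels[i] = "B-SPELL"
                for j in range(i + 1, i + n):
                    labels[j] = "I-SPELL"
    return labels

print(bio_labels("He cried Expecto Patronum loudly".split()))
# → ['O', 'O', 'B-SPELL', 'I-SPELL', 'O']
```

A fine-tuned token-classification model would be trained to predict these labels from context alone, so that it can also flag spell-like phrases absent from the lexicon.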


