Typhoon: Towards an Effective Task-Specific Masking Strategy for Pre-trained Language Models

By exploiting the high degree of parallelism offered by graphics processing units, transformer architectures have driven tremendous strides in natural language processing. In a traditional masked language model, special [MASK] tokens prompt the model to gather contextual information from surrounding words in order to restore the hidden tokens. In this paper, we explore a task-specific masking framework for pre-trained language models that improves performance on particular downstream tasks from the GLUE benchmark. We develop our own masking algorithm, Typhoon, based on token input gradients, and compare it with standard baselines. We find that Typhoon offers performance competitive with whole-word masking on the MRPC dataset. Our implementation is available in a public GitHub repository.
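The abstract does not spell out Typhoon's procedure, but the core idea of gradient-guided masking can be sketched simply. The helper below is a hypothetical illustration (not the paper's actual implementation): assuming a per-token saliency score, such as the norm of each token's input gradient, has already been computed, it masks the highest-saliency tokens rather than sampling positions uniformly at random.

```python
# Minimal sketch of gradient-guided masking (hypothetical helper, not the
# paper's Typhoon implementation). Assumes per-token input-gradient norms
# (saliencies) are already available from a backward pass.

def gradient_guided_mask(tokens, saliencies, mask_rate=0.15, mask_token="[MASK]"):
    """Replace the highest-saliency tokens with mask_token."""
    if len(tokens) != len(saliencies):
        raise ValueError("tokens and saliencies must have the same length")
    n_mask = max(1, round(len(tokens) * mask_rate))
    # Indices of the n_mask tokens with the largest gradient norms.
    top = sorted(range(len(tokens)), key=lambda i: saliencies[i], reverse=True)[:n_mask]
    chosen = set(top)
    return [mask_token if i in chosen else tok for i, tok in enumerate(tokens)]

tokens = ["the", "movie", "was", "surprisingly", "good"]
saliencies = [0.10, 0.80, 0.05, 0.60, 0.90]
print(gradient_guided_mask(tokens, saliencies, mask_rate=0.4))
# → ['the', '[MASK]', 'was', 'surprisingly', '[MASK]']
```

In contrast to uniform random masking, this concentrates the reconstruction objective on tokens the model's loss is most sensitive to, which is the intuition behind input-gradient-based strategies.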


