AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters

08/18/2021
by Tilman Beck, et al.

The open-access dissemination of pretrained language models through online repositories has led to a democratization of state-of-the-art natural language processing (NLP) research. It also allows people outside of NLP to use such models and adapt them to specific use cases. However, a certain amount of technical proficiency is still required, which poses an entry barrier for users who want to apply these models to a given task but lack the necessary knowledge or resources. In this work, we aim to close this gap by providing a tool that allows researchers to leverage pretrained models without writing a single line of code. Built upon the parameter-efficient adapter modules for transfer learning, our AdapterHub Playground provides an intuitive interface that lets users apply adapters to the prediction, training, and analysis of textual data for a variety of NLP tasks. We present the tool's architecture and demonstrate its advantages with prototypical use cases, showing that predictive performance can easily be increased in a few-shot learning scenario. Finally, we evaluate its usability in a user study. We provide the code and a live interface at https://adapter-hub.github.io/playground.
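While the Playground itself requires no programming, the adapter mechanism it builds on can be illustrated in a few lines with the adapter-transformers library that powers AdapterHub. The sketch below is illustrative rather than taken from the paper: it assumes the bert-base-uncased backbone and the sentiment/sst-2@ukp adapter identifier, and exact class or method names may differ across library versions.

    # pip install adapter-transformers  (AdapterHub's fork of HuggingFace Transformers)
    from transformers import AutoModelWithHeads, AutoTokenizer

    # Load a pretrained backbone with support for adapters and prediction heads.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

    # Download a small task adapter (with its prediction head) from AdapterHub;
    # the identifier here is an illustrative example.
    adapter_name = model.load_adapter("sentiment/sst-2@ukp", source="ah")
    model.set_active_adapters(adapter_name)

    # Run a prediction: only the lightweight adapter weights are task-specific,
    # while the large backbone stays shared across tasks.
    inputs = tokenizer("The Playground makes adapters easy to use.", return_tensors="pt")
    prediction = model(**inputs).logits.argmax(dim=-1)
    print(prediction)

In the Playground, these steps run behind the graphical interface; few-shot training of a loaded adapter on a handful of labeled examples follows the same pattern, with the backbone frozen and only the adapter parameters updated.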


