AdapterHub Playground: Simple and Flexible Few-Shot Learning with Adapters

08/18/2021
by Tilman Beck, et al.

The open-access dissemination of pretrained language models through online repositories has led to a democratization of state-of-the-art natural language processing (NLP) research. This also allows people outside of NLP to use such models and adapt them to specific use-cases. However, a certain amount of technical proficiency is still required, which poses an entry barrier for users who want to apply these models to a certain task but lack the necessary knowledge or resources. In this work, we aim to overcome this gap by providing a tool that allows researchers to leverage pretrained models without writing a single line of code. Built upon parameter-efficient adapter modules for transfer learning, our AdapterHub Playground provides an intuitive interface, allowing the use of adapters for prediction, training, and analysis of textual data for a variety of NLP tasks. We present the tool's architecture and demonstrate its advantages with prototypical use-cases, where we show that predictive performance can easily be increased in a few-shot learning scenario. Finally, we evaluate its usability in a user study. We provide the code and a live interface at https://adapter-hub.github.io/playground.
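
For readers who want to see what the Playground hides behind its interface, the snippet below is a minimal sketch of the adapter workflow it builds on, assuming the adapter-transformers library that powers AdapterHub; the base model and the adapter identifier "sentiment/sst-2@ukp" are illustrative placeholders rather than choices made by the tool itself.

```python
# Minimal sketch of loading a pretrained task adapter for inference,
# assuming adapter-transformers (the library behind AdapterHub) is installed.
# The adapter identifier below is illustrative and may not match an
# existing entry on the Hub.
import torch
from transformers import AutoTokenizer, AutoModelWithHeads

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Download a task adapter (with its prediction head) from AdapterHub
# and activate it, leaving the pretrained transformer weights untouched.
adapter_name = model.load_adapter("sentiment/sst-2@ukp", source="ah")
model.set_active_adapters(adapter_name)

# Run a single prediction, the kind of step the Playground exposes
# through its point-and-click interface.
inputs = tokenizer("The interface is intuitive and easy to use.",
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```

Because only the small adapter weights are downloaded and trained while the full language model stays frozen, adapter-based few-shot training remains comparatively lightweight, which is what the Playground exploits to make the prediction and training workflow accessible without code.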

Related research

09/06/2021 · GPT-3 Models are Poor Few-Shot Learners in the Biomedical Domain
08/22/2019 · Improving Few-shot Text Classification via Pretrained Language Representations
04/26/2023 · Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond
12/02/2019 · EduBERT: Pretrained Deep Language Models for Learning Analytics
10/09/2019 · HuggingFace's Transformers: State-of-the-art Natural Language Processing
07/23/2023 · Validation of a Zero-Shot Learning Natural Language Processing Tool for Data Abstraction from Unstructured Healthcare Data
07/21/2021 · Design of a Graphical User Interface for Few-Shot Machine Learning Classification of Electron Microscopy Data
