TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing

02/28/2020
by Ziqing Yang, et al.

In this paper, we introduce TextBrewer, an open-source knowledge distillation toolkit designed for natural language processing. It works with different neural network models and supports various kinds of tasks, such as text classification, reading comprehension, and sequence labeling. TextBrewer provides a simple and uniform workflow that enables quick setup of distillation experiments with highly flexible configurations. It offers a set of predefined distillation methods and can be extended with custom code. As a case study, we use TextBrewer to distill BERT on several typical NLP tasks. With simple configurations, we achieve results that are comparable to, or even higher than, state-of-the-art performance. Our toolkit is available at http://textbrewer.hfl-rc.com
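The "simple and uniform workflow" mentioned above is configuration-driven: a trained teacher and a smaller student are wrapped in a distiller object, the losses are described in a configuration, and a single train call runs the distillation. Below is a minimal, self-contained sketch in the spirit of the toolkit's public quickstart. The ToyClassifier models, random data, and the particular layer match are illustrative stand-ins rather than anything from the paper, and exact argument names may differ across TextBrewer versions.

```python
# Sketch of a TextBrewer distillation run. ToyClassifier, the random
# dataset, and all hyperparameters are illustrative placeholders; in
# practice the teacher would be, e.g., a fine-tuned BERT.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from textbrewer import GeneralDistiller, TrainingConfig, DistillationConfig

class ToyClassifier(nn.Module):
    """Stand-in model that returns (logits, [hidden_state])."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.encoder = nn.Linear(32, hidden_dim)
        self.head = nn.Linear(hidden_dim, 2)

    def forward(self, x):
        h = torch.tanh(self.encoder(x))
        return self.head(h), [h]

teacher = ToyClassifier(hidden_dim=128)  # pretend this is already trained
student = ToyClassifier(hidden_dim=32)   # smaller model to be distilled

dataloader = DataLoader(TensorDataset(torch.randn(64, 32)), batch_size=8)
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)

# An adaptor maps raw model outputs to named features ('logits',
# 'hidden', ...) that the distillation losses consume. Writing a custom
# adaptor is one of the hooks for extending the toolkit with custom code.
def simple_adaptor(batch, model_outputs):
    return {'logits': model_outputs[0], 'hidden': model_outputs[1]}

train_config = TrainingConfig(device='cpu')
distill_config = DistillationConfig(
    temperature=8,          # soften teacher logits for the KD loss
    intermediate_matches=[  # additionally match one pair of hidden states
        {'layer_T': 0, 'layer_S': 0, 'feature': 'hidden',
         'loss': 'hidden_mse', 'weight': 1,
         # project the 32-d student state into the 128-d teacher space
         'proj': ['linear', 32, 128]}])

distiller = GeneralDistiller(
    train_config=train_config, distill_config=distill_config,
    model_T=teacher, model_S=student,
    adaptor_T=simple_adaptor, adaptor_S=simple_adaptor)

with distiller:
    distiller.train(optimizer, dataloader, num_epochs=1)
```

Because the losses and layer matches live in the configuration object, trying a different distillation recipe (or one of the other predefined distillers, such as the multi-teacher variant) is a configuration change rather than a code rewrite, which is what makes the quick experiment setup described in the abstract possible.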

Related research

04/30/2022
EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing
The success of Pre-Trained Models (PTMs) has reshaped the development of...

06/14/2018
NCRF++: An Open-source Neural Sequence Labeling Toolkit
This paper describes NCRF++, a toolkit for neural sequence labeling. NCR...

08/02/2019
Self-Knowledge Distillation in Natural Language Processing
Since deep learning became a key player in natural language processing (...

06/14/2021
Launching into clinical space with medspaCy: a new clinical text processing toolkit in Python
Despite impressive success of machine learning algorithms in clinical na...

11/16/2020
NLPGym – A toolkit for evaluating RL agents on Natural Language Processing Tasks
Reinforcement learning (RL) has recently shown impressive performance in...

09/20/2023
Language-Oriented Communication with Semantic Coding and Knowledge Distillation for Text-to-Image Generation
By integrating recent advances in large language models (LLMs) and gener...

04/21/2019
NeuronBlocks -- Building Your NLP DNN Models Like Playing Lego
When building deep neural network models for natural language processing...
