Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates

05/07/2021
by Yuqing Xie, et al.

The behavior of deep neural networks can be inconsistent between different versions of a model. Regressions during a model update are a common source of concern and often outweigh the benefits of gains in accuracy or efficiency. This work focuses on quantifying, reducing, and analyzing regression errors in NLP model updates. Using the negative flip rate as a regression measure, we show that regressions are prevalent across tasks in the GLUE benchmark. We formulate regression-free model updates as a constrained optimization problem, and further reduce it to a relaxed form that can be approximately optimized through a knowledge distillation training method. We empirically analyze how model ensembles reduce regression. Finally, we conduct CheckList behavioral testing to understand the distribution of regressions across linguistic phenomena, and the efficacy of the ensemble and distillation methods.
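The negative flip rate mentioned above is simply the fraction of test examples that the old model classified correctly but the new model gets wrong. A minimal sketch of the metric (the function name and array-based interface are illustrative, not from the paper):

```python
import numpy as np

def negative_flip_rate(y_true, old_pred, new_pred):
    """Fraction of examples the old model got right but the new model flips to wrong."""
    y_true, old_pred, new_pred = map(np.asarray, (y_true, old_pred, new_pred))
    # A negative flip: old prediction correct AND new prediction incorrect.
    negative_flips = (old_pred == y_true) & (new_pred != y_true)
    return float(negative_flips.mean())

# Toy example: the new model flips one of four previously correct predictions.
print(negative_flip_rate([0, 1, 1, 0], [0, 1, 1, 0], [0, 1, 0, 0]))  # -> 0.25
```

Note that NFR is independent of overall accuracy: a new model can be more accurate in aggregate yet still introduce negative flips, which is exactly the regression behavior the paper measures.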


