Reducing Model Jitter: Stable Re-training of Semantic Parsers in Production Environments

04/10/2022
by Christopher Hidey, et al.

Retraining modern deep learning systems can produce variations in model performance, even with identical data and hyper-parameters, simply by changing the random seed. We call this phenomenon model jitter. The issue is often exacerbated in production settings, where models are retrained on noisy data. In this work we tackle the problem of stable retraining with a focus on conversational semantic parsers. We first quantify the model jitter problem by introducing a model agreement metric and showing how jitter varies with dataset noise and model size. We then demonstrate the effectiveness of various jitter reduction techniques, such as ensembling and distillation. Lastly, we discuss practical trade-offs between these techniques and show that co-distillation provides a sweet spot in terms of jitter reduction for semantic parsing systems, with only a modest increase in resource usage.
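The abstract does not spell out the agreement metric, but a minimal sketch is exact-match agreement: the fraction of inputs on which two independently retrained parsers emit identical parses. The function name and the toy parse strings below are illustrative assumptions, not the paper's actual formulation or data.

```python
def model_agreement(preds_a, preds_b):
    """Fraction of inputs on which two models produce identical outputs.

    Illustrative sketch of an exact-match agreement metric; the paper's
    metric may differ (e.g. partial credit for near-identical parses).
    """
    assert len(preds_a) == len(preds_b), "prediction lists must align"
    matches = sum(a == b for a, b in zip(preds_a, preds_b))
    return matches / len(preds_a)


# Hypothetical parses from two retraining runs of the same parser:
run1 = ["(book_flight (dest SFO))", "(set_alarm (time 7am))", "(play (artist Adele))"]
run2 = ["(book_flight (dest SFO))", "(set_alarm (time 8am))", "(play (artist Adele))"]
print(model_agreement(run1, run2))  # 2 of 3 parses agree: 0.6666666666666666
```

An agreement of 1.0 would mean the retrained model is behaviorally indistinguishable from the previous one; jitter shows up as agreement below 1.0 even when aggregate accuracy is unchanged.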

