It was the training data pruning too!

We study the current best model (KDG) for question answering on tabular data, evaluated on the WikiTableQuestions dataset. Previous ablation studies of this model attributed its performance to certain aspects of its architecture. In this paper, we find that the model's performance also crucially depends on a certain pruning of the data used to train it. Disabling the pruning step drops the accuracy of the model from 43.3%, which suggests that the pruning may be a useful pre-processing step in training other semantic parsers as well.
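
The abstract does not spell out the pruning criterion itself, so the following is only an illustrative sketch of training-data pruning as a pre-processing step. The filter predicate `keep` (and the example `is_consistent` name) is a hypothetical placeholder, not the actual KDG implementation.

```python
from typing import Callable, List, Tuple

# A training example pairs a (question, table) input with its target answer.
Example = Tuple[str, dict, str]

def prune_training_data(
    examples: List[Example],
    keep: Callable[[Example], bool],
) -> List[Example]:
    """Drop training examples that fail the given filter before training.

    `keep` stands in for whatever criterion the training pipeline applies
    (e.g. "a candidate logical form reaching the gold answer was found");
    the specific rule is not described in this abstract.
    """
    kept = [ex for ex in examples if keep(ex)]
    print(f"Pruned {len(examples) - len(kept)} of {len(examples)} examples")
    return kept

# Hypothetical usage: train only on the pruned subset.
# pruned = prune_training_data(train_examples, keep=is_consistent)
# model.train(pruned)
```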
