Improving Interpretability via Explicit Word Interaction Graph Layer

02/03/2023
by Arshdeep Sekhon, et al.

Recent NLP literature has seen growing interest in improving model interpretability. Along this direction, we propose a trainable neural network layer that learns a global interaction graph between words and then uses the learned interactions to select the more informative words. Our layer, which we call WIGRAPH, can plug into any neural network-based NLP text classifier right after its word embedding layer. Across multiple state-of-the-art (SOTA) NLP models and various NLP datasets, we demonstrate that adding the WIGRAPH layer substantially improves the models' interpretability while also enhancing their prediction performance.
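The abstract does not spell out how the learned interaction graph feeds into word selection, so the following is only an illustrative NumPy sketch of one plausible forward pass, not the paper's actual method: a vocabulary-sized interaction matrix is sliced down to the tokens in a sentence, each word is scored by how strongly it interacts with the others, and the scores gate the word embeddings before they reach the classifier. The function name `wigraph_forward` and all shapes are assumptions for illustration.

```python
import numpy as np

def wigraph_forward(embeddings, token_ids, interaction):
    """Illustrative sketch: gate word embeddings using a learned
    word-interaction graph (hypothetical forward pass, not the
    paper's exact formulation)."""
    # Slice the global (vocab x vocab) interaction matrix down to the
    # sentence-level subgraph (seq_len x seq_len).
    G = interaction[np.ix_(token_ids, token_ids)]
    # Score each word by its total interaction with the other words.
    scores = G.sum(axis=1)
    # Softmax the scores into selection weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Gate the embeddings; a downstream classifier consumes the result,
    # so highly interacting words contribute more to the prediction.
    return embeddings * weights[:, None]

# Toy example: a 3-token sentence, 4-dim embeddings, 5-word vocabulary.
emb = np.ones((3, 4))
gated = wigraph_forward(emb, np.array([0, 1, 2]), np.zeros((5, 5)))
```

In a trainable version the interaction matrix would be a parameter updated jointly with the classifier, which is what lets interpretability emerge from training rather than from a post-hoc attribution method.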


Related research:

Attention Word Embedding (06/01/2020)
Word embedding models learn semantically rich vector representations of ...

Learning Variational Word Masks to Improve the Interpretability of Neural Text Classifiers (10/01/2020)
To build an interpretable neural text classifier, most of the prior work...

Sparsity Emerges Naturally in Neural Language Models (07/22/2019)
Concerns about interpretability, computational resources, and principled...

Interpreting and improving natural-language processing (in machines) with natural language-processing (in the brain) (05/28/2019)
Neural network models for NLP are typically implemented without the expl...

Attention Interpretability Across NLP Tasks (09/24/2019)
The attention layer in a neural network model provides insights into the...

Toward Interpretability of Dual-Encoder Models for Dialogue Response Suggestions (03/02/2020)
This work shows how to improve and interpret the commonly used dual enco...

WMDecompose: A Framework for Leveraging the Interpretable Properties of Word Mover's Distance in Sociocultural Analysis (10/14/2021)
Despite the increasing popularity of NLP in the humanities and social sc...
