The Role of Interactive Visualization in Explaining (Large) NLP Models: from Data to Inference

01/11/2023
by Richard Brath, et al.

With a constantly increasing number of learned parameters, modern neural language models are becoming ever more powerful. Yet explaining these complex models' behavior remains a largely unsolved problem. In this paper, we discuss the role interactive visualization can play in explaining NLP models (XNLP). We motivate the use of visualization in relation to target users and common NLP pipelines. We also present several use cases that provide concrete examples of XNLP with visualization. Finally, we point out an extensive list of research opportunities in this field.


