An Interpretability Illusion for BERT

04/14/2021
by Tolga Bolukbasi et al.

We describe an "interpretability illusion" that arises when analyzing the BERT model. Activations of individual neurons in the network may spuriously appear to encode a single, simple concept, when in fact they are encoding something far more complex. The same effect holds for linear combinations of activations. We trace the source of this illusion to geometric properties of BERT's embedding space as well as the fact that common text corpora represent only narrow slices of possible English sentences. We provide a taxonomy of model-learned concepts and discuss methodological implications for interpretability research, especially the importance of testing hypotheses on multiple data sets.
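The abstract's methodological point can be made concrete with a small probing script. The sketch below is not the authors' code; the model checkpoint, the layer and neuron indices, and the two toy corpora are all illustrative assumptions. It ranks sentences by how strongly they activate a single BERT hidden-state neuron, so the same "this neuron encodes concept X" hypothesis can be checked on more than one corpus, as the paper recommends.

```python
# Minimal sketch of a neuron-probing setup, assuming the Hugging Face
# `transformers` library. LAYER, NEURON, and the corpora are arbitrary
# illustrative choices, not values from the paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

LAYER, NEURON = 8, 120  # hypothetical layer / neuron index

def max_activation(sentence: str) -> float:
    """Max activation of one hidden-state neuron over the sentence's tokens."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).hidden_states[LAYER]  # (1, seq_len, 768)
    return hidden[0, :, NEURON].max().item()

def top_sentences(corpus, k=3):
    """Sentences that most strongly activate the chosen neuron."""
    return sorted(corpus, key=max_activation, reverse=True)[:k]

# Two toy corpora standing in for different text distributions.
corpus_a = [
    "The stock market fell sharply after the announcement.",
    "Interest rates were held steady by the central bank.",
    "The quarterly earnings report beat expectations.",
    "Investors rotated out of technology shares.",
]
corpus_b = [
    "The enzyme catalyzes the reaction at low temperatures.",
    "The protein folds into a stable helix.",
    "Mitochondria supply the cell with energy.",
    "The bacteria were cultured overnight in the lab.",
]

# If the neuron truly encoded one simple concept, the two lists should look
# thematically alike. The "illusion" the paper describes is that each corpus
# alone can suggest a clean but *different* concept for the same neuron.
print(top_sentences(corpus_a))
print(top_sentences(corpus_b))
```

The same comparison applies to linear combinations of activations: replace the single-neuron readout with a dot product against a fixed direction vector and re-run it on each corpus.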

