Sparsity Emerges Naturally in Neural Language Models

07/22/2019
by Naomi Saphra, et al.

Concerns about interpretability, computational resources, and principled inductive priors have motivated efforts to engineer sparse neural models for NLP tasks. If sparsity is important for NLP, might well-trained neural models naturally become roughly sparse? Using the Taxi-Euclidean norm to measure sparsity, we find that frequent input words are associated with concentrated or sparse activations, while frequent target words are associated with dispersed activations but concentrated gradients. We find that gradients associated with function words are more concentrated than the gradients of content words, even controlling for word frequency.
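
The abstract does not spell out the measure, but assuming the Taxi-Euclidean norm is the ratio of a vector's L1 (taxicab) norm to its L2 (Euclidean) norm, a minimal sketch of how it could be computed over an activation or gradient vector follows; the helper name taxi_euclidean_norm and the example vectors are illustrative, not from the paper.

import numpy as np

def taxi_euclidean_norm(v):
    """Ratio of the L1 (taxicab) norm to the L2 (Euclidean) norm.

    For a d-dimensional vector this ranges from 1 (a single nonzero entry,
    maximally concentrated) to sqrt(d) (all entries equal in magnitude,
    maximally dispersed), so lower values indicate sparser vectors.
    """
    v = np.asarray(v, dtype=float)
    l2 = np.linalg.norm(v, ord=2)
    if l2 == 0:
        return 0.0  # convention chosen here for the all-zero vector
    return np.linalg.norm(v, ord=1) / l2

# Example: a concentrated vector vs. a dispersed one
print(taxi_euclidean_norm([5, 0, 0, 0]))  # 1.0  (concentrated / sparse)
print(taxi_euclidean_norm([1, 1, 1, 1]))  # 2.0  (= sqrt(4), dispersed)

Under this reading, the finding that frequent input words have concentrated activations corresponds to lower Taxi-Euclidean norms for their activation vectors, while dispersed activations correspond to values near sqrt(d).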
