The Power of External Memory in Increasing Predictive Model Capacity

01/31/2023
by Cenk Baykal et al.

One way of introducing sparsity into deep networks is to attach an external table of parameters that is sparsely looked up at different layers of the network. By storing the bulk of the parameters in the external table, one can increase the capacity of the model without necessarily increasing inference time. Two crucial questions in this setting are: what is the lookup function for accessing the table, and how are the contents of the table consumed? Prominent methods for accessing the table include 1) using word/wordpiece token IDs as table indices, 2) locality-sensitive hashing (LSH) of the token vector at each layer into a table of buckets, and 3) learnable softmax-style routing to a table entry. Ways to consume the contents include adding or concatenating them to the input representation, and using them as expert networks that specialize to different inputs. In this work, we conduct rigorous experimental evaluations of existing ideas and their combinations. We also introduce a new method, alternating updates, that enables access to an increased token dimension without increasing computation time, and demonstrate its effectiveness in language modeling.
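The three lookup functions described in the abstract can be sketched in a few lines. The following NumPy toy is illustrative only, assuming a small random table and arbitrary shapes; it is not the paper's implementation, and all names (`lookup_by_token_id`, `lookup_by_lsh`, `lookup_by_routing`) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB_SIZE, TABLE_SIZE, D_MODEL = 1000, 64, 16

# External parameter table: the bulk of the capacity lives here and is
# looked up sparsely, so per-token compute stays small.
table = rng.normal(size=(TABLE_SIZE, D_MODEL)).astype(np.float32)

def lookup_by_token_id(token_ids: np.ndarray) -> np.ndarray:
    """Lookup 1: use token IDs (mod table size) directly as table indices."""
    return table[token_ids % TABLE_SIZE]

def lookup_by_lsh(x: np.ndarray, planes: np.ndarray) -> np.ndarray:
    """Lookup 2: LSH-hash each token vector into a bucket.

    `planes` holds random hyperplanes; the sign pattern of the projections
    forms a binary code, i.e. a bucket id in [0, 2**n_planes).
    """
    bits = (x @ planes.T > 0).astype(np.int64)           # (seq, n_planes)
    bucket = bits @ (1 << np.arange(planes.shape[0]))    # binary code -> int
    return table[bucket % TABLE_SIZE]

def lookup_by_routing(x: np.ndarray, w_route: np.ndarray) -> np.ndarray:
    """Lookup 3: learnable softmax-style routing, keeping the top-1 entry."""
    logits = x @ w_route                                 # (seq, TABLE_SIZE)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    top1 = probs.argmax(axis=-1)
    # Scale by the routing probability so the router would receive gradient.
    return probs[np.arange(len(top1)), top1, None] * table[top1]

# One way to consume the contents: add the retrieved rows to the input.
seq_len = 8
x = rng.normal(size=(seq_len, D_MODEL)).astype(np.float32)
token_ids = rng.integers(0, VOCAB_SIZE, size=seq_len)
planes = rng.normal(size=(4, D_MODEL)).astype(np.float32)
w_route = rng.normal(size=(D_MODEL, TABLE_SIZE)).astype(np.float32)

h = x + lookup_by_token_id(token_ids)  # additive consumption
```

Concatenation instead of addition, or treating each retrieved row as the parameters of a small expert network, are the other consumption options the abstract mentions; the lookup side stays the same.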


