Using Fast Weights to Attend to the Recent Past

10/20/2016
by Jimmy Ba, et al.

Until recently, research on artificial neural networks was largely restricted to systems with only two types of variables: neural activities that represent the current or recent input, and weights that learn to capture regularities among inputs, outputs and payoffs. There is no good reason for this restriction. Synapses have dynamics at many different time-scales, which suggests that artificial neural networks might benefit from variables that change more slowly than activities but much faster than the standard weights. These "fast weights" can be used to store temporary memories of the recent past, and they provide a neurally plausible way of implementing the kind of attention to the past that has recently proved very helpful in sequence-to-sequence models. By using fast weights we can avoid the need to store copies of neural activity patterns.
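As a rough illustration of the mechanism, the paper maintains a fast-weight matrix A that decays at each time step and accumulates the outer product of the hidden state, A(t) = λ A(t−1) + η h(t) h(t)ᵀ, and then runs a brief "inner loop" in which A mixes recent hidden activity back into the next state. The sketch below is a minimal NumPy rendering of that idea; the function name, dimensions, nonlinearity, and hyper-parameter values are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a fast-weights recurrent step, assuming the decaying
# outer-product update described above. Sizes and constants are made up.
import numpy as np

def layer_norm(x, eps=1e-5):
    # Simple layer normalization over a 1-D hidden vector.
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def fast_weights_step(x_t, h_prev, A_prev, W, C, lam=0.95, eta=0.5, inner_steps=1):
    """One time step of a fast-weights RNN (illustrative sketch).

    A_prev is a decaying outer-product memory of recent hidden states;
    it changes faster than the slow weights W, C but slower than h.
    """
    # Decay the fast-weight matrix and add the outer product of the
    # previous hidden state (a Hebbian-style temporary memory).
    A = lam * A_prev + eta * np.outer(h_prev, h_prev)

    # Preliminary next state from the slow weights.
    pre = W @ h_prev + C @ x_t
    h = np.tanh(layer_norm(pre))

    # Inner loop: the fast weights "attend" to the recent past by
    # repeatedly folding A @ h back into the pre-activation.
    for _ in range(inner_steps):
        h = np.tanh(layer_norm(pre + A @ h))
    return h, A

# Illustrative usage with arbitrary sizes.
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
W = 0.05 * rng.standard_normal((d_h, d_h))
C = 0.05 * rng.standard_normal((d_h, d_in))
h, A = np.zeros(d_h), np.zeros((d_h, d_h))
for x_t in rng.standard_normal((10, d_in)):
    h, A = fast_weights_step(x_t, h, A, W, C)
```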
