Contextual Memory Trees

07/17/2018
by Wen Sun, et al.

We design and study a Contextual Memory Tree (CMT), a learning memory controller that inserts new memories into an experience store of unbounded size. It is designed to efficiently query for memories from that store, supporting logarithmic-time insertion and retrieval operations. Hence CMT can be integrated into existing statistical learning algorithms as an augmented memory unit without substantially increasing training and inference computation. We demonstrate the efficacy of CMT by augmenting existing multi-class and multi-label classification algorithms with it and observe statistical improvements. We also test CMT learning on several image-captioning tasks to demonstrate that it is computationally more efficient than a simple nearest-neighbor memory system while benefiting from reward learning.
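The full construction (routers learned online, reward-driven updates, and the self-organizing tree structure) is described in the paper; the sketch below is only a minimal illustration of the underlying idea, namely a tree-structured memory whose insert and query cost scales with tree depth rather than with store size. The `Node` class, its fixed random-hyperplane routers, and the `leaf_capacity` parameter are assumptions made for illustration and are not CMT's actual components.

```python
import numpy as np

class Node:
    def __init__(self, dim, rng, leaf_capacity=8):
        # Hypothetical fixed random router direction; CMT itself learns and
        # updates its routers rather than fixing them at creation time.
        self.w = rng.standard_normal(dim)
        self.left = None
        self.right = None
        self.memories = []            # (key, value) pairs stored at a leaf
        self.dim = dim
        self.rng = rng
        self.leaf_capacity = leaf_capacity

    def is_leaf(self):
        return self.left is None

    def insert(self, key, value):
        # O(depth): route the new memory down to a single leaf.
        if self.is_leaf():
            self.memories.append((key, value))
            if len(self.memories) > self.leaf_capacity:
                self._split()
        elif key @ self.w >= 0:
            self.right.insert(key, value)
        else:
            self.left.insert(key, value)

    def _split(self):
        # Turn a full leaf into an internal node and re-route its memories.
        self.left = Node(self.dim, self.rng, self.leaf_capacity)
        self.right = Node(self.dim, self.rng, self.leaf_capacity)
        old, self.memories = self.memories, []
        for k, v in old:
            (self.right if k @ self.w >= 0 else self.left).insert(k, v)

    def query(self, key):
        # O(depth) descent to one leaf, then a scan of its small memory list.
        if self.is_leaf():
            if not self.memories:
                return None
            _, value = min(self.memories,
                           key=lambda kv: np.linalg.norm(kv[0] - key))
            return value
        child = self.right if key @ self.w >= 0 else self.left
        return child.query(key)


rng = np.random.default_rng(0)
tree = Node(dim=16, rng=rng)
for i in range(1000):
    tree.insert(rng.standard_normal(16), f"memory-{i}")
print(tree.query(rng.standard_normal(16)))  # closest memory in the reached leaf
```

With roughly balanced splits, both insertion and retrieval touch O(log n) nodes, which is what allows such a memory unit to be attached to an existing learner without substantially increasing training or inference cost; the actual CMT achieves this with learned, reward-sensitive routing rather than the random hyperplanes assumed here.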

Related research

07/26/2022 · Retrieval-Augmented Transformer for Image Captioning
Image captioning models aim at connecting Vision and Language by providi...

10/27/2018 · A no-regret generalization of hierarchical softmax to extreme multi-label classification
Extreme multi-label classification (XMLC) is a problem of tagging an ins...

10/25/2022 · Eigen Memory Tree
This work introduces the Eigen Memory Tree (EMT), a novel online memory ...

07/08/2020 · Online probabilistic label trees
We introduce online probabilistic label trees (OPLTs), an algorithm that...

06/01/2019 · On the computational complexity of the probabilistic label tree algorithms
Label tree-based algorithms are widely used to tackle multi-class and mu...

07/20/2021 · kNet: A Deep kNN Network To Handle Label Noise
Deep Neural Networks require large amounts of labeled data for their tra...
