Robust High-dimensional Memory-augmented Neural Networks

10/05/2020 ∙ by Geethan Karunaratne, et al.

Traditional neural networks require enormous amounts of data to build their complex mappings during a slow training procedure that hinders their ability to relearn and adapt to new data. Memory-augmented neural networks enhance neural networks with an explicit memory to overcome these issues. Access to this explicit memory, however, occurs via soft read and write operations involving every individual memory entry, resulting in a bottleneck when implemented using the conventional von Neumann computer architecture. To overcome this bottleneck, we propose a robust architecture that employs a computational memory unit as the explicit memory, performing analog in-memory computation on high-dimensional vectors while closely matching 32-bit software-equivalent accuracy. This is enabled by a content-based attention mechanism that represents unrelated items in the computational memory with uncorrelated high-dimensional vectors, whose real-valued components can be readily approximated by binary, or bipolar, components. Experimental results demonstrate the efficacy of our approach on few-shot image classification tasks on the Omniglot dataset using more than 256,000 phase-change memory devices.
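The core property the abstract relies on, that random high-dimensional bipolar vectors are nearly orthogonal, so content-based attention can retrieve the right entry even from a noisy query, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the dimensionality, number of memory entries, noise level, and softmax temperature below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000       # high-dimensional vector length (illustrative choice)
n_keys = 5       # number of stored memory entries (illustrative)

# Random bipolar (+1/-1) key vectors: unrelated items are nearly orthogonal,
# so their pairwise cosine similarities concentrate around zero.
keys = rng.choice([-1.0, 1.0], size=(n_keys, d))
sims = (keys @ keys.T) / d
off_diag = sims[~np.eye(n_keys, dtype=bool)]
print(f"max |cosine| between distinct keys: {np.abs(off_diag).max():.3f}")

# Content-based attention: similarity scores against a corrupted query
# (here ~25% of the components of key 2 are sign-flipped).
query = keys[2] * rng.choice([1.0, 1.0, 1.0, -1.0], size=d)
scores = (keys @ query) / d
attn = np.exp(10 * scores) / np.exp(10 * scores).sum()  # sharpened softmax
print("attention weights:", np.round(attn, 3))
print("retrieved entry:", int(attn.argmax()))
```

Despite a quarter of the query's components being flipped, the attention mass lands on the correct entry, because the corrupted query still has cosine similarity near 0.5 with its key while remaining nearly orthogonal to all others. This robustness margin is what lets the real-valued components be approximated by binary or bipolar ones.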





