
Learning Memory Access Patterns

by Milad Hashemi, et al.

The explosion in workload complexity and the recent slow-down in Moore's law scaling call for new approaches to efficient computing. Researchers are now beginning to apply recent advances in machine learning to software optimization, augmenting or replacing traditional heuristics and data structures. However, the space of machine learning for computer hardware architecture is only lightly explored. In this paper, we demonstrate the potential of deep learning to address the von Neumann bottleneck of memory performance. We focus on the critical problem of learning memory access patterns, with the goal of constructing accurate and efficient memory prefetchers. We relate contemporary prefetching strategies to n-gram models in natural language processing, and show how recurrent neural networks can serve as a drop-in replacement. On a suite of challenging benchmark datasets, we find that neural networks consistently demonstrate superior performance in terms of precision and recall. This work represents a first step towards practical neural-network-based prefetching, and opens a wide range of exciting directions for machine learning in computer architecture research.
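To make the n-gram analogy concrete, the sketch below models a table-based delta prefetcher as an n-gram model over address deltas: the last n deltas form the context, and the model predicts the most frequent next delta seen after that context. This is an illustrative toy under assumed function names (`train_delta_ngram`, `predict_next_address`), not the authors' implementation or their neural model.

```python
from collections import defaultdict, Counter

def train_delta_ngram(addresses, n=2):
    """Learn an n-gram model over address deltas: map each sequence of
    the last n deltas to a frequency count of the delta that followed."""
    deltas = [b - a for a, b in zip(addresses, addresses[1:])]
    model = defaultdict(Counter)
    for i in range(len(deltas) - n):
        context = tuple(deltas[i:i + n])
        model[context][deltas[i + n]] += 1
    return model

def predict_next_address(model, addresses, n=2):
    """Prefetch candidate: the current address plus the most likely next
    delta given the last n observed deltas; None if the context is unseen."""
    deltas = [b - a for a, b in zip(addresses, addresses[1:])]
    context = tuple(deltas[-n:])
    if context not in model:
        return None
    best_delta, _ = model[context].most_common(1)[0]
    return addresses[-1] + best_delta

# A simple strided access trace: every delta is 8 bytes.
trace = [0, 8, 16, 24, 32, 40]
model = train_delta_ngram(trace)
print(predict_next_address(model, trace))  # 48
```

A recurrent network replaces the fixed-size context table with a learned hidden state, which is the substitution the paper studies: where this table model can only memorize exact delta histories, an RNN can generalize across them.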

