GrateTile: Efficient Sparse Tensor Tiling for CNN Processing

09/18/2020
by Yu-Sheng Lin, et al.

We propose GrateTile, an efficient, hardware-friendly data storage scheme for sparse CNN feature maps (activations). It divides data into uneven-sized subtensors and, with small indexing overhead, stores them in a compressed yet randomly accessible format. This design enables modern CNN accelerators to fetch and decompress subtensors on the fly in a tiled processing manner. GrateTile suits architectures that favor aligned, coalesced data access and requires only minimal changes to the overall architectural design. We simulate GrateTile with state-of-the-art CNNs and show an average of 55% DRAM bandwidth reduction while using only 0.6% indexing storage.
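To make the core idea concrete, here is a minimal sketch of tiled sparse storage with per-tile random access. This is an illustration of the general technique the abstract describes, not the paper's actual layout: for simplicity it uses even-sized tiles (GrateTile's uneven splitting is omitted), and the class name `SparseTileStore` and its methods are invented for this example. Each tile is stored as a zero/nonzero bitmap plus its packed nonzero values, so any tile can be fetched and decompressed independently.

```python
import numpy as np

class SparseTileStore:
    """Illustrative sketch: compress a 2D feature map tile-by-tile so that
    each tile is independently (randomly) accessible and decompressible."""

    def __init__(self, fmap, tile=4):
        self.shape = fmap.shape
        self.tile = tile
        h, w = fmap.shape
        # Ceiling division: number of tiles along each axis.
        self.grid = (-(-h // tile), -(-w // tile))
        self.bitmaps = []  # one boolean zero/nonzero mask per tile
        self.values = []   # packed nonzero values per tile
        for ti in range(self.grid[0]):
            for tj in range(self.grid[1]):
                blk = fmap[ti * tile:(ti + 1) * tile,
                           tj * tile:(tj + 1) * tile]
                mask = blk != 0
                self.bitmaps.append(mask)
                self.values.append(blk[mask])

    def fetch_tile(self, ti, tj):
        """Randomly access one tile and decompress it on the fly."""
        idx = ti * self.grid[1] + tj
        mask = self.bitmaps[idx]
        out = np.zeros(mask.shape, dtype=self.values[idx].dtype)
        out[mask] = self.values[idx]
        return out

    def compressed_size(self):
        # Stored nonzero elements (bitmap/index overhead not counted here).
        return sum(v.size for v in self.values)
```

A tiled accelerator would call `fetch_tile` only for the subtensors its current processing window needs, instead of streaming the whole dense feature map from DRAM.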


Related research

- Refresh Triggered Computation: Improving the Energy Efficiency of Convolutional Neural Network Accelerators (10/15/2019): "Recently, many studies proposed CNN accelerator architectures with custo..."
- Optimizing Tensor Programs on Flexible Storage (10/12/2022): "Tensor programs often need to process large tensors (vectors, matrices, ..."
- Photonic Reconfigurable Accelerators for Efficient Inference of CNNs with Mixed-Sized Tensors (07/12/2022): "Photonic Microring Resonator (MRR) based hardware accelerators have been..."
- Efficient, Out-of-Memory Sparse MTTKRP on Massively Parallel Architectures (01/29/2022): "Tensor decomposition (TD) is an important method for extracting latent i..."
- ALTO: Adaptive Linearized Storage of Sparse Tensors (02/20/2021): "The analysis of high-dimensional sparse data is becoming increasingly po..."
- Barrier-Free Large-Scale Sparse Tensor Accelerator (BARISTA) For Convolutional Neural Networks (04/18/2021): "Convolutional neural networks (CNNs) are emerging as powerful tools for ..."
- User-Defined Functions for HDF5 (09/24/2021): "Scientific datasets are known for their challenging storage demands and ..."
