Magnetoresistive RAM for error resilient XNOR-Nets

05/24/2019
by Michail Tzoufras, et al.

We trained three binarized convolutional neural network architectures (LeNet-4, Network-In-Network, AlexNet) on a variety of datasets (MNIST, CIFAR-10, CIFAR-100, extended SVHN, ImageNet) using error-prone activations and tested them without errors to study the resilience of the training process. With the exception of AlexNet trained on ImageNet, we found that Bit Error Rates (BERs) of a few percent during training do not degrade the test accuracy. Furthermore, by training AlexNet on progressively smaller subsets of the ImageNet classes, we observed increasing tolerance to activation errors. The ability to operate at high BERs is critical for reducing power consumption in existing hardware and for facilitating emerging memory technologies. We discuss how operating at moderate BERs can enable Magnetoresistive RAM with higher endurance, speed, and density.
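The protocol described above amounts to flipping each binarized activation independently with probability equal to the BER during training, then evaluating without any errors. Below is a minimal PyTorch sketch of that error model, assuming a +1/-1 activation encoding; the function name `inject_activation_errors` and the 2% BER are illustrative assumptions, not the authors' implementation.

```python
import torch

def inject_activation_errors(x_bin: torch.Tensor, ber: float) -> torch.Tensor:
    """Flip each binarized (+1/-1) activation independently with probability `ber`.

    A sketch of the stochastic bit-error model implied by the abstract;
    not the paper's exact implementation.
    """
    flip = torch.rand_like(x_bin) < ber      # per-element Bernoulli(ber) mask
    return torch.where(flip, -x_bin, x_bin)  # flip the sign where the mask is set

# Illustrative XNOR-Net-style forward step: errors are injected only during
# training; evaluation stays error-free, matching the study's protocol.
x = torch.randn(8, 64)   # hypothetical pre-activation tensor
x_bin = torch.sign(x)    # binarize to +/-1 (the sign(0) edge case is ignored here)
training = True
if training:
    x_bin = inject_activation_errors(x_bin, ber=0.02)  # e.g., a 2% BER
```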


Related research

02/07/2020 · Switchable Precision Neural Networks
Instantaneous and on demand accuracy-efficiency trade-off has been recen...

02/23/2018 · Training wide residual networks for deployment using a single bit for each weight
For fast and energy-efficient deployment of trained deep neural networks...

11/01/2017 · Towards Effective Low-bitwidth Convolutional Neural Networks
This paper tackles the problem of training a deep convolutional neural n...

01/23/2019 · Backprop with Approximate Activations for Memory-efficient Network Training
Larger and deeper neural network architectures deliver improved accuracy...

10/02/2018 · CINIC-10 is not ImageNet or CIFAR-10
In this brief technical report we introduce the CINIC-10 dataset as a pl...

03/04/2019 · CodeNet: Training Large Scale Neural Networks in Presence of Soft-Errors
This work proposes the first strategy to make distributed training of ne...

06/22/2023 · Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective
We present a new dataset condensation framework termed Squeeze, Recover ...
