Sustainable AI Processing at the Edge

07/04/2022
by Sebastien Ollivier, et al.
Edge computing is a popular target for accelerating machine learning algorithms that support mobile devices, without incurring the communication latencies of handling them in the cloud. Edge deployments of machine learning primarily consider traditional concerns, such as SWaP (Size, Weight, and Power) constraints, for their installations. However, such metrics are not sufficient to capture the environmental impact of computing, given the significant contributions of embodied energy and carbon. In this paper we explore the tradeoffs of convolutional neural network acceleration engines for both inference and on-line training. In particular, we examine processing-in-memory (PIM) approaches, mobile GPU accelerators, and recently released FPGAs, and compare them with a novel Racetrack memory PIM. Replacing PIM-enabled DDR3 with Racetrack memory PIM can recover its embodied energy in as little as one year. For high activity ratios, mobile GPUs can be more sustainable, but they have higher embodied energy to overcome compared to PIM-enabled Racetrack memory.

