ProbLP: A framework for low-precision probabilistic inference

02/27/2021
by Nimish Shah, et al.

Bayesian reasoning is a powerful mechanism for probabilistic inference in smart edge devices. Using a low-precision arithmetic representation during such inference can improve energy efficiency, but its impact on inference accuracy is not yet understood. Furthermore, general-purpose hardware does not natively support low-precision representations. To address this, we propose ProbLP, a framework that automates the analysis and design of low-precision probabilistic inference hardware. It automatically chooses an energy-efficient representation based on worst-case error bounds and hardware energy models, and it generates custom hardware for the resulting inference network that exploits parallelism, pipelining, and low-precision operation. The framework is validated on several embedded-sensing benchmarks.
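To illustrate the kind of worst-case analysis the abstract alludes to, the sketch below shows one simple way to pick a fixed-point fractional bit-width for a chain of probability multiplications. This is a hedged illustration, not ProbLP's actual algorithm: the function names, the first-order additive error model, and the error target are all assumptions introduced here for exposition.

```python
# Illustrative sketch (NOT ProbLP's actual method): choose the smallest
# fixed-point fractional bit-width whose worst-case rounding error for a
# product of n probabilities stays below a target bound.

def worst_case_product_error(n_factors: int, frac_bits: int) -> float:
    """Worst-case absolute error for a chain of n fixed-point multiplies.

    Each rounding step contributes at most eps = 2**-(frac_bits + 1)
    absolute error; since the operands are probabilities in [0, 1], the
    per-step errors accumulate at most additively to first order,
    giving an n * eps worst-case bound.
    """
    eps = 2.0 ** -(frac_bits + 1)
    return n_factors * eps

def pick_frac_bits(n_factors: int, max_error: float, max_bits: int = 32) -> int:
    """Return the smallest fractional bit-width meeting the error bound."""
    for f in range(1, max_bits + 1):
        if worst_case_product_error(n_factors, f) <= max_error:
            return f
    raise ValueError("error bound not reachable within max_bits")

# Example: a product of 100 probability factors with a worst-case
# absolute error target of 1e-4 needs 19 fractional bits.
bits = pick_frac_bits(100, 1e-4)  # -> 19
```

A real framework would additionally weigh each candidate bit-width against a hardware energy model and handle additions and mixed fixed/floating-point formats, but the same bound-then-search structure applies.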


Related research:

- Low- and Mixed-Precision Inference Accelerators (06/24/2022): With the surging popularity of edge computing, the need to efficiently p...
- WRPN & Apprentice: Methods for Training and Inference using Low-Precision Numerics (03/01/2018): Today's high performance deep learning architectures involve large model...
- On-FPGA Training with Ultra Memory Reduction: A Low-Precision Tensor Method (04/07/2021): Various hardware accelerators have been developed for energy-efficient a...
- DPRed: Making Typical Activation Values Matter In Deep Learning Computing (04/17/2018): We show that selecting a fixed precision for all activations in Convolut...
- Low-Precision Hardware Architectures Meet Recommendation Model Inference at Scale (05/26/2021): Tremendous success of machine learning (ML) and the unabated growth in M...
- Divide, Conquer, and Combine: a New Inference Strategy for Probabilistic Programs with Stochastic Support (10/29/2019): Universal probabilistic programming systems (PPSs) provide a powerful an...
- QPyTorch: A Low-Precision Arithmetic Simulation Framework (10/09/2019): Low-precision training reduces computational cost and produces efficient...
