Parallelizing Probabilistic Inference: Some Early Explorations

03/13/2013
by Bruce D'Ambrosio, et al.

We report on an experimental investigation into opportunities for parallelism in belief net inference. Specifically, we report on a study of the parallelism available, on hypercube-style machines, for a set of randomly generated belief nets, using factoring (SPI) style inference algorithms. Our results indicate that substantial speedup is available, but that it comes only through parallelization of individual conformal product operations, and depends critically on finding an appropriate factoring. We find negligible opportunity for parallelism at the topological, or clustering tree, level.
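The conformal product at the heart of SPI-style factoring is the pairwise multiplication of two factors (potentials) over the union of their variables. As a rough illustration of the operation the abstract refers to (not the authors' implementation; the representation and names below are my own), a minimal sketch in Python:

```python
from itertools import product

def conformal_product(f1, f2):
    """Multiply two factors over discrete variables.

    A factor is a pair (variables, table), where `variables` is a tuple
    of variable names and `table` maps value assignments (tuples, in the
    same order) to floats. The result ranges over the union of the two
    variable sets; each entry is the product of the matching entries.
    """
    v1, t1 = f1
    v2, t2 = f2
    out_vars = v1 + tuple(v for v in v2 if v not in v1)
    # Infer each variable's domain from the tables themselves.
    domains = {}
    for vars_, table in ((v1, t1), (v2, t2)):
        for assignment in table:
            for var, val in zip(vars_, assignment):
                domains.setdefault(var, set()).add(val)
    out_table = {}
    for values in product(*(sorted(domains[v]) for v in out_vars)):
        asg = dict(zip(out_vars, values))
        e1 = t1[tuple(asg[v] for v in v1)]
        e2 = t2[tuple(asg[v] for v in v2)]
        out_table[values] = e1 * e2
    return out_vars, out_table

# Example: P(A) times P(B|A) yields the joint P(A, B).
pA  = (("A",), {(0,): 0.6, (1,): 0.4})
pBA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
joint_vars, joint = conformal_product(pA, pBA)
```

Each output entry is independent of the others, which is why this inner loop, rather than the clustering-tree topology, is where the paper finds the exploitable parallelism: the iterations over `out_table` entries can be distributed across processors once a factoring fixes the operand pairs.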


