On Resource-Efficient Bayesian Network Classifiers and Deep Neural Networks

10/22/2020
by Wolfgang Roth, et al.

We present two methods to reduce the complexity of Bayesian network (BN) classifiers. First, we introduce quantization-aware training using the straight-through gradient estimator to quantize the parameters of BNs to few bits. Second, we extend a recently proposed differentiable tree-augmented naive Bayes (TAN) structure learning approach to also take the model size into account. Both methods are motivated by recent developments in the deep learning community, and they provide effective means to trade off model size against prediction accuracy, as demonstrated in extensive experiments. Furthermore, we contrast quantized BN classifiers with quantized deep neural networks (DNNs) for small-scale scenarios, which have hardly been investigated in the literature. We show Pareto-optimal models with respect to model size, number of operations, and test error, and find that both model classes are viable options.
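As a rough illustration of the first method, the sketch below shows what straight-through quantization-aware training can look like in a PyTorch-style setup. The function name quantize_ste, the bit width, and the fixed-point grid are illustrative assumptions rather than the authors' exact formulation; in the paper, the quantized values would parameterize the BN's conditional (log-)probabilities, which are then trained with a discriminative loss.

import torch

def quantize_ste(theta: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    # Quantize real-valued BN parameters (e.g. logits of conditional
    # probability tables) to a fixed-point grid. The forward pass uses the
    # quantized values; the backward pass uses the straight-through
    # gradient estimator, so gradients skip the non-differentiable round().
    # The grid in [-1, 1) is an assumption; the paper's quantizer may differ.
    scale = 2.0 ** (num_bits - 1)
    theta_q = torch.clamp(torch.round(theta * scale) / scale,
                          -1.0, 1.0 - 1.0 / scale)
    # Straight-through estimator: detach the rounding residual so that
    # d(loss)/d(theta) is computed as if quantization were the identity.
    return theta + (theta_q - theta).detach()

# Minimal usage: gradients flow back to the full-precision parameters.
theta = torch.randn(8, requires_grad=True)
loss = quantize_ste(theta).pow(2).sum()
loss.backward()  # theta.grad is defined even though round() is used in the forward pass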
