Bayesian Neural Networks with Soft Evidence

10/19/2020
by Edward Yu, et al.

Bayes's rule deals with hard evidence: we can calculate the probability of event A occurring given that event B has occurred. Soft evidence, on the other hand, involves a degree of uncertainty about whether event B has actually occurred. Jeffrey's rule of conditioning provides a way to update beliefs in the case of soft evidence. We provide a framework for learning a probability distribution on the weights of a neural network trained with soft evidence, by way of two simple algorithms for approximating Jeffrey conditionalization. We propose an experimental protocol for benchmarking these algorithms on empirical datasets, even when the data is purposely corrupted.
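To make the contrast concrete: where Bayes conditioning assumes B is known to have occurred, Jeffrey's rule replaces that certainty with a new distribution Q over a partition {B_i}, giving the updated belief P_new(A) = sum_i P(A | B_i) Q(B_i). The sketch below is a minimal illustration of this update on a discrete joint distribution, not the paper's algorithm; the function name and the toy joint are assumptions made for the example. It also checks that a one-hot Q recovers ordinary Bayes conditioning.

```python
import numpy as np

def jeffrey_update(joint, q):
    """Jeffrey conditionalization on a discrete joint distribution.

    joint : (n_a, n_b) array with joint[a, b] = P(A=a, B=b)
    q     : (n_b,) array of new probabilities Q(B=b) from soft evidence
    Returns P_new(A=a) = sum_b P(A=a | B=b) * Q(B=b).
    """
    p_b = joint.sum(axis=0)   # prior marginal P(B)
    cond = joint / p_b        # columnwise conditionals P(A | B)
    return cond @ q           # mix the conditionals by Q(B)

# Toy joint over binary A (rows) and binary B (columns).
joint = np.array([[0.30, 0.10],
                  [0.20, 0.40]])

# Hard evidence: a one-hot Q collapses to Bayes conditioning, P(A | B=1).
print(jeffrey_update(joint, np.array([0.0, 1.0])))  # [0.2, 0.8]

# Soft evidence: we are only 70% sure that B=0 occurred.
print(jeffrey_update(joint, np.array([0.7, 0.3])))  # [0.48, 0.52]
```

In the hard-evidence limit the mixture collapses to a single conditional, which is why Jeffrey conditionalization is a strict generalization of Bayes's rule.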
