Relative Entropy, Probabilistic Inference and AI

03/27/2013
by John E. Shore, et al.

Various properties of relative entropy have led to its widespread use in information theory. These properties suggest that relative entropy has a role to play in systems that attempt to perform inference in terms of probability distributions. In this paper, I will review some basic properties of relative entropy as well as its role in probabilistic inference. I will also mention briefly a few existing and potential applications of relative entropy to so-called artificial intelligence (AI).
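To make the central quantity concrete: for discrete distributions p and q, the relative entropy (Kullback-Leibler divergence) is D(p || q) = sum_i p_i log(p_i / q_i); it is nonnegative and equals zero exactly when p = q. The short Python sketch below (not from the paper; the coin-flip distributions are purely illustrative) computes it directly from that formula:

    import numpy as np

    def relative_entropy(p, q):
        # D(p || q) = sum_i p_i * log(p_i / q_i), in nats.
        # Convention: terms with p_i == 0 contribute 0; q must be
        # strictly positive wherever p is (absolute continuity).
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # Illustrative example: a biased coin versus a fair coin.
    p = [0.8, 0.2]   # biased coin
    q = [0.5, 0.5]   # fair coin
    print(relative_entropy(p, q))   # about 0.1927 nats

Note the asymmetry: D(p || q) is generally not equal to D(q || p), which is one reason relative entropy is a directed measure of divergence rather than a metric.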


Related research:

12/07/2021 · On a 2-relative entropy
We construct a 2-categorical extension of the relative entropy functor o...

02/17/2022 · Information Theory with Kernel Methods
We consider the analysis of probability distributions through their asso...

10/22/2018 · A Family of Statistical Divergences Based on Quasiarithmetic Means
This paper proposes a generalization of Tsallis entropy and Tsallis rela...

03/06/2013 · Mixtures of Gaussians and Minimum Relative Entropy Techniques for Modeling Continuous Uncertainties
Problems of probabilistic inference and decision making under uncertaint...

07/04/2021 · A precise bare simulation approach to the minimization of some distances. Foundations
In information theory – as well as in the adjacent fields of statistics,...

05/01/2021 · t-Entropy: A New Measure of Uncertainty with Some Applications
The concept of Entropy plays a key role in Information Theory, Statistic...

12/14/2022 · Relative position between a pair of spin model subfactors
We start with a pair of distinct 2 × 2 complex Hadamard matrices and comp...
