From Bayesian Inference to Logical Bayesian Inference: A New Mathematical Frame for Semantic Communication and Machine Learning

09/03/2018
by   Chenguang Lu, et al.

Bayesian Inference (BI) uses the Bayesian posterior as its inference tool, whereas Logical Bayesian Inference (LBI) uses the truth function, or membership function. LBI was proposed because BI is not compatible with the classical Bayes' prediction and does not use logical probability, and hence cannot express semantic meaning. In LBI, statistical probability and logical probability are strictly distinguished, used at the same time, and linked by the third kind of Bayes' Theorem. The Shannon channel consists of a set of transition probability functions, whereas the semantic channel consists of a set of truth functions. When a sample is large enough, we can derive the semantic channel directly from the Shannon channel; otherwise, we can construct truth functions with parameters and optimize them by the Maximum Semantic Information (MSI) criterion. The MSI criterion is equivalent to the Maximum Likelihood (ML) criterion and compatible with the Regularized Least Squares (RLS) criterion. By matching the two channels with each other, we obtain the Channels' Matching (CM) algorithm. This algorithm can improve multi-label classification, maximum likelihood estimation (including the classification of unseen instances), and mixture models. In comparison with BI, LBI 1) uses the prior P(X) of X instead of the prior of Y or θ, and therefore fits cases where the source P(X) changes; 2) can be used to solve for the denotations of labels; and 3) is more compatible with the classical Bayes' prediction and the likelihood method. LBI also provides a confirmation measure between -1 and 1 for induction.
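As an illustration of the two channels described above, here is a minimal sketch in Python. It assumes, based only on the abstract and not on the paper's own code, that the large-sample truth function is the transition probability function normalized by its maximum, T(θ_j|x) = P(y_j|x) / max_x P(y_j|x), and that the semantic Bayes' prediction takes the form P(x|θ_j) = P(x)T(θ_j|x)/T(θ_j) with logical probability T(θ_j) = Σ_x P(x)T(θ_j|x); the function names and toy numbers are hypothetical.

```python
import numpy as np

# Sketch only: assumed formulas inferred from the abstract, not the paper's code.
# Shannon channel: each row j is a transition probability function P(y_j | x).
# Semantic channel: each row j is a truth function T(theta_j | x).

def semantic_channel_from_shannon(P_y_given_x):
    """Large-sample case (assumed form): normalize each transition
    probability function by its maximum so every truth function peaks at 1."""
    return P_y_given_x / P_y_given_x.max(axis=1, keepdims=True)

def semantic_bayes_prediction(P_x, T_j):
    """Assumed semantic Bayes' prediction using the prior P(X) of X:
    P(x | theta_j) = P(x) * T(theta_j | x) / T(theta_j), where the logical
    probability is T(theta_j) = sum_x P(x) * T(theta_j | x)."""
    T_logical = float(P_x @ T_j)       # logical probability of label y_j
    return P_x * T_j / T_logical

# Toy example (hypothetical numbers): 4 instance values, 2 labels.
P_x = np.array([0.1, 0.4, 0.4, 0.1])            # source prior P(X); may change
P_y_given_x = np.array([[0.9, 0.6, 0.2, 0.1],   # P(y_1 | x); each column sums
                        [0.1, 0.4, 0.8, 0.9]])  # P(y_2 | x); to 1 over labels

T = semantic_channel_from_shannon(P_y_given_x)  # semantic channel
pred = semantic_bayes_prediction(P_x, T[0])     # P(x | theta_1)
print(T)
print(pred, pred.sum())                         # prediction sums to 1
```

Because the truth functions are only normalized rather than re-estimated, the same semantic channel can be reused when the source changes: only P_x in the prediction step needs updating, which is the property the abstract highlights in point 1.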

