From Shannon's Channel to Semantic Channel via New Bayes' Formulas for Machine Learning

03/22/2018
by Chenguang Lu, et al.

A group of transition probability functions forms a Shannon's channel, whereas a group of truth functions forms a semantic channel. By the third kind of Bayes' theorem, we can directly convert a Shannon's channel into an optimized semantic channel. When a sample is not big enough, we can use a truth function with parameters to produce the likelihood function, and then train the truth function by the conditional sampling distribution. The third kind of Bayes' theorem is proved. A semantic information theory is briefly introduced. The semantic information measure reflects Popper's hypothesis-testing thought. The Semantic Information Method (SIM) adheres to the maximum semantic information criterion, which is compatible with the maximum likelihood criterion and the Regularized Least Squares criterion. It supports Wittgenstein's view: the meaning of a word lies in its use. Letting the two channels mutually match, we obtain the Channels' Matching (CM) algorithm for machine learning. The CM algorithm is used to explain the evolution of the semantic meaning of natural language, such as "old age". The semantic channel for medical tests and the confirmation measures of test-positive and test-negative are discussed. The applications of the CM algorithm to semi-supervised learning and unsupervised learning are briefly introduced. As a predictive model, the semantic channel fits variable sources and hence can overcome the class-imbalance problem. The SIM strictly distinguishes between statistical probability and logical probability and uses both at the same time. This method is compatible with the thoughts of Bayes, Fisher, Shannon, Zadeh, Tarski, Davidson, Wittgenstein, and Popper. It is a competitive alternative to Bayesian inference.
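The channel conversion described above can be sketched numerically. The following is a minimal illustration, with toy numbers that are assumptions, not data from the paper: the optimized truth function of the semantic channel is taken as the transition probability function normalized by its maximum over instances, T*(θ_j|x) = P(y_j|x) / max_x P(y_j|x), and the semantic information of y_j about x is log(T(θ_j|x) / T(θ_j)), where T(θ_j) = Σ_x P(x) T(θ_j|x) is the logical probability.

```python
import numpy as np

# Illustrative prior over 3 instance values x (assumed numbers).
p_x = np.array([0.5, 0.3, 0.2])

# Shannon channel: rows are hypotheses y_j, columns are instances x,
# entry [j, i] is the transition probability P(y_j | x_i).
shannon_channel = np.array([
    [0.9, 0.3, 0.1],  # P(y_1 | x)
    [0.1, 0.7, 0.9],  # P(y_2 | x)
])

# Optimized semantic channel (truth functions):
# T*(theta_j | x) = P(y_j | x) / max_x P(y_j | x)
semantic_channel = shannon_channel / shannon_channel.max(axis=1, keepdims=True)

# Logical probability of each hypothesis:
# T(theta_j) = sum_x P(x) * T(theta_j | x)
logical_prob = semantic_channel @ p_x

# Semantic information in bits: I(x; theta_j) = log2(T(theta_j|x) / T(theta_j)).
# It is positive where the hypothesis fits x better than its average truth.
semantic_info = np.log2(semantic_channel / logical_prob[:, None])

print(semantic_channel)
print(logical_prob)
print(semantic_info)
```

Each truth function peaks at 1 for the instance its hypothesis fits best, so the semantic channel is insensitive to the source distribution P(x), which is the property the abstract invokes to handle variable sources and class imbalance.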


Related research

09/03/2018  From Bayesian Inference to Logical Bayesian Inference: A New Mathematical Frame for Semantic Communication and Machine Learning
05/02/2018  Semantic Channel and Shannon's Channel Mutually Match for Multi-Label Classification
01/17/2020  Channels' Confirmation and Predictions' Confirmation: from the Medical Test to the Raven Paradox
01/28/2019  The CM Algorithm for the Maximum Mutual Information Classifications of Unseen Instances
03/09/2023  A Theory for Semantic Communications
05/23/2023  Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning
02/28/2019  Bounds on Bayes Factors for Binomial A/B Testing
