A precise bare simulation approach to the minimization of some distances. Foundations

07/04/2021
by Michel Broniatowski, et al.

In information theory – as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing and pattern recognition – many generalizations of the ubiquitous Kullback-Leibler information distance (relative entropy) and of the closely related Shannon entropy have become widely used tools. The main goal of this paper is to tackle the corresponding constrained minimization (respectively, maximization) problems with a newly developed, dimension-free bare (pure) simulation method. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the constraint set are needed, and our method is precise, i.e., it converges in the limit. As a by-product, we also derive an innovative way of constructing new, useful distances/divergences. We present numerous examples to illustrate the core of our approach, and we indicate the potential for widespread applicability; in particular, we provide many recent references for uses of the involved distances/divergences and entropies in various research fields, which may also serve as an interdisciplinary interface.
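To fix notation for the central quantity the abstract refers to, here is a minimal sketch of the discrete Kullback-Leibler divergence (relative entropy) between two probability vectors. This is only the standard textbook definition, not the paper's bare-simulation minimization method; the function name and conventions are my own.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    Assumes p and q are probability vectors (nonnegative, summing to 1)
    with q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0
    by the usual convention 0 * log(0) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # skip zero-probability terms
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, p))  # 0.0 (the divergence of a distribution from itself)
print(kl_divergence(p, q))  # positive, and in general != kl_divergence(q, p)
```

Note that D(p || q) is nonnegative, vanishes iff p = q, and is asymmetric in its arguments, which is why it is called a directed distance rather than a metric.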


Related research:

- A Unifying Framework for Some Directed Distances in Statistics (03/02/2022)
- A Linear Transportation L^p Distance for Pattern Recognition (09/23/2020)
- Relative Entropy, Probabilistic Inference and AI (03/27/2013)
- AI Uncertainty Based on Rademacher Complexity and Shannon Entropy (02/12/2021)
- Re-examination of Bregman functions and new properties of their divergences (03/01/2018)
- Grassmannian Learning: Embedding Geometry Awareness in Shallow and Deep Learning (08/07/2018)
- Reflections on Shannon Information: In search of a natural information-entropy for images (09/05/2016)
