Generalized Twin Gaussian Processes using Sharma-Mittal Divergence

09/26/2014
by Mohamed Elhoseiny, et al.

There has been growing interest in mutual information measures due to their wide range of applications in machine learning and computer vision. In this paper, we present a generalized structured regression framework based on Sharma-Mittal (SM) divergence, a relative entropy measure introduced to the machine learning community in this work. SM divergence is a generalized mutual information measure that subsumes the widely used Rényi, Tsallis, Bhattacharyya, and Kullback-Leibler (KL) relative entropies as special cases. Specifically, we study SM divergence as a cost function in the context of Twin Gaussian Processes (TGP) (Bo and Sminchisescu, 2010), generalizing over the KL divergence without computational penalty. Through a theoretical analysis, we show interesting properties of Sharma-Mittal TGP (SMTGP), covering insights missing from the traditional TGP formulation; we then generalize this theory to SM divergence, of which KL divergence is a special case. Experimentally, we evaluated the proposed SMTGP framework on several datasets. The results show that SMTGP achieves better predictions than KL-based TGP, since it offers a richer class of models through parameters that are learned from the data.
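To see concretely how SM divergence subsumes the other measures, the following sketch (our own illustration, not code from the paper; the grid-based integration and the two test Gaussians are assumptions) evaluates D_{α,β}(p‖q) = [(∫ p^α q^(1-α) dx)^((1-β)/(1-α)) − 1] / (β − 1) for two 1-D densities, and checks the known limits: β → 1 recovers Rényi, β = α recovers Tsallis, and α, β → 1 recovers KL.

```python
import numpy as np

def sharma_mittal(p, q, alpha, beta, dx):
    """Sharma-Mittal divergence D_{alpha,beta}(p || q) on a discretized 1-D grid.

    D_{alpha,beta}(p||q) = ((int p^alpha q^(1-alpha) dx)^((1-beta)/(1-alpha)) - 1) / (beta - 1)

    Special cases: beta -> 1 gives Renyi, beta = alpha gives Tsallis,
    alpha, beta -> 1 gives Kullback-Leibler. Requires alpha != 1, beta != 1.
    """
    # Bhattacharyya-type integral, approximated by a Riemann sum on the grid.
    m = np.sum(p ** alpha * q ** (1.0 - alpha)) * dx
    return (m ** ((1.0 - beta) / (1.0 - alpha)) - 1.0) / (beta - 1.0)

# Two unit-variance Gaussians one standard deviation apart (illustrative choice).
x = np.linspace(-12.0, 13.0, 250_001)
dx = x[1] - x[0]
p = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
q = np.exp(-0.5 * (x - 1.0) ** 2) / np.sqrt(2 * np.pi)

eps = 1e-5
kl_limit   = sharma_mittal(p, q, 1 - eps, 1 - eps, dx)  # analytic KL(p||q) = 0.5
renyi_half = sharma_mittal(p, q, 0.5, 1 - eps, dx)      # analytic Renyi_0.5 = 0.25
tsallis_half = sharma_mittal(p, q, 0.5, 0.5, dx)        # Tsallis divergence of order 0.5
```

For equal-variance Gaussians the analytic values follow from ∫ p^α q^(1-α) dx = exp(−α(1−α)Δμ²/(2σ²)), which gives Rényi_α = αΔμ²/(2σ²); the numerical limits above agree with these closed forms.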


