
Wasserstein Barycenter Model Ensembling

02/13/2019
by Pierre Dognin, et al.

In this paper we propose to perform model ensembling in multiclass or multilabel learning settings using Wasserstein (W.) barycenters. Optimal transport metrics, such as the Wasserstein distance, allow the incorporation of semantic side information such as word embeddings. Using W. barycenters to find the consensus between models lets us balance confidence and semantics when seeking agreement among the models. We show applications of Wasserstein ensembling in attribute-based classification, multilabel learning, and image caption generation. These results show that Wasserstein ensembling is a viable alternative to basic geometric or arithmetic mean ensembling.
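The idea in the abstract can be sketched concretely: each model outputs a probability vector over classes, and the ensemble prediction is the entropic-regularized Wasserstein barycenter of those vectors under a ground cost built from class-label similarity (e.g. word-embedding distances). The sketch below is not the paper's implementation; it is a minimal version of the standard iterative Bregman projection scheme for Sinkhorn barycenters, with all names and parameters chosen here for illustration.

```python
import numpy as np

def wasserstein_barycenter(P, C, reg=0.5, weights=None, n_iter=100):
    """Entropic-regularized Wasserstein barycenter of the rows of P.

    P: (m, n) array; each row is one model's probability vector over n classes.
    C: (n, n) ground-cost matrix, e.g. squared distances between word
       embeddings of the class labels (semantic side information).
    reg: entropic regularization strength (assumed value, not from the paper).
    """
    m, n = P.shape
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights)
    K = np.exp(-C / reg)                 # Gibbs kernel of the ground cost
    v = np.ones((m, n))                  # scaling vectors, one per model
    for _ in range(n_iter):
        u = P / (v @ K.T)                # enforce each model's marginal p_i
        b = np.exp(w @ np.log(u @ K))    # weighted geometric mean = barycenter
        v = b / (u @ K)                  # enforce the shared barycenter marginal
    return b

# Two models that disagree over 3 classes laid out on a line:
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.2, 0.7]])
x = np.arange(3.0)
C = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
b = wasserstein_barycenter(P, C)
```

Unlike an arithmetic mean (which here would stay bimodal-flat), the barycenter moves mass along the geometry of `C`, concentrating agreement on semantically intermediate classes; a larger `reg` yields a smoother, more diffuse consensus.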
