
Wasserstein Barycenter Model Ensembling

by Pierre Dognin, et al.

In this paper we propose to perform model ensembling in a multiclass or a multilabel learning setting using Wasserstein (W.) barycenters. Optimal transport metrics, such as the Wasserstein distance, allow incorporating semantic side information such as word embeddings. Using W. barycenters to find the consensus between models allows us to balance confidence and semantics in finding the agreement between the models. We show applications of Wasserstein ensembling in attribute-based classification, multilabel learning and image captioning generation. These results show that W. ensembling is a viable alternative to basic geometric- or arithmetic-mean ensembling.
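To make the consensus idea concrete, here is a minimal sketch of computing an entropic-regularized (Sinkhorn) barycenter of several models' output distributions via iterative Bregman projections (Benamou et al., 2015). This is an illustrative assumption, not the paper's exact algorithm: the function name, the regularization strength `eps`, and the squared class-index ground cost are all stand-ins; the paper derives its semantic ground costs from word embeddings.

```python
import numpy as np

def sinkhorn_barycenter(dists, cost, weights=None, eps=0.5, n_iter=500):
    """Entropic-regularized Wasserstein barycenter of discrete
    distributions via iterative Bregman projections.

    dists:   (k, n) array, each row a probability vector over n classes
    cost:    (n, n) ground-cost matrix between classes
    weights: (k,) barycentric weights (default: uniform)
    """
    dists = np.asarray(dists, dtype=float)
    k, n = dists.shape
    if weights is None:
        weights = np.full(k, 1.0 / k)
    K = np.exp(-np.asarray(cost, dtype=float) / eps)  # Gibbs kernel
    v = np.ones((k, n))
    for _ in range(n_iter):
        u = dists / (v @ K.T)              # row j is dists[j] / (K @ v[j])
        Ktu = u @ K                        # row j is K.T @ u[j]
        b = np.exp(weights @ np.log(Ktu))  # geometric mean across models
        v = b[None, :] / Ktu
    return b / b.sum()

# Toy example: two classifiers' softmax outputs over 3 ordered classes.
# Squared class-index distance is a hypothetical stand-in for the
# embedding-based semantic costs described in the paper.
p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.6, 0.3, 0.1])
idx = np.arange(3)
cost = (idx[:, None] - idx[None, :]) ** 2.0
bary = sinkhorn_barycenter([p1, p2], cost)
print(bary)  # consensus distribution over the 3 classes
```

With a trivial (uniform) ground cost this consensus degenerates toward a simple average of the inputs; a semantically structured cost is what lets the barycenter move probability mass between related classes rather than only mixing confidences.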
