Wasserstein Barycenter Model Ensembling

02/13/2019
by Pierre Dognin, et al.

In this paper we propose to perform model ensembling in multiclass and multilabel learning settings using Wasserstein (W.) barycenters. Optimal transport metrics such as the Wasserstein distance allow the incorporation of semantic side information, for example word embeddings. Using W. barycenters to find the consensus between models lets us balance confidence and semantics when seeking agreement among the models. We show applications of Wasserstein ensembling in attribute-based classification, multilabel learning, and image caption generation. These results show that W. ensembling is a viable alternative to basic geometric or arithmetic mean ensembling.
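The paper's central object, the Wasserstein barycenter of the models' output distributions, can be sketched with standard optimal-transport machinery. Below is a minimal Python sketch using the entropic-regularized barycenter computed by iterative Bregman projections (Benamou et al., 2015); the function name, toy cost matrix, and parameter choices are illustrative assumptions, not the authors' implementation.

import numpy as np

def wasserstein_barycenter(P, C, weights, eps=0.1, n_iters=200):
    # Entropic-regularized W. barycenter via iterative Bregman projections.
    # P       : (m, n) array, one output distribution per model (rows sum to 1)
    # C       : (n, n) ground-cost matrix, e.g. distances between class embeddings
    # weights : (m,) barycentric weights summing to 1 (uniform for plain ensembling)
    # eps     : entropic regularization; smaller values give sharper barycenters
    K = np.exp(-C / eps)                              # Gibbs kernel
    V = np.ones_like(P)                               # one scaling vector per model
    for _ in range(n_iters):
        U = P / np.maximum((K @ V.T).T, 1e-30)        # match each model's marginal
        # weighted geometric mean across models = current barycenter estimate
        q = np.exp(weights @ np.log(np.maximum((K.T @ U.T).T, 1e-30)))
        V = q[None, :] / np.maximum((K.T @ U.T).T, 1e-30)
    return q / q.sum()

# Toy usage: two classifiers' softmax outputs over three classes.
P = np.array([[0.7, 0.2, 0.1],
              [0.5, 0.3, 0.2]])
C = 1.0 - np.eye(3)   # hypothetical cost; in practice, word-embedding distances
print(wasserstein_barycenter(P, C, weights=np.array([0.5, 0.5])))

When C encodes word-embedding distances, probability mass can flow between semantically close classes, so the consensus reflects semantics as well as model confidence, which is the balance the abstract describes.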


Related research

08/27/2021 · Automatic Text Evaluation through the Lens of Wasserstein Barycenters
A new metric to evaluate text generation based on deep contextualized e...

07/22/2022 · Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching
Measuring the distance between ontological elements is a fundamental com...

04/08/2019 · Pushing the right boundaries matters! Wasserstein Adversarial Training for Label Noise
Noisy labels often occur in vision datasets, especially when they are is...

02/21/2017 · Differential Geometric Retrieval of Deep Features
Comparing images to recommend items from an image-inventory is a subject...

05/08/2019 · Learning Embeddings into Entropic Wasserstein Spaces
Euclidean embeddings of data are fundamentally limited in their ability ...

09/30/2022 · Finding NEEMo: Geometric Fitting using Neural Estimation of the Energy Mover's Distance
A novel neural architecture was recently developed that enforces an exac...

08/25/2020 · Sequences of well-distributed vertices on graphs and spectral bounds on optimal transport
Given a graph G=(V,E), suppose we are interested in selecting a sequence...
