Bayesian Metric Learning for Uncertainty Quantification in Image Retrieval

02/02/2023
by Frederik Warburg, et al.

We propose the first Bayesian encoder for metric learning. Rather than relying on neural amortization as done in prior works, we learn a distribution over the network weights with the Laplace Approximation. We actualize this by first proving that the contrastive loss is a valid log-posterior. We then propose three methods that ensure a positive definite Hessian. Lastly, we present a novel decomposition of the Generalized Gauss-Newton approximation. Empirically, we show that our Laplacian Metric Learner (LAM) estimates well-calibrated uncertainties, reliably detects out-of-distribution examples, and yields state-of-the-art predictive performance.
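To make the high-level idea concrete, below is a minimal, hypothetical sketch (not the authors' code) of a Laplace-style posterior over the last layer of a small embedding network: train with a contrastive loss plus a Gaussian log-prior to obtain a MAP estimate, form a crude diagonal, positive semi-definite curvature estimate as a stand-in for the paper's Generalized Gauss-Newton decomposition, and sample last-layer weights to obtain per-image embedding uncertainties. The architecture, hyperparameters, and squared-gradient curvature below are illustrative assumptions only.

```python
# Hypothetical sketch of a Laplace approximation for a metric-learning encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy data: 2-D inputs with a binary "same class / different class" label.
x = torch.randn(256, 2)
y = (x[:, 0] > 0).long()

class Encoder(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(2, 32), nn.ReLU())
        self.head = nn.Linear(32, dim)       # the Laplace posterior is placed on this layer only
    def forward(self, x):
        return F.normalize(self.head(self.body(x)), dim=-1)

def contrastive_loss(za, zb, same, margin=0.5):
    """Pairwise contrastive loss, read here as an unnormalized negative log-posterior term."""
    d2 = (za - zb).pow(2).sum(-1)
    d = (d2 + 1e-9).sqrt()                   # epsilon avoids a NaN gradient at d = 0
    return (same * d2 + (1 - same) * F.relu(margin - d).pow(2)).mean()

model = Encoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
prior_prec = 1.0                             # Gaussian prior precision on the last layer

# 1) MAP training: contrastive loss plus the log-prior on the head weights.
for step in range(300):
    idx_a = torch.randint(0, 256, (64,))
    idx_b = torch.randint(0, 256, (64,))
    za, zb = model(x[idx_a]), model(x[idx_b])
    same = (y[idx_a] == y[idx_b]).float()
    l2 = sum(p.pow(2).sum() for p in model.head.parameters())
    loss = contrastive_loss(za, zb, same) + 0.5 * prior_prec * l2 / 256
    opt.zero_grad(); loss.backward(); opt.step()

# 2) Diagonal curvature of the loss w.r.t. the last-layer weights: accumulated squared
#    gradients, a crude positive semi-definite substitute for the Generalized Gauss-Newton.
params = list(model.head.parameters())
diag_h = [torch.zeros_like(p) for p in params]
for i in range(0, 256, 64):
    za, zb = model(x[i:i+64]), model(x[i:i+64].flip(0))
    same = (y[i:i+64] == y[i:i+64].flip(0)).float()
    loss = contrastive_loss(za, zb, same)
    grads = torch.autograd.grad(loss, params)
    for h, g in zip(diag_h, grads):
        h += g.pow(2) * 64
post_var = [1.0 / (h + prior_prec) for h in diag_h]   # diagonal posterior covariance

# 3) Monte-Carlo embedding uncertainty: sample last-layer weights from the Laplace
#    posterior and measure the spread of the resulting embeddings.
mean = [p.detach().clone() for p in params]
samples = []
with torch.no_grad():
    for _ in range(20):
        for p, m, v in zip(params, mean, post_var):
            p.copy_(m + v.sqrt() * torch.randn_like(m))
        samples.append(model(x[:8]))
    for p, m in zip(params, mean):
        p.copy_(m)                           # restore the MAP weights
emb_std = torch.stack(samples).std(0).mean(-1)
print("per-image embedding std:", emb_std)
```

In this sketch, images whose embeddings move a lot under weight samples get a large standard deviation, which is the kind of per-query uncertainty signal the abstract refers to for calibration and out-of-distribution detection.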


