Label Calibration for Semantic Segmentation Under Domain Shift

07/20/2023
by Ondrej Bohdal, et al.

The performance of a pre-trained semantic segmentation model is likely to decrease substantially on data from a new domain. We show that a pre-trained model can be adapted to unlabelled target-domain data by computing soft-label prototypes under the domain shift and making predictions according to the prototype closest to the vector of predicted class probabilities. The proposed adaptation procedure is fast, comes almost for free in terms of computational resources, and leads to considerable performance improvements. We demonstrate the benefits of such label calibration on the highly practical synthetic-to-real semantic segmentation problem.
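The abstract's core idea, predicting via the soft-label prototype nearest to each predicted probability vector, can be sketched roughly as follows. This is a minimal illustration under assumptions not stated in the abstract: prototypes are taken here as the mean predicted probability vector over target pixels sharing the same argmax class, and nearness is measured by Euclidean distance; the paper's actual prototype construction and distance may differ.

```python
import numpy as np

def compute_prototypes(probs: np.ndarray, num_classes: int) -> np.ndarray:
    """Estimate one soft-label prototype per class from unlabelled target data.

    probs: (N, C) array of predicted class probabilities for N pixels.
    Assumption: a class prototype is the mean probability vector of the
    pixels currently predicted as that class (not confirmed by the abstract).
    """
    preds = probs.argmax(axis=1)
    prototypes = np.zeros((num_classes, probs.shape[1]))
    for c in range(num_classes):
        mask = preds == c
        if mask.any():
            prototypes[c] = probs[mask].mean(axis=0)
        else:
            # fall back to a one-hot prototype if no pixel is predicted as c
            prototypes[c] = np.eye(num_classes)[c]
    return prototypes

def calibrated_predict(probs: np.ndarray, prototypes: np.ndarray) -> np.ndarray:
    """Relabel each pixel with the class of its nearest prototype."""
    # squared Euclidean distance from each probability vector to each prototype
    dists = ((probs[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)
```

In this sketch the adaptation touches no model weights, which matches the abstract's claim that the procedure is fast and nearly free computationally: only a (C, C) prototype matrix is estimated from target predictions.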

Related research:

- 10/22/2017, Rethinking Convolutional Semantic Segmentation Learning: Deep convolutional semantic segmentation (DCSS) learning doesn't converg...
- 12/22/2022, On Calibrating Semantic Segmentation Models: Analysis and An Algorithm: We study the problem of semantic segmentation calibration. For image cla...
- 08/12/2020, Local Temperature Scaling for Probability Calibration: For semantic segmentation, label probabilities are often uncalibrated as...
- 08/06/2023, Cal-SFDA: Source-Free Domain-adaptive Semantic Segmentation with Differentiable Expected Calibration Error: The prevalence of domain adaptive semantic segmentation has prompted con...
- 02/27/2023, Soft labelling for semantic segmentation: Bringing coherence to label down-sampling: In semantic segmentation, training data down-sampling is commonly perfor...
- 12/21/2021, Distribution-aware Margin Calibration for Semantic Segmentation in Images: The Jaccard index, also known as Intersection-over-Union (IoU), is one o...
