Conditional Bures Metric for Domain Adaptation

07/31/2021
by You-Wei Luo, et al.

As a vital problem in classification-oriented transfer learning, unsupervised domain adaptation (UDA) has attracted widespread attention in recent years. Previous UDA methods assume that the marginal distributions of the domains are shifted while ignoring the discriminative information carried by the label distributions, which degrades classification performance in real applications. In this work, we focus on the conditional distribution shift problem, a central concern for current conditional invariant models, and seek a kernel covariance embedding for conditional distributions, a direction that remains largely unexplored. Theoretically, we propose the Conditional Kernel Bures (CKB) metric to characterize conditional distribution discrepancy, and derive an empirical estimator of the CKB metric that avoids the implicit kernel feature map, providing an interpretable view of the knowledge transfer mechanism. We further establish a consistency theory for the empirical estimator, which gives a theoretical guarantee of convergence. Building on this metric, a conditional distribution matching network is proposed to learn conditional invariant and discriminative features for UDA. Extensive experiments and analysis demonstrate the superiority of the proposed model.
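For background (this is not stated in the abstract above, and the symbols \Sigma_S and \Sigma_T are notation introduced here only for illustration), the classical Bures metric that CKB builds on is the Bures-Wasserstein distance between two positive semi-definite covariance operators, say \Sigma_S and \Sigma_T for the source and target domains; it coincides with the 2-Wasserstein distance between the centered Gaussians N(0, \Sigma_S) and N(0, \Sigma_T):

    d_B^2(\Sigma_S, \Sigma_T) = \mathrm{tr}(\Sigma_S) + \mathrm{tr}(\Sigma_T) - 2\,\mathrm{tr}\!\left[\left(\Sigma_S^{1/2}\,\Sigma_T\,\Sigma_S^{1/2}\right)^{1/2}\right].

The CKB metric of this paper applies a distance of this form to kernel conditional covariance operators; the precise conditional embedding and the kernel-only empirical estimator are defined in the full text rather than in this abstract.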

Related research

03/07/2022 · Maximizing Conditional Independence for Unsupervised Domain Adaptation
Unsupervised domain adaptation studies how to transfer a learner from a ...

02/26/2022 · Generalized Label Shift Correction via Minimum Uncertainty Principle: Theory and Algorithm
As a fundamental problem in machine learning, dataset shift induces a pa...

01/25/2021 · A Unified Joint Maximum Mean Discrepancy for Domain Adaptation
Domain adaptation has received a lot of attention in recent years, and m...

08/24/2020 · Learning Kernel for Conditional Moment-Matching Discrepancy-based Image Classification
Conditional Maximum Mean Discrepancy (CMMD) can capture the discrepancy ...

07/14/2022 · Improved OOD Generalization via Conditional Invariant Regularizer
Recently, generalization on out-of-distribution (OOD) data with correlat...

11/15/2020 · DIRL: Domain-Invariant Representation Learning for Sim-to-Real Transfer
Generating large-scale synthetic data in simulation is a feasible altern...

11/16/2017 · Robust Unsupervised Domain Adaptation for Neural Networks via Moment Alignment
A novel approach for unsupervised domain adaptation for neural networks ...
