
Conditional Bures Metric for Domain Adaptation

by You-Wei Luo, et al.

As a vital problem in classification-oriented transfer, unsupervised domain adaptation (UDA) has attracted widespread attention in recent years. Previous UDA methods assume that the marginal distributions of different domains are shifted while ignoring the discriminant information in the label distributions, which degrades classification performance in real applications. In this work, we focus on the conditional distribution shift problem, which is of great concern to current conditional invariant models. We aim to seek a kernel covariance embedding for conditional distributions, which remains unexplored. Theoretically, we propose the Conditional Kernel Bures (CKB) metric for characterizing conditional distribution discrepancy, and derive an empirical estimator for the CKB metric that does not require the implicit kernel feature map. This provides an interpretable approach to understanding the knowledge transfer mechanism. The established consistency theory of the empirical estimator provides a theoretical guarantee of convergence. A conditional distribution matching network is proposed to learn conditional invariant and discriminative features for UDA. Extensive experiments and analysis show the superiority of the proposed model.
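The abstract does not spell out the metric itself, but the CKB metric builds on the classical Bures distance between covariance operators, B(Σ₁, Σ₂)² = tr(Σ₁) + tr(Σ₂) − 2 tr((Σ₁^{1/2} Σ₂ Σ₁^{1/2})^{1/2}). The following is a minimal sketch of this *unconditional* Bures distance between empirical covariance matrices of source and target features; the paper's CKB metric additionally conditions on labels and works in a kernel feature space, which this toy example does not reproduce. All variable names here are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm

def bures_distance(S1, S2):
    """Bures distance between PSD covariance matrices:
    B(S1, S2)^2 = tr(S1) + tr(S2) - 2 tr((S1^{1/2} S2 S1^{1/2})^{1/2})."""
    root = sqrtm(S1)
    cross = sqrtm(root @ S2 @ root)
    # sqrtm may return tiny imaginary parts for PSD input; keep the real part
    # and clamp small negative round-off before the square root.
    sq = np.trace(S1) + np.trace(S2) - 2.0 * np.real(np.trace(cross))
    return np.sqrt(max(float(np.real(sq)), 0.0))

# Toy source/target features with a covariance (scale) shift.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 5))          # source features
Xt = 2.0 * rng.normal(size=(500, 5))    # target features, scaled
Ss = np.cov(Xs, rowvar=False)
St = np.cov(Xt, rowvar=False)
print(bures_distance(Ss, Ss))  # near zero: identical covariances
print(bures_distance(Ss, St))  # positive: covariance shift detected
```

Intuitively, the distance vanishes when the two covariances coincide and grows with the shift; the paper's conditional variant applies this idea per class-conditional distribution so that discriminant structure is preserved during alignment.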

