Pre-Training Transformers for Domain Adaptation

12/18/2021
by Burhan Ul Tayyab, et al.

The Visual Domain Adaptation Challenge 2021 called for unsupervised domain adaptation methods that could improve model performance by transferring knowledge obtained from source datasets to out-of-distribution target datasets. In this paper, we utilize BeiT [1] and demonstrate its capability to capture key attributes from a source dataset and apply them to a target dataset in a semi-supervised manner. Our method outperformed current state-of-the-art (SoTA) techniques and achieved 1st place on the VisDA Domain Adaptation Challenge with an accuracy (ACC) of 56.29%.
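The abstract mentions applying the source-trained model to the target dataset in a semi-supervised manner. A common realization of this idea is confidence-thresholded pseudo-labeling: the source-trained classifier labels unlabeled target images, and only high-confidence predictions are kept for further training. The sketch below illustrates that selection step with a NumPy softmax over toy logits; the function name, threshold value, and toy data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def pseudo_label(logits, threshold=0.9):
    """Keep only target samples the source-trained model is confident about.

    Illustrative helper, not from the paper: logits would come from a
    source-pretrained classifier (e.g. BeiT) run on unlabeled target images.
    """
    # numerically stable softmax over the class axis
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)          # confidence of the top prediction
    labels = probs.argmax(axis=1)     # predicted class per sample
    mask = conf >= threshold          # keep only confident predictions
    return labels[mask], mask

# toy logits for 4 unlabeled target images over 3 classes
logits = np.array([[5.0, 0.1, 0.2],   # confident -> pseudo-label 0
                   [0.3, 0.4, 0.5],   # uncertain -> dropped
                   [0.1, 6.0, 0.0],   # confident -> pseudo-label 1
                   [1.0, 1.1, 0.9]])  # uncertain -> dropped
labels, keep = pseudo_label(logits, threshold=0.9)
print(labels.tolist(), keep.tolist())  # -> [0, 1] [True, False, True, False]
```

The retained (image, pseudo-label) pairs would then be mixed with labeled source data for another round of fine-tuning; the threshold trades label coverage against label noise.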


