Slimmable Domain Adaptation

by Rang Meng, et al.

Vanilla unsupervised domain adaptation methods optimize a model with a fixed neural architecture, which is impractical in real-world scenarios where target data is processed by devices with varying resource constraints. It is therefore necessary to facilitate architecture adaptation across devices. In this paper, we introduce a simple framework, Slimmable Domain Adaptation, which improves cross-domain generalization with a weight-sharing model bank from which models of different capacities can be sampled to accommodate different accuracy-efficiency trade-offs. The main challenge in this framework is simultaneously boosting the adaptation performance of the numerous models in the bank. To tackle this problem, we develop a Stochastic EnsEmble Distillation method that fully exploits the complementary knowledge in the model bank for inter-model interaction. Moreover, to resolve the optimization conflict between inter-model interaction and intra-model adaptation, we augment the existing bi-classifier domain-confusion architecture into an Optimization-Separated Tri-Classifier counterpart. After optimizing the model bank, architecture adaptation is performed via our proposed Unsupervised Performance Evaluation Metric. Under various resource constraints, our framework surpasses competing approaches by a large margin on multiple benchmarks. It is also worth emphasizing that our framework preserves its improvement over the source-only model even when computational complexity is reduced to 1/64. Code will be available at
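To make the ensemble-distillation idea concrete, here is a minimal, hypothetical sketch of the core computation: a few sub-networks are sampled from the weight-sharing bank, their predictions are averaged into an ensemble teacher, and each sampled model is then pulled toward that teacher with a KL loss. The function names, the toy logits, and the width-multiplier keys are illustrative assumptions, not the paper's actual implementation.

```python
import math
import random

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / max(qi, eps)) for pi, qi in zip(p, q) if pi > 0)

def stochastic_ensemble_distillation(logits_by_width, n_sample=2, seed=0):
    """Sketch of one distillation step: randomly sample a few sub-networks
    from the bank, average their predicted distributions into an ensemble
    teacher, and return each sampled model's KL loss against that teacher
    (the inter-model interaction signal)."""
    rng = random.Random(seed)
    widths = rng.sample(sorted(logits_by_width), min(n_sample, len(logits_by_width)))
    student_probs = {w: softmax(logits_by_width[w]) for w in widths}
    num_classes = len(next(iter(student_probs.values())))
    teacher = [sum(p[i] for p in student_probs.values()) / len(widths)
               for i in range(num_classes)]
    losses = {w: kl_divergence(teacher, p) for w, p in student_probs.items()}
    return teacher, losses

# Toy logits from three sub-networks of the bank (hypothetical width multipliers).
bank = {0.25: [1.0, 0.2, -0.5], 0.5: [1.4, 0.1, -0.8], 1.0: [2.0, -0.3, -1.0]}
teacher, losses = stochastic_ensemble_distillation(bank, n_sample=2, seed=0)
```

In an actual training loop the KL losses would be backpropagated into the shared weights, so that every sampled capacity benefits from the ensemble's complementary knowledge.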
