Model Compression for Domain Adaptation through Causal Effect Estimation

01/18/2021
by   Guy Rotman, et al.

Recent improvements in the predictive quality of natural language processing systems are often dependent on a substantial increase in the number of model parameters. This has led to various attempts to compress such models, but existing methods have not considered the differences in the predictive power of various model components or in the generalizability of the compressed models. To understand the connection between model compression and out-of-distribution generalization, we define the task of compressing language representation models such that they perform best in a domain adaptation setting. We address this problem from a causal perspective, estimating the average treatment effect (ATE) of a model component, such as a single layer, on the model's predictions. Our proposed ATE-guided Model Compression scheme (AMoC) generates many model candidates, differing by the model components that were removed. We then select the best candidate through a stepwise regression model that uses the ATE to predict the expected performance on the target domain. AMoC outperforms strong baselines on 46 of 60 domain pairs across two text classification tasks, with an average improvement of more than 3% in F1 over the strongest baseline.
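To make the pipeline in the abstract concrete, here is a minimal sketch of the overall idea: estimate an ATE-like score for each component by comparing the model's predictions with and without that component, then pick the candidate whose score predicts the best target-domain performance via a simple linear fit. This is an illustration under stated assumptions, not the authors' implementation: `estimate_ate`, the synthetic prediction arrays, and the plain linear fit (standing in for the paper's stepwise regression) are all hypothetical.

```python
import numpy as np

def estimate_ate(probs_full, probs_ablated):
    """Simplified ATE proxy (hypothetical): mean absolute change in
    predicted probabilities when a component is removed."""
    return float(np.mean(np.abs(probs_full - probs_ablated)))

rng = np.random.default_rng(0)

# Synthetic stand-in for the full model's predicted probabilities
# on 100 examples, 2 classes.
probs_full = rng.random((100, 2))

# One candidate per removed layer: perturb the predictions to mimic
# the effect of ablating that layer (purely illustrative).
candidates = []
for layer in range(6):
    probs_ablated = probs_full + rng.normal(0.0, 0.05, probs_full.shape)
    candidates.append((layer, estimate_ate(probs_full, probs_ablated)))

# Selection step: fit target-domain F1 (synthetic here) as a function of
# the ATE, then choose the candidate with the highest predicted F1.
ates = np.array([ate for _, ate in candidates])
f1 = 0.8 - 0.5 * ates + rng.normal(0.0, 0.01, ates.shape)
slope, intercept = np.polyfit(ates, f1, 1)
predicted_f1 = slope * ates + intercept
best_layer = candidates[int(np.argmax(predicted_f1))][0]
```

In the paper, the regression is fit on held-out source/target pairs rather than synthetic scores, and the ATE is one of several predictors in a stepwise model; this sketch only shows the shape of the candidate-generation and selection loop.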


