Progressive Memory Banks for Incremental Domain Adaptation

11/01/2018
by   Nabiha Asghar, et al.

This paper addresses the problem of incremental domain adaptation (IDA). We assume domains arrive one after another and that only data from the current domain is accessible. The goal of IDA is to build a unified model that performs well on all the domains encountered so far. We propose to augment a recurrent neural network (RNN) with a directly parameterized memory bank, which is retrieved by an attention mechanism at each step of the RNN transition. The memory bank provides a natural way to perform IDA: when adapting the model to a new domain, we progressively add new slots to the memory bank, which increases the number of parameters and thus the model capacity. We learn the new memory slots and fine-tune the existing parameters by back-propagation. Experimental results show that our approach achieves significantly better performance than fine-tuning alone, which suffers from catastrophic forgetting. Compared with expanding hidden states, our approach is more robust on old domains, as shown by both empirical and theoretical results. Our model also outperforms previous IDA approaches, including elastic weight consolidation (EWC) and progressive neural networks.
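
The following is a minimal PyTorch sketch of the mechanism as described in the abstract: an RNN whose transition attends over a directly parameterized memory bank, plus a method that appends new trainable slots when a new domain arrives. The class and method names (MemoryAugmentedRNN, add_slots), the use of a GRU cell, and all dimensions are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryAugmentedRNN(nn.Module):
    """RNN augmented with a directly parameterized memory bank (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, num_slots, slot_size):
        super().__init__()
        self.cell = nn.GRUCell(input_size + slot_size, hidden_size)
        # Memory bank: num_slots x slot_size, learned directly as parameters.
        self.memory = nn.Parameter(torch.randn(num_slots, slot_size) * 0.1)
        self.query = nn.Linear(hidden_size, slot_size)

    def add_slots(self, num_new):
        """Progressively grow the memory bank when a new domain arrives."""
        new = torch.randn(num_new, self.memory.size(1), device=self.memory.device) * 0.1
        self.memory = nn.Parameter(torch.cat([self.memory.data, new], dim=0))

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        outputs = []
        for t in range(seq_len):
            # Attention over memory slots, keyed by the previous hidden state.
            q = self.query(h)                                # (batch, slot_size)
            scores = q @ self.memory.t()                     # (batch, num_slots)
            read = F.softmax(scores, dim=-1) @ self.memory   # (batch, slot_size)
            h = self.cell(torch.cat([x[:, t], read], dim=-1), h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)                   # (batch, seq_len, hidden_size)


if __name__ == "__main__":
    model = MemoryAugmentedRNN(input_size=32, hidden_size=64, num_slots=8, slot_size=16)
    out = model(torch.randn(4, 10, 32))
    print(out.shape)           # torch.Size([4, 10, 64])

    # New domain arrives: add slots (capacity grows), then fine-tune all
    # parameters with back-propagation on the new domain's data.
    model.add_slots(4)
    print(model.memory.shape)  # torch.Size([12, 16])
```

In this sketch the added slots simply extend the parameter tensor, so any optimizer would need to be re-created after calling add_slots; how the new slots are initialized and regularized is a design choice the abstract does not specify.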

Related research

07/02/2020: Sequential Domain Adaptation through Elastic Weight Consolidation for Sentiment Analysis
Elastic Weight Consolidation (EWC) is a technique used in overcoming cat...

04/15/2023: Continual Domain Adaptation through Pruning-aided Domain-specific Weight Modulation
In this paper, we propose to develop a method to address unsupervised do...

08/10/2022: Continual Machine Reading Comprehension via Uncertainty-aware Fixed Memory and Adversarial Domain Adaptation
Continual Machine Reading Comprehension aims to incrementally learn from...

12/28/2021: FRIDA – Generative Feature Replay for Incremental Domain Adaptation
We tackle the novel problem of incremental unsupervised domain adaptatio...

05/24/2020: SERIL: Noise Adaptive Speech Enhancement using Regularization-based Incremental Learning
Numerous noise adaptation techniques have been proposed to address the m...

03/03/2016: Multi-domain Neural Network Language Generation for Spoken Dialogue Systems
Moving from limited-domain natural language generation (NLG) to open dom...

05/11/2017: Incremental Learning Through Deep Adaptation
Given an existing trained neural network, it is often desirable to be ab...
