Efficient Transfer Learning Schemes for Personalized Language Modeling using Recurrent Neural Network

01/13/2017
by Seunghyun Yoon et al.

In this paper, we propose efficient transfer learning methods for training a personalized language model using a recurrent neural network with a long short-term memory architecture. With our proposed fast transfer learning schemes, a general language model is updated into a personalized language model using a small amount of user data and limited computing resources. These methods are especially useful in a mobile-device environment, where the data must not leave the device for privacy reasons. Experiments on dialogue data from a drama verify that our transfer learning methods successfully generate a personalized language model whose output is closer to the personal language style in both qualitative and quantitative respects.
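To make the idea concrete, below is a minimal sketch of one common fast-transfer strategy for this setting: start from an LSTM language model pretrained on a general corpus, freeze the embedding and recurrent layers, and fine-tune only the output layer on the user's own text. This is an illustrative assumption, not necessarily the authors' exact scheme; the layer sizes, the file name general_lm.pt, and the user_batches data loader are all hypothetical placeholders.

# Minimal sketch (PyTorch): fast transfer of a general LSTM LM to a user.
# Freezing scheme, sizes, and data names are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))
        return self.out(h)  # next-token logits at every position

vocab_size = 10000  # placeholder vocabulary size
model = LSTMLanguageModel(vocab_size)
# Assume general_lm.pt holds weights pretrained on a large general corpus.
model.load_state_dict(torch.load("general_lm.pt"))

# Freeze the embedding and recurrent layers; only the output layer is
# updated, which keeps personalization cheap enough for on-device training.
for p in model.embed.parameters():
    p.requires_grad = False
for p in model.lstm.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# user_batches would yield (input, target) token tensors built from the
# user's own text, shifted by one position; the data never leaves the device.
def personalize(user_batches, epochs=3):
    model.train()
    for _ in range(epochs):
        for inputs, targets in user_batches:
            logits = model(inputs)
            loss = loss_fn(logits.reshape(-1, vocab_size),
                           targets.reshape(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

Because gradients flow only through the small output layer, each update touches a fraction of the model's parameters, which is what makes this kind of scheme plausible under the paper's mobile, privacy-preserving constraints.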
