Online Representation Learning in Recurrent Neural Language Models

08/16/2015
by Marek Rei, et al.

We investigate an extension of continuous online learning in recurrent neural network language models. The model keeps a separate vector representation of the current unit of text being processed and adaptively adjusts it after each prediction. Initial experiments give promising results, indicating that the method can increase language modelling accuracy while reducing both the number of parameters needed to store the model and the computation required at each step.
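To illustrate the general idea, here is a minimal NumPy sketch of an RNN language model that conditions on a separately maintained document vector and adjusts that vector with a single gradient step after each prediction. All names (doc_vec, eta_doc, the network sizes, and the one-step update without backpropagation through time) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H, D = 50, 16, 8                # vocab size, hidden size, doc-vector size

# Static model parameters (assumed to have been trained offline).
E  = rng.normal(0, 0.1, (V, H))    # word embeddings
Wh = rng.normal(0, 0.1, (H, H))    # hidden-to-hidden weights
Wd = rng.normal(0, 0.1, (D, H))    # doc-vector-to-hidden weights
Wo = rng.normal(0, 0.1, (H, V))    # hidden-to-output weights

eta_doc = 0.1                      # online learning rate for the doc vector

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def run_sequence(tokens):
    """Process one unit of text, adapting doc_vec after every prediction."""
    h = np.zeros(H)
    doc_vec = np.zeros(D)          # fresh representation for this unit of text
    nll = 0.0
    for t in range(len(tokens) - 1):
        x, y = tokens[t], tokens[t + 1]
        # Hidden state conditioned on the current word AND the doc vector.
        h = np.tanh(E[x] + h @ Wh + doc_vec @ Wd)
        p = softmax(h @ Wo)
        nll -= np.log(p[y])
        # Online update: one gradient step on doc_vec only, raising the
        # probability of the word that was actually observed next.
        dlogits = p.copy(); dlogits[y] -= 1.0   # d(NLL)/d(logits)
        da = (dlogits @ Wo.T) * (1.0 - h * h)   # backprop through tanh
        doc_vec -= eta_doc * (da @ Wd.T)        # adjust the representation
    return nll / (len(tokens) - 1)

tokens = rng.integers(0, V, size=30)
print("mean NLL:", run_sequence(tokens))
```

Because only the small doc vector is updated at test time while the main network weights stay fixed, the adaptation adds little per-step computation and no extra stored parameters beyond the vector itself, consistent with the efficiency claim in the abstract.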
