A Fixed-Size Encoding Method for Variable-Length Sequences with its Application to Neural Network Language Models

05/06/2015
by Shiliang Zhang, et al.

In this paper, we propose a new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE models word order through a simple ordinally-forgetting mechanism that weights each word according to its position in the sequence. In this work, we apply FOFE to feedforward neural network language models (FNN-LMs). Experimental results show that, without using any recurrent feedback, FOFE-based FNN-LMs can significantly outperform not only standard fixed-input FNN-LMs but also the popular RNN-LMs.
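Concretely, the FOFE code of a sequence follows the recursion z_t = α·z_{t−1} + e_t, where e_t is the one-hot vector of the t-th word and 0 < α < 1 is a constant forgetting factor. The sketch below is a minimal NumPy illustration of that recursion; the function name fofe_encode, the choice α = 0.7, and the toy three-word vocabulary are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """FOFE-encode a variable-length word-id sequence into one fixed-size vector.

    Implements z_t = alpha * z_{t-1} + e_t, where e_t is the one-hot vector
    of the t-th word and alpha is the forgetting factor (0 < alpha < 1).
    """
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z   # decay the contribution of all earlier words
        z[w] += 1.0     # add the one-hot vector of the current word
    return z

# Toy vocabulary {0: "A", 1: "B", 2: "C"}: "ABC" and "CBA" receive distinct
# codes, so word order survives the fixed-size encoding.
print(fofe_encode([0, 1, 2], vocab_size=3))  # [0.49, 0.7, 1.0]
print(fofe_encode([2, 1, 0], vocab_size=3))  # [1.0, 0.7, 0.49]
```

As shown in the paper, this code is provably unique for 0 < α ≤ 0.5 and almost unique for 0.5 < α < 1; in an FNN-LM, the FOFE code of the full history simply replaces the usual fixed window of one-hot word inputs.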
