When FastText Pays Attention: Efficient Estimation of Word Representations using Constrained Positional Weighting

04/19/2021
by   Vít Novotný, et al.

Since the seminal work of Mikolov et al. (2013a) and Bojanowski et al. (2017), word representations of shallow log-bilinear language models have found their way into many NLP applications. Mikolov et al. (2018) introduced a positional log-bilinear language model, which has characteristics of an attention-based language model and which reached state-of-the-art performance on the intrinsic word analogy task. However, the positional model has never been evaluated on qualitative criteria or extrinsic tasks, and it is impractically slow to train. We outline the similarities between the attention mechanism and the positional model, and we propose a constrained positional model, which adapts the sparse attention mechanism of Dai et al. (2018). We evaluate the positional and constrained positional models on three novel qualitative criteria and on the extrinsic language modeling task of Botha and Blunsom (2014). We show that the positional and constrained positional models contain interpretable information about word order and outperform the subword model of Bojanowski et al. (2017) on language modeling. We also show that the constrained positional model outperforms the positional model on language modeling and is twice as fast.
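To make the idea concrete, the following is a minimal numpy sketch of positional weighting in a log-bilinear CBOW-style model, in the spirit of Mikolov et al. (2018): each context word's vector is reweighted elementwise by a learned vector for its position relative to the target word before averaging. All parameter names and sizes here are illustrative assumptions, not the authors' implementation, and the "constrained" variant below is only a crude stand-in for the paper's actual constraint on the positional vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8     # embedding dimensionality (toy size)
window = 2  # context positions: -2, -1, +1, +2

# Hypothetical parameters for one 4-word context window.
context_vectors = rng.normal(size=(2 * window, dim))      # u_{w_p}, one per context word
positional_vectors = rng.normal(size=(2 * window, dim))   # d_p, one per relative position

# Positional model: the hidden representation is the average over positions p
# of the elementwise product d_p * u_{w_p}, so each position can amplify or
# suppress individual embedding coordinates -- loosely, a fixed attention
# pattern over positions.
h_positional = (positional_vectors * context_vectors).mean(axis=0)

# Constrained variant (sketch only): let only the first few coordinates of
# each positional vector vary with position and fix the rest to 1, reducing
# the number of position-dependent parameters that must be trained.
constrained = np.ones_like(positional_vectors)
constrained[:, :3] = positional_vectors[:, :3]
h_constrained = (constrained * context_vectors).mean(axis=0)

print(h_positional.shape, h_constrained.shape)  # both (8,)
```

In the constrained sketch, coordinates that are fixed to 1 reduce to a plain average of the context vectors, which is exactly the subword-style CBOW baseline; only the position-sensitive coordinates pay the extra cost.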
