Unsupervised Learning of Word-Sequence Representations from Scratch via Convolutional Tensor Decomposition

06/10/2016
by Furong Huang, et al.

Unsupervised extraction of text embeddings is crucial for text understanding in machine learning. Word2Vec and its variants have achieved substantial success in mapping words with similar syntactic or semantic meanings to nearby vectors. However, extracting context-aware word-sequence embeddings remains a challenging task. Training over a large corpus is difficult because labels are hard to obtain. More importantly, it is challenging for pre-trained models to produce word-sequence embeddings that are universally good for all downstream tasks or for any new dataset. We propose a two-phase ConvDic+DeconvDec framework that combines a word-sequence dictionary learning model with a word-sequence embedding decoding model. In the learning phase, a convolutional tensor decomposition mechanism learns a good word-sequence (phrase) dictionary; it is provably more accurate and far more efficient than the popular alternating minimization method. In the decoding phase, a deconvolution framework extracts the embeddings and is immune to the problem of varying sentence lengths. The word-sequence embeddings extracted with ConvDic+DeconvDec perform well across all of the downstream tasks we evaluate, and the framework requires neither pre-training nor prior/outside information.
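The abstract gives no pseudocode, so the following is a minimal, hypothetical sketch of the decoding phase only, under stated assumptions: the dictionary (filters) is taken as given rather than learned by the paper's convolutional tensor decomposition, and a plain ISTA solver stands in for the deconvolutional decoder. All function names (reconstruct, deconv_decode), shapes, and hyperparameters are illustrative assumptions, not the authors' API. The point it illustrates is why activation maps that track the input length make the embedding immune to varying sentence lengths.

```python
import numpy as np

rng = np.random.default_rng(0)

def reconstruct(filters, codes):
    """Sum of 1-D convolutions of each activation map with its filter."""
    x_hat = np.zeros(codes.shape[1])
    for f, a in zip(filters, codes):
        x_hat += np.convolve(a, f, mode="same")
    return x_hat

def deconv_decode(x, filters, lam=0.1, lr=0.05, steps=300):
    """ISTA-style sparse decoding (an assumed stand-in for DeconvDec):
    the activation maps have the same length as the input, so sentences
    of any length are handled identically."""
    k, n = len(filters), len(x)
    codes = np.zeros((k, n))
    for _ in range(steps):
        r = reconstruct(filters, codes) - x  # reconstruction residual
        for j, f in enumerate(filters):
            # gradient of 0.5*||residual||^2 w.r.t. codes[j] is the
            # correlation of the residual with the (odd-length) filter
            codes[j] -= lr * np.convolve(r, f[::-1], mode="same")
        # soft-thresholding enforces sparse phrase activations
        codes = np.sign(codes) * np.maximum(np.abs(codes) - lr * lam, 0.0)
    return codes

# Toy "sentence": a 1-D feature signal built from two planted phrase templates.
w = 5                                    # filter (phrase template) width, odd
true_filters = rng.standard_normal((2, w))
true_codes = np.zeros((2, 40))
true_codes[0, 7] = 1.5                   # phrase 0 occurs at position 7
true_codes[1, 25] = -2.0                 # phrase 1 occurs at position 25
x = reconstruct(true_filters, true_codes)

codes = deconv_decode(x, true_filters)
embedding = np.abs(codes).max(axis=1)    # pooling over positions -> fixed-size vector
print(embedding)
```

Pooling the activation maps over positions (here, a max) is one assumed way to obtain a fixed-size embedding regardless of sequence length; the paper's actual aggregation may differ.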

