Automated Concatenation of Embeddings for Structured Prediction

10/10/2020
by Xinyu Wang, et al.

Pretrained contextualized embeddings are powerful word representations for structured prediction tasks. Recent work found that better word representations can be obtained by concatenating different types of embeddings. However, the selection of embeddings to form the best concatenated representation usually varies depending on the task and the collection of candidate embeddings, and the ever-increasing number of embedding types makes it a more difficult problem. In this paper, we propose Automated Concatenation of Embeddings (ACE) to automate the process of finding better concatenations of embeddings for structured prediction tasks, based on a formulation inspired by recent progress on neural architecture search. Specifically, a controller alternately samples a concatenation of embeddings, according to its current belief of the effectiveness of individual embedding types in consideration for a task, and updates the belief based on a reward. We follow strategies in reinforcement learning to optimize the parameters of the controller and compute the reward based on the accuracy of a task model, which is fed with the sampled concatenation as input and trained on a task dataset. Empirical results on 6 tasks and 23 datasets show that our approach outperforms strong baselines and achieves state-of-the-art performance with fine-tuned embeddings in the vast majority of evaluations.
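To make the controller loop concrete, the following is a minimal, hypothetical sketch of the procedure the abstract describes: the controller keeps one selection probability per candidate embedding type, samples a concatenation as a binary mask, receives a reward, and updates its belief with a REINFORCE-style policy-gradient step. The embedding names, the toy evaluate() reward, the learning rate, and the moving-average baseline are illustrative assumptions, not the paper's exact formulation; in ACE the reward comes from the accuracy of a task model trained on the sampled concatenation.

```python
# Hypothetical sketch of a controller that searches over embedding concatenations.
# Candidate names, reward, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Candidate embedding types the controller chooses among (illustrative names).
candidates = ["bert", "elmo", "flair", "fasttext", "char"]

# Controller belief: one logit per embedding type; a sigmoid gives the
# probability of including that embedding in the concatenation.
logits = np.zeros(len(candidates))
lr = 0.1
baseline = 0.0  # moving-average baseline to reduce reward variance


def sample_mask(logits):
    """Sample a binary selection (which embeddings to concatenate)."""
    probs = 1.0 / (1.0 + np.exp(-logits))
    mask = (rng.random(len(probs)) < probs).astype(float)
    if mask.sum() == 0:  # always keep at least one embedding
        mask[rng.integers(len(mask))] = 1.0
    return mask, probs


def evaluate(mask):
    """Placeholder for training the task model on the sampled concatenation
    and returning its dev accuracy. Here: a toy reward that prefers the
    first two embeddings, purely for demonstration."""
    return float(mask[0] + mask[1]) / 2.0 + 0.05 * rng.standard_normal()


for step in range(200):
    mask, probs = sample_mask(logits)
    accuracy = evaluate(mask)
    reward = accuracy - baseline                 # centered reward
    baseline = 0.9 * baseline + 0.1 * accuracy   # update moving baseline
    # REINFORCE-style gradient of the log-probability of the sampled mask
    # for independent Bernoulli selections: d/dlogit log p = mask - prob.
    logits += lr * (mask - probs) * reward

selected = [c for c, l in zip(candidates, logits) if l > 0]
print("embeddings favoured by the controller:", selected)
```

In practice the evaluate step is the expensive part, since each sampled concatenation requires training the structured prediction model on the task dataset before a reward can be computed.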

Related research

07/06/2021 · Enhanced Universal Dependency Parsing with Automated Concatenation of Embeddings
This paper describes the system used in submission from SHANGHAITECH tea...

09/28/2017 · Structured Embedding Models for Grouped Data
Word embeddings are a powerful approach for analyzing language, and expo...

10/23/2020 · Adversarial Learning of Feature-based Meta-Embeddings
Certain embedding types outperform others in different scenarios, e.g., ...

08/18/2020 · NASE: Learning Knowledge Graph Embedding for Link Prediction via Neural Architecture Search
Link prediction is the task of predicting missing connections between en...

04/15/2017 · MUSE: Modularizing Unsupervised Sense Embeddings
This paper proposes to address the word sense ambiguity issue in an unsu...

12/08/2016 · Entity Identification as Multitasking
Standard approaches in entity identification hard-code boundary detectio...
