Towards Open-Text Semantic Parsing via Multi-Task Learning of Structured Embeddings

07/19/2011
by Antoine Bordes, et al.

Open-text (or open-domain) semantic parsers are designed to interpret any statement in natural language by inferring a corresponding meaning representation (MR). Unfortunately, large-scale systems cannot be easily machine-learned due to a lack of directly supervised data. We propose here a method that learns to assign MRs to a wide range of text (using a dictionary of more than 70,000 words, which are mapped to more than 40,000 entities) thanks to a training scheme that combines learning from WordNet and ConceptNet with learning from raw text. The model learns structured embeddings of words, entities, and MRs via a multi-task training process that operates on these diverse sources of data and integrates all the learnt knowledge into a single system. This work thus combines methods for knowledge acquisition, semantic parsing, and word-sense disambiguation. Experiments on various tasks indicate that our approach is indeed successful and can form a basis for future, more sophisticated systems.
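
The abstract describes the model only at a high level. As a concrete illustration, the sketch below shows the kind of structured-embedding setup the approach builds on: a single embedding table for words/entities shared across all training sources (WordNet, ConceptNet, raw text), a pair of linear operators per relation type, and a margin-based ranking objective that scores observed triples below corrupted ones. This is a minimal PyTorch sketch under those assumptions, not the authors' implementation; all sizes, names, and hyperparameters here are illustrative.

```python
import torch
import torch.nn as nn

class StructuredEmbedding(nn.Module):
    def __init__(self, n_entities=40_000, n_relations=20, dim=50):
        super().__init__()
        # One embedding table shared by all tasks (WordNet, ConceptNet, raw text).
        self.entities = nn.Embedding(n_entities, dim)
        # Each relation owns its own pair of dim x dim linear operators.
        self.rel_left = nn.Embedding(n_relations, dim * dim)
        self.rel_right = nn.Embedding(n_relations, dim * dim)
        self.dim = dim

    def energy(self, lhs, rel, rhs):
        """L1 energy of a triple (lhs, rel, rhs); lower means more plausible."""
        d = self.dim
        e_l = self.entities(lhs).unsqueeze(-1)                # (B, d, 1)
        e_r = self.entities(rhs).unsqueeze(-1)                # (B, d, 1)
        W_l = self.rel_left(rel).view(-1, d, d)               # (B, d, d)
        W_r = self.rel_right(rel).view(-1, d, d)              # (B, d, d)
        return (W_l @ e_l - W_r @ e_r).abs().sum(dim=(1, 2))  # (B,)

model = StructuredEmbedding()
opt = torch.optim.SGD(model.parameters(), lr=0.01)

# Margin ranking step: an observed triple should have lower energy than a
# corrupted one (here the rhs entity of the positive triple is replaced).
pos = (torch.tensor([3]), torch.tensor([0]), torch.tensor([7]))
neg = (torch.tensor([3]), torch.tensor([0]), torch.tensor([123]))

loss = torch.clamp(1.0 + model.energy(*pos) - model.energy(*neg), min=0).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

In this reading, multi-task learning amounts to drawing positive/corrupted pairs from the different data sources in turn while updating the same shared parameters, which is what ties the knowledge from WordNet, ConceptNet, and raw text into one model.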

Related research

10/21/2020 · Open-Domain Frame Semantic Parsing Using Transformers
Frame semantic parsing is a complex problem which includes multiple unde...

06/08/2021 · One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets
Semantic parsers map natural language utterances to meaning representati...

06/14/2017 · Transfer Learning for Neural Semantic Parsing
The goal of semantic parsing is to map natural language to a machine int...

06/27/2019 · Compositional Semantic Parsing Across Graphbanks
Most semantic parsers that map sentences to graph-based meaning represen...

06/04/2019 · Multi-Task Semantic Dependency Parsing with Policy Gradient for Learning Easy-First Strategies
In Semantic Dependency Parsing (SDP), semantic relations form directed a...

10/21/2019 · On Semi-Supervised Multiple Representation Behavior Learning
We propose a novel paradigm of semi-supervised learning (SSL)–the semi-s...

10/01/2015 · A Generative Model of Words and Relationships from Multiple Sources
Neural language models are a powerful tool to embed words into semantic ...
