Semantic Representation and Inference for NLP

06/15/2021
by   Dongsheng Wang, et al.

Semantic representation and inference are essential for Natural Language Processing (NLP). The state of the art for semantic representation and inference is deep learning, in particular Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), and Transformer self-attention models. This thesis investigates the use of deep learning for novel semantic representation and inference, and makes contributions in three areas: creating training data, improving semantic representations, and extending inference learning. In terms of creating training data, we contribute the largest publicly available dataset of real-life factual claims for the purpose of automatic claim verification (MultiFC), and we present a novel inference model composed of multi-scale CNNs with different kernel sizes that learn from external sources to infer fact-checking labels. In terms of improving semantic representations, we contribute a novel model that captures non-compositional semantic indicators. By definition, the meaning of a non-compositional phrase cannot be inferred from the individual meanings of its composing words (e.g., "hot dog"). Motivated by this, we operationalize the compositionality of a phrase contextually by enriching the phrase representation with external word embeddings and knowledge graphs. Finally, in terms of inference learning, we propose a series of novel deep learning architectures that improve inference by using syntactic dependencies, ensembling role-guided attention heads, incorporating gating layers, and concatenating multiple heads in novel and effective ways. This thesis consists of seven publications (five published and two under review).
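To make the multi-scale CNN idea concrete: the abstract describes convolutions with several kernel sizes run in parallel over a token sequence, so that n-gram features of different widths are captured simultaneously. The sketch below is a minimal NumPy illustration of that general pattern, not the thesis's actual MultiFC model; all shapes, kernel sizes, and function names here are invented for illustration.

```python
import numpy as np

def conv1d_valid(x, kernel):
    """Valid 1-D convolution over a (seq_len, dim) embedding matrix.

    kernel has shape (k, dim, out): a window of k token embeddings
    is mapped to `out` feature channels at each position.
    """
    seq_len, dim = x.shape
    k, _, out = kernel.shape
    result = np.empty((seq_len - k + 1, out))
    for i in range(seq_len - k + 1):
        result[i] = np.tensordot(x[i:i + k], kernel, axes=([0, 1], [0, 1]))
    return result

def multi_scale_features(x, kernels):
    """Run one convolution per kernel size, max-pool each feature map
    over time, and concatenate the pooled vectors."""
    pooled = [conv1d_valid(x, k).max(axis=0) for k in kernels]
    return np.concatenate(pooled)

rng = np.random.default_rng(0)
x = rng.normal(size=(10, 8))                                 # 10 tokens, 8-dim embeddings
kernels = [rng.normal(size=(ks, 8, 4)) for ks in (2, 3, 4)]  # three kernel sizes
feats = multi_scale_features(x, kernels)
print(feats.shape)  # (12,) = 3 scales x 4 channels each
```

In a full model, `feats` would feed a classifier that predicts the fact-checking label; here the example only shows how differently sized kernels yield one fixed-length vector regardless of sequence length.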


