Concept-Based Embeddings for Natural Language Processing

07/15/2018
by   Yukun Ma, et al.
In this work, we focus on effectively leveraging and integrating concept-level and word-level information by projecting concepts and words into a lower-dimensional space while retaining the most critical semantics. Within the broader context of an opinion-understanding system, we investigate the use of the fused embeddings for several core NLP tasks: named-entity detection and classification, automatic speech recognition reranking, and targeted sentiment analysis.
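The abstract does not spell out the fusion mechanism, but the idea of combining word- and concept-level vectors via a projection into a shared lower-dimensional space can be sketched as follows. This is a minimal illustration only: the dimensions, the concatenation step, and the projection matrix `W` are assumptions (in the actual model the projection would be learned, not random).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-trained embeddings (dimensions chosen for illustration).
word_dim, concept_dim, fused_dim = 300, 100, 50
word_vec = rng.normal(size=word_dim)        # e.g. a word-level vector
concept_vec = rng.normal(size=concept_dim)  # e.g. a concept-level vector

# Concatenate the word- and concept-level vectors, then project them
# into a shared lower-dimensional space. In practice W would be learned
# jointly with the downstream task; here it is random for illustration.
combined = np.concatenate([word_vec, concept_vec])
W = rng.normal(size=(fused_dim, word_dim + concept_dim))
fused = W @ combined

print(fused.shape)  # (50,)
```

The fused vector can then serve as the input representation for the downstream tasks listed above (entity detection, ASR reranking, targeted sentiment analysis).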

