Character-level Intra Attention Network for Natural Language Inference

07/24/2017
by Han Yang, et al.

Natural language inference (NLI) is a central problem in language understanding. End-to-end artificial neural networks have recently reached state-of-the-art performance on NLI. In this paper, we propose the Character-level Intra Attention Network (CIAN) for the NLI task. In our model, a character-level convolutional network replaces the standard word-embedding layer, and intra attention captures intra-sentence semantics. The proposed CIAN model yields improved results on the newly published MultiNLI (MNLI) corpus.
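To make the two components concrete, below is a minimal PyTorch sketch of a character-level convolutional embedding followed by an intra-attention layer. The names (CharCNNEmbedding, IntraAttention), layer sizes, and the particular attention formulation are illustrative assumptions, not the paper's exact architecture.

    # Illustrative sketch only; hyperparameters and attention form are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CharCNNEmbedding(nn.Module):
        """Builds a word representation by convolving over its characters,
        replacing a standard word-embedding lookup."""
        def __init__(self, n_chars=100, char_dim=16, out_dim=64, kernel_size=3):
            super().__init__()
            self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
            self.conv = nn.Conv1d(char_dim, out_dim, kernel_size, padding=1)

        def forward(self, char_ids):
            # char_ids: (batch, seq_len, word_len) integer character ids
            b, s, w = char_ids.shape
            x = self.char_emb(char_ids.view(b * s, w))   # (b*s, w, char_dim)
            x = self.conv(x.transpose(1, 2))             # (b*s, out_dim, w)
            x = F.relu(x).max(dim=2).values              # max-over-time pooling
            return x.view(b, s, -1)                      # (b, s, out_dim)

    class IntraAttention(nn.Module):
        """Lets each token attend to all tokens of the same sentence,
        capturing intra-sentence semantics."""
        def __init__(self, dim):
            super().__init__()
            self.proj = nn.Linear(dim, dim)

        def forward(self, h):
            # h: (batch, seq_len, dim) token representations
            scores = torch.bmm(self.proj(h), h.transpose(1, 2))  # (b, s, s)
            weights = F.softmax(scores, dim=-1)
            context = torch.bmm(weights, h)           # attended sentence summary
            return torch.cat([h, context], dim=-1)    # (b, s, 2*dim)

    # Usage: embed character ids of two sentences, then apply intra attention.
    chars = torch.randint(1, 100, (2, 10, 12))  # batch=2, 10 words, 12 chars each
    emb = CharCNNEmbedding()(chars)             # (2, 10, 64)
    out = IntraAttention(64)(emb)               # (2, 10, 128)

In this sketch each word vector is produced purely from its characters, so the model has no fixed word vocabulary; the intra-attention output concatenates each token with its attended context before any sentence-pair comparison.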


Related research

06/06/2016
A Decomposable Attention Model for Natural Language Inference
We propose a simple neural architecture for natural language inference. ...

08/04/2017
Recurrent Neural Network-Based Sentence Encoder with Gated Attention for Natural Language Inference
The RepEval 2017 Shared Task aims to evaluate natural language understan...

05/21/2018
Character-based Neural Networks for Sentence Pair Modeling
Sentence pair modeling is critical for many NLP tasks, such as paraphras...

06/04/2020
Affective Conditioning on Hierarchical Networks applied to Depression Detection from Transcribed Clinical Interviews
In this work we propose a machine learning model for depression detectio...

09/02/2016
Skipping Word: A Character-Sequential Representation based Framework for Question Answering
Recent works using artificial neural networks based on word distributed ...

10/14/2022
One Graph to Rule them All: Using NLP and Graph Neural Networks to analyse Tolkien's Legendarium
Natural Language Processing and Machine Learning have considerably advan...

09/19/2017
Neural Networks for Text Correction and Completion in Keyboard Decoding
Despite the ubiquity of mobile and wearable text messaging applications,...
