Dependency-based Convolutional Neural Networks for Sentence Embedding

07/07/2015
by Mingbo Ma, et al.

In sentence modeling and classification, convolutional neural network approaches have recently achieved state-of-the-art results, but all such efforts process word vectors sequentially and neglect long-distance dependencies. To combine deep learning with linguistic structure, we propose a tree-based convolutional neural network model that captures various long-distance relationships between words. Our model improves over sequential baselines on all three sentiment and question classification tasks, and achieves the highest published accuracy on TREC.
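As a rough illustration (not the authors' implementation), the sketch below shows one way a dependency-based convolution could work: instead of sliding a filter over sequential n-grams, each convolution window is a word together with its ancestors on the dependency tree, so long-distance head-modifier relations fall inside a single window. The function names (dep_conv, ancestor_window), the head-index convention (root points to itself), and the ancestor-only window definition are assumptions made for this sketch.

```python
# Minimal sketch of dependency-based convolution with max-over-time pooling.
# Assumptions: word vectors and a dependency parse are given; the convolution
# window for word i is i plus its k-1 ancestors (head, grandparent, ...).
import numpy as np

def ancestor_window(heads, i, k):
    """Indices of word i and its k-1 dependency ancestors.
    heads[i] is the index of word i's head; the root points to itself."""
    window = [i]
    for _ in range(k - 1):
        i = heads[i]
        window.append(i)
    return window

def dep_conv(embeddings, heads, filters, k=3):
    """Dependency-based convolution followed by max-over-time pooling.

    embeddings: (n_words, d) word vectors
    heads:      length-n list of head indices from a dependency parse
    filters:    (n_filters, k * d) filters applied to each ancestor window
    Returns a (n_filters,) sentence representation.
    """
    n, d = embeddings.shape
    feature_maps = []
    for i in range(n):
        window = ancestor_window(heads, i, k)
        x = np.concatenate([embeddings[j] for j in window])  # (k*d,)
        feature_maps.append(np.tanh(filters @ x))            # (n_filters,)
    return np.max(np.stack(feature_maps), axis=0)            # max pooling

# Toy usage: 5 words, 4-dim embeddings, heads from a made-up parse.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 4))
heads = [2, 2, 2, 2, 3]                 # word 2 is the root
filters = rng.normal(size=(8, 3 * 4))
print(dep_conv(emb, heads, filters).shape)  # (8,)
```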


Related research:

04/05/2015  Discriminative Neural Sentence Modeling by Tree-Based Convolution
This paper proposes a tree-based convolutional neural network (TBCNN) fo...

01/19/2018  What Does a TextCNN Learn?
TextCNN, the convolutional neural network for text, is a useful deep lea...

06/05/2017  Deep learning for extracting protein-protein interactions from biomedical literature
State-of-the-art methods for protein-protein interaction (PPI) extractio...

11/20/2016  LCNN: Lookup-based Convolutional Neural Network
Porting state of the art deep learning algorithms to resource constraine...

06/29/2020  Switchblade – a Neural Network for Hard 2D Tasks
Convolutional neural networks have become the main tools for processing ...

08/25/2020  JokeMeter at SemEval-2020 Task 7: Convolutional humor
This paper describes our system that was designed for Humor evaluation w...

09/06/2017  Phylogenetic Convolutional Neural Networks in Metagenomics
Background: Convolutional Neural Networks can be effectively used only w...
