A Token-wise CNN-based Method for Sentence Compression

09/23/2020
by Weiwei Hou, et al.

Sentence compression is a Natural Language Processing (NLP) task that shortens sentences while preserving their key information. Its applications benefit many fields; for example, it can power tools for language education. However, current methods are largely based on Recurrent Neural Network (RNN) models, which suffer from slow processing speed. To address this issue, we propose a token-wise Convolutional Neural Network (CNN) model that uses pre-trained Bidirectional Encoder Representations from Transformers (BERT) features for deletion-based sentence compression. We also compare our model with RNN-based models and with fine-tuned BERT. Although one of the RNN-based models marginally outperforms the other models given the same input, our CNN-based model is ten times faster than the RNN-based approach.
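The core idea can be sketched as a per-token binary classifier: a 1D convolution slides over pre-computed token features, and each token receives a keep/delete label from its local context window. The sketch below is illustrative only; the dimensions, weights, and random stand-in features are assumptions, not the paper's actual architecture or trained parameters (real BERT features would come from a pre-trained encoder).

```python
import numpy as np

# Minimal sketch of a token-wise CNN for deletion-based sentence
# compression, assuming per-token features from a pre-trained encoder.
# Random weights and random stand-in "BERT" features are used here
# purely for illustration; sizes are hypothetical.

rng = np.random.default_rng(0)

seq_len, feat_dim, hidden, kernel = 8, 768, 16, 3   # hypothetical sizes
tokens = rng.normal(size=(seq_len, feat_dim))       # stand-in for BERT features

# 1D convolution over the token axis: each output position sees a
# window of `kernel` neighbouring tokens, so the keep/delete decision
# for each token is made from its local context (the token-wise view).
W_conv = rng.normal(size=(kernel * feat_dim, hidden)) * 0.01
W_out = rng.normal(size=(hidden, 2)) * 0.01         # 2 classes: keep / delete

pad = kernel // 2
padded = np.pad(tokens, ((pad, pad), (0, 0)))       # same-length output
windows = np.stack([padded[i:i + kernel].ravel() for i in range(seq_len)])

hidden_act = np.maximum(windows @ W_conv, 0.0)      # ReLU
logits = hidden_act @ W_out                         # shape (seq_len, 2)
labels = logits.argmax(axis=1)                      # 0 = keep, 1 = delete

print(labels.shape)                                 # one decision per token
```

Because every token's label depends only on a fixed-size window, all positions can be computed in parallel, which is the source of the speed advantage over sequential RNN decoding described in the abstract.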


