Predicting the Semantic Textual Similarity with Siamese CNN and LSTM

10/24/2018
by Elvys Linhares Pontes, et al.

Semantic Textual Similarity (STS) underpins many applications in Natural Language Processing (NLP). Our system combines convolutional and recurrent neural networks to measure the semantic similarity of sentences: a convolutional network accounts for the local context of words, and an LSTM captures the global context of the sentence. This combination of networks helps preserve the relevant information of sentences and improves the calculation of inter-sentence similarity. Our model achieves good results and is competitive with the best state-of-the-art systems.
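The abstract describes a pipeline of convolution over word embeddings (local context), an LSTM over the convolved features (global context), and a similarity score between the resulting sentence vectors. The sketch below illustrates that flow with untrained NumPy weights; the layer sizes, the toy vocabulary, and the cosine-similarity output are illustrative assumptions, not the authors' actual configuration or trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB, FILT, KERNEL, HID = 16, 32, 3, 24  # hypothetical layer sizes

# Toy vocabulary and random (untrained) embedding table.
vocab = {w: i for i, w in enumerate(
    "the cat sat on mat a dog ran in park".split())}
E = rng.normal(scale=0.1, size=(len(vocab), EMB))

# Convolution filters: each filter looks at a KERNEL-word window,
# modeling the local context of words.
W_conv = rng.normal(scale=0.1, size=(FILT, KERNEL, EMB))

# Minimal LSTM parameters (input/forget/output/candidate gates stacked).
W_x = rng.normal(scale=0.1, size=(4 * HID, FILT))
W_h = rng.normal(scale=0.1, size=(4 * HID, HID))
b = np.zeros(4 * HID)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(sentence):
    """CNN over word embeddings, then an LSTM over the conv features;
    the final hidden state is the sentence representation."""
    x = E[[vocab[w] for w in sentence.split()]]           # (T, EMB)
    # 1-D convolution with 'same' padding, followed by ReLU.
    pad = KERNEL // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    feats = np.stack([
        np.maximum(0.0, np.einsum('fke,ke->f', W_conv, xp[t:t + KERNEL]))
        for t in range(len(x))
    ])                                                     # (T, FILT)
    h, c = np.zeros(HID), np.zeros(HID)
    for f in feats:                                        # LSTM recurrence
        z = W_x @ f + W_h @ h + b
        i, fgt, o, g = np.split(z, 4)
        c = sigmoid(fgt) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    return h

def similarity(s1, s2):
    """Cosine similarity between the two sentence encodings
    (the Siamese setup: both sentences share the same encoder weights)."""
    u, v = encode(s1), encode(s2)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))
```

Because both branches share one encoder (the Siamese property), identical sentences score 1.0 by construction, and any real system would train these weights on STS pairs rather than use random initialization.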


research
12/30/2019

Text Steganalysis with Attentional LSTM-CNN

With the rapid development of Natural Language Processing (NLP) technolo...
research
04/30/2019

Model Comparison for Semantic Grouping

We introduce a probabilistic framework for quantifying the semantic simi...
research
05/24/2023

CSTS: Conditional Semantic Textual Similarity

Semantic textual similarity (STS) has been a cornerstone task in NLP tha...
research
09/24/2021

Rethinking Crowd Sourcing for Semantic Similarity

Estimation of semantic similarity is crucial for a variety of natural la...
research
03/19/2017

Combinatorial Optimization Methods Applied to the Multi-Sentence Compression Problem

The Internet has led to a dramatic increase in the amount of available i...
research
07/23/2023

Transformer-based Joint Source Channel Coding for Textual Semantic Communication

The Space-Air-Ground-Sea integrated network calls for more robust and se...
research
04/24/2023

Topological properties and organizing principles of semantic networks

Interpreting natural language is an increasingly important task in compu...
