Tag Recommendation by Word-Level Tag Sequence Modeling

11/30/2019
by Xuewen Shi, et al.

In this paper, we transform tag recommendation into a word-based text generation problem and introduce a sequence-to-sequence model. The model inherits the advantages of an LSTM-based encoder for sequential modeling and an attention-based decoder with local positional encodings for learning relations globally. Experimental results on Zhihu datasets show that the proposed model outperforms other state-of-the-art text-classification-based methods.
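The paper itself does not include code; as a rough illustration of the decoder side described above, the sketch below shows a single attention step over encoder states augmented with positional encodings. The standard sinusoidal encoding and the single-query dot-product attention used here are stand-in assumptions, not the authors' exact "local positional encoding" scheme or implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sinusoidal_encoding(length, d_model):
    # Standard sinusoidal positional encoding (an assumption here;
    # the paper's exact local encoding scheme is not reproduced).
    pos = np.arange(length)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def attention_decode_step(query, enc_states, d_model):
    # query: (d_model,) current decoder state.
    # enc_states: (T, d_model) encoder outputs (e.g. from an LSTM encoder).
    scores = enc_states @ query / np.sqrt(d_model)  # (T,) alignment scores
    weights = softmax(scores)                       # attention distribution
    context = weights @ enc_states                  # (d_model,) context vector
    return context, weights

# Toy usage: add positional information to encoder states, then attend.
rng = np.random.default_rng(0)
T, d = 6, 8
enc = rng.standard_normal((T, d)) + sinusoidal_encoding(T, d)
context, weights = attention_decode_step(rng.standard_normal(d), enc, d)
```

In a full model this step would run once per generated tag word, with the context vector feeding the decoder's output projection; here it only demonstrates the attention mechanics.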


Related research

03/05/2018 - Query and Output: Generating Words by Querying Distributed Word Representations for Paraphrase Generation
Most recent approaches use the sequence-to-sequence model for paraphrase...

01/30/2023 - SSR-TA: Sequence to Sequence based expert recurrent recommendation for ticket automation
The ticket automation provides crucial support for the normal operation ...

03/05/2018 - Word Embedding Attention Network: Generating Words by Querying Distributed Word Representations for Paraphrase Generation
Most recent approaches use the sequence-to-sequence model for paraphrase...

08/31/2019 - Modeling Graph Structure in Transformer for Better AMR-to-Text Generation
Recent studies on AMR-to-text generation often formalize the task as a s...

02/21/2020 - Guiding Attention in Sequence-to-Sequence Models for Dialogue Act Prediction
The task of predicting dialog acts (DA) based on conversational dialog i...

04/29/2018 - A Tree Search Algorithm for Sequence Labeling
In this paper we propose a novel reinforcement learning based model for ...

04/29/2018 - Sequence Tagging with Policy-Value Networks and Tree Search
In this paper we propose a novel reinforcement learning based model for ...
