Simple Attention-Based Representation Learning for Ranking Short Social Media Posts
This paper explores the problem of ranking short social media posts with respect to user queries using neural networks. Instead of starting with a complex architecture, we proceed from the bottom up and examine the effectiveness of a simple, word-level Siamese architecture augmented with attention-based mechanisms for capturing semantic soft matches between query and post terms. Extensive experiments on datasets from the TREC Microblog Tracks show that our simple models not only demonstrate better effectiveness than existing approaches that are far more complex or exploit a more diverse set of relevance signals, but also achieve a fourfold speedup in model training and inference.
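To make the architecture described above concrete, the sketch below shows one way a word-level Siamese model with attention-based soft matching between query and post terms could be implemented in PyTorch. The class name, embedding dimension, pooling strategy, and scoring head are illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch (assumptions noted above), not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionRanker(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 300):
        super().__init__()
        # Shared (Siamese) word embedding used for both query and post terms.
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Simple linear scoring head over pooled representations (assumption).
        self.score = nn.Linear(2 * embed_dim, 1)

    def forward(self, query_ids: torch.Tensor, post_ids: torch.Tensor) -> torch.Tensor:
        # query_ids: (batch, q_len), post_ids: (batch, p_len)
        q = self.embed(query_ids)                 # (batch, q_len, d)
        p = self.embed(post_ids)                  # (batch, p_len, d)

        # Attention-based soft matching: each query term attends over post terms.
        sim = torch.bmm(q, p.transpose(1, 2))     # (batch, q_len, p_len) similarity
        attn = F.softmax(sim, dim=-1)             # soft alignment weights
        p_aligned = torch.bmm(attn, p)            # (batch, q_len, d) soft-matched post terms

        # Pool query-side and aligned post-side representations.
        q_vec = q.mean(dim=1)                     # (batch, d)
        match_vec = p_aligned.mean(dim=1)         # (batch, d)

        # Relevance score used to rank posts for the query.
        return self.score(torch.cat([q_vec, match_vec], dim=-1)).squeeze(-1)

# Usage: score a batch of (query, post) pairs; higher score means more relevant.
model = SiameseAttentionRanker(vocab_size=50_000)
query = torch.randint(1, 50_000, (4, 6))    # 4 pairs, query length 6
posts = torch.randint(1, 50_000, (4, 20))   # post length 20
scores = model(query, posts)
```

The word-level design keeps the model small, which is consistent with the training and inference speedup reported in the abstract; the actual model in the paper may differ in attention formulation and scoring.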