Modeling Document Interactions for Learning to Rank with Regularized Self-Attention

05/08/2020
by Shuo Sun, et al.

Learning to rank is an important task that has been successfully deployed in many real-world information retrieval systems. Most existing methods compute the relevance score of each document independently, without holistically considering the entire set of competing documents. In this paper, we explore modeling document interactions with self-attention based neural networks. Although self-attention networks have achieved state-of-the-art results in many NLP tasks, we find empirically that self-attention provides little benefit over a baseline neural learning to rank architecture. To improve the learning of self-attention weights, we propose simple yet effective regularization terms designed to model interactions between documents. Evaluations on publicly available Learning to Rank (LETOR) datasets show that training a self-attention network with our proposed regularization terms can significantly outperform existing learning to rank methods.
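The sketch below illustrates the general idea described in the abstract: instead of scoring each document in isolation, all candidate documents for a query attend to one another through self-attention, and a penalty on the attention weights regularizes how documents interact. This is an illustrative reconstruction, not the authors' released code; the class and function names (SelfAttentionRanker, attention_regularizer), the entropy-based penalty, and the listwise softmax loss are assumptions made for the example, since the abstract does not specify the exact regularization terms.

```python
# Hedged sketch of a listwise self-attention ranker in PyTorch.
# The entropy penalty is a stand-in for the paper's regularization terms,
# which are not described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttentionRanker(nn.Module):
    """Scores all candidate documents of a query jointly via self-attention."""

    def __init__(self, feature_dim: int, hidden_dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.input_proj = nn.Linear(feature_dim, hidden_dim)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, doc_features: torch.Tensor):
        # doc_features: (batch, list_size, feature_dim) -- all documents of a query
        x = torch.relu(self.input_proj(doc_features))
        # Self-attention lets every document attend to its competitors.
        ctx, attn_weights = self.attn(x, x, x, need_weights=True)
        scores = self.score(ctx).squeeze(-1)   # (batch, list_size)
        return scores, attn_weights            # attn_weights: (batch, list_size, list_size)

def attention_regularizer(attn_weights: torch.Tensor, strength: float = 0.1):
    # Hypothetical regularizer: penalize near-uniform attention so each document
    # is encouraged to focus on a few informative competitors.
    entropy = -(attn_weights.clamp_min(1e-9).log() * attn_weights).sum(dim=-1).mean()
    return strength * entropy

# Usage sketch with a listwise softmax cross-entropy ranking loss.
model = SelfAttentionRanker(feature_dim=46)    # 46 features per document, as in LETOR 4.0
docs = torch.randn(2, 10, 46)                  # 2 queries, 10 candidate documents each
labels = torch.randint(0, 3, (2, 10)).float()  # graded relevance labels
scores, attn = model(docs)
rank_loss = -(F.softmax(labels, dim=-1) * F.log_softmax(scores, dim=-1)).sum(-1).mean()
loss = rank_loss + attention_regularizer(attn)
loss.backward()
```

The key design point mirrored from the abstract is that the attention weights themselves, not just the scores, carry an extra training signal so that document-to-document interactions are learned rather than left unconstrained.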


Related research:

12/12/2019 | SetRank: Learning a Permutation-Invariant Ranking Model for Information Retrieval
In learning-to-rank for information retrieval, a ranking model is automa...

02/11/2020 | Feature Importance Estimation with Self-Attention Networks
Black-box neural network models are widely used in industry and science,...

05/20/2020 | Context-Aware Learning to Rank with Self-Attention
In learning to rank, one is interested in optimising the global ordering...

10/21/2019 | Self-Attentive Document Interaction Networks for Permutation Equivariant Ranking
How to leverage cross-document interactions to improve ranking performan...

09/15/2022 | Towards self-attention based visual navigation in the real world
Vision guided navigation requires processing complex visual information ...

07/23/2021 | SALADnet: Self-Attentive multisource Localization in the Ambisonics Domain
In this work, we propose a novel self-attention based neural network for...

05/31/2023 | Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation
Recently, a new line of works has emerged to understand and improve self...
