Denoising Attention for Query-aware User Modeling in Personalized Search

08/30/2023
by Elias Bassani, et al.

The personalization of search results has gained increasing attention in the past few years, thanks to the development of Neural Network-based approaches for Information Retrieval and the importance of personalization in many search scenarios. Recent works have proposed building user models at query time by leveraging the Attention mechanism, which makes it possible to weigh the contribution of user-related information with respect to the current query. This approach accounts for the diversity of the user's interests by giving more importance to those related to the search the user is currently performing. In this paper, we first discuss some shortcomings of the standard Attention formulation when employed for personalization. In particular, we focus on issues related to its normalization mechanism and its inability to entirely filter out noisy user-related information. Then, we introduce the Denoising Attention mechanism: an Attention variant that directly tackles the above shortcomings by adopting a robust normalization scheme and introducing a filtering mechanism. The reported experimental evaluation shows the benefits of the proposed approach over other Attention-based variants.
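The standard formulation the abstract critiques can be sketched as follows. This is a minimal illustration of softmax-based attention over a user's interaction history, with toy embeddings and dimensions chosen for the example (none of the values come from the paper). The softmax normalization assigns every history item a strictly positive weight, and the weights must sum to one, so noisy, query-unrelated items can be down-weighted but never fully filtered out; this is the shortcoming that Denoising Attention targets with its alternative normalization and filtering.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: positive weights summing to 1.
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy query embedding and a user's history-item embeddings.
rng = np.random.default_rng(0)
d = 8
query = rng.normal(size=d)
history = rng.normal(size=(5, d))  # 5 past user interactions

# Standard attention: score each history item against the current query.
scores = history @ query
weights = softmax(scores)

# The user model is a weighted average of the history items. Because
# softmax weights are strictly positive, even items unrelated to the
# query contribute a non-zero share to the user model.
user_model = weights @ history
```

Running this sketch, `weights` always sums to one and contains no exact zeros, which illustrates why a normalization scheme that can drive some weights to zero is needed to discard noisy history items entirely.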


Related research

- Personalizing Search Results Using Hierarchical RNN with Query-aware Attention (08/20/2019): Search results personalization has become an effective way to improve th...
- Query-Utterance Attention with Joint modeling for Query-Focused Meeting Summarization (03/08/2023): Query-focused meeting summarization (QFMS) aims to generate summaries fr...
- Query-based Attention CNN for Text Similarity Map (09/15/2017): In this paper, we introduce Query-based Attention CNN(QACNN) for Text Si...
- Attention-based Hierarchical Neural Query Suggestion (05/08/2018): Query suggestions help users of a search engine to refine their queries....
- Compositional Attention: Disentangling Search and Retrieval (10/18/2021): Multi-head, key-value attention is the backbone of the widely successful...
- Attention Mechanism in Neural Networks: Where it Comes and Where it Goes (04/27/2022): A long time ago in the machine learning literature, the idea of incorpor...
- Supervised Domain Enablement Attention for Personalized Domain Classification (12/18/2018): In large-scale domain classification for natural language understanding,...
