Graph Enhanced BERT for Query Understanding

04/03/2022
by Juanhui Li, et al.

Query understanding plays a key role in exploring users' search intents and helping users locate the information they want. However, it is inherently challenging because it must capture semantic information from short, ambiguous queries and often requires massive amounts of task-specific labeled data. In recent years, pre-trained language models (PLMs) have advanced various natural language processing tasks because they can extract general semantic information from large-scale corpora. This creates unprecedented opportunities to adopt PLMs for query understanding. However, there is a gap between the goal of query understanding and existing pre-training strategies: the goal of query understanding is to boost search performance, while existing strategies rarely consider this goal. Thus, directly applying them to query understanding is sub-optimal. On the other hand, search logs record user clicks between queries and URLs, providing rich information about users' search behavior beyond query content. In this paper, we therefore aim to fill this gap by exploiting search logs. In particular, to incorporate search logs into pre-training, we first construct a query graph in which nodes are queries and two queries are connected if they lead to clicks on the same URLs. We then propose a novel graph-enhanced pre-training framework, GE-BERT, which leverages both query content and the query graph. In other words, GE-BERT captures both the semantic information and the search behavioral information of queries. Extensive experiments on various query understanding tasks demonstrate the effectiveness of the proposed framework.
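The central data construction described above is a query co-click graph: queries become nodes, and an edge connects two queries whenever they lead to clicks on the same URL. Below is a minimal Python sketch of that construction, assuming the search log is available as (query, clicked_url) pairs; the record format, example queries, and function name are illustrative assumptions, not details from the paper.

from collections import defaultdict
from itertools import combinations

# Illustrative search-log records as (query, clicked_url) pairs.
# These example entries are made up for demonstration only.
click_log = [
    ("cheap flights to tokyo", "travel.example.com/tokyo"),
    ("tokyo airfare deals", "travel.example.com/tokyo"),
    ("python list comprehension", "docs.python.org/tutorial"),
    ("how to use list comprehensions", "docs.python.org/tutorial"),
]

def build_query_graph(log):
    """Return edges connecting queries that led to clicks on the same URL."""
    # Group queries by the URL they clicked on.
    url_to_queries = defaultdict(set)
    for query, url in log:
        url_to_queries[url].add(query)

    # Every pair of queries sharing a clicked URL gets an edge.
    edges = set()
    for queries in url_to_queries.values():
        for q1, q2 in combinations(sorted(queries), 2):
            edges.add((q1, q2))
    return edges

if __name__ == "__main__":
    for q1, q2 in sorted(build_query_graph(click_log)):
        print(f"{q1!r} <-> {q2!r}")

In practice, such a graph would be built from large-scale logs and then consumed by the graph-enhanced pre-training stage alongside query text, so that the model sees both query content and click-induced query relations.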

Related research

10/08/2022
Short Text Pre-training with Extended Token Classification for E-commerce Query Understanding
E-commerce query understanding is the process of inferring the shopping ...

04/25/2021
AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search
Sponsored search ads appear next to search results when people look for ...

10/06/2020
Incorporating Behavioral Hypotheses for Query Generation
Generative neural networks have been shown effective on query suggestion...

07/29/2019
ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
Recently, pre-trained models have achieved state-of-the-art results in v...

01/22/2021
Query Abandonment Prediction with Recurrent Neural Models of Mouse Cursor Movements
Most successful search queries do not result in a click if the user can ...

04/04/2020
Towards Query Logs for Privacy Studies: On Deriving Search Queries from Questions
Translating verbose information needs into crisp search queries is a phe...

05/22/2023
ConQueR: Contextualized Query Reduction using Search Logs
Query reformulation is a key mechanism to alleviate the linguistic chasm...
