Structure-Aware Language Model Pretraining Improves Dense Retrieval on Structured Data

05/31/2023
by   Xinze Li, et al.

This paper presents the Structure Aware Dense Retrieval (SANTA) model, which encodes user queries and structured data in one universal embedding space for retrieving structured data. SANTA proposes two pretraining methods to make language models structure-aware and to learn effective representations for structured data: 1) Structured Data Alignment, which leverages the natural alignment between structured and unstructured data for structure-aware pretraining. It contrastively trains language models to represent multi-modal text data and teaches them to distinguish the matched structured data for a given unstructured text. 2) Masked Entity Prediction, which designs an entity-oriented mask strategy and asks language models to fill in the masked entities. Our experiments show that SANTA achieves state-of-the-art performance on code search and product search and yields convincing results in the zero-shot setting. SANTA learns tailored representations for multi-modal text data by aligning structured-unstructured data pairs and captures structural semantics by masking and predicting entities in the structured data. All code is available at https://github.com/OpenMatch/OpenMatch.
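
To make the Structured Data Alignment objective concrete, here is a minimal sketch of contrastive training with in-batch negatives over paired unstructured text (e.g., a code description) and structured data (e.g., the code itself). The function name, temperature value, and use of in-batch negatives are illustrative assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of a Structured Data Alignment loss: each unstructured
# text should score its paired structured datum higher than every other
# structured datum in the batch (InfoNCE-style contrastive objective).
import torch
import torch.nn.functional as F

def structured_data_alignment_loss(text_emb: torch.Tensor,
                                   struct_emb: torch.Tensor,
                                   temperature: float = 0.05) -> torch.Tensor:
    # Embeddings are assumed to be (batch, dim) outputs of the same encoder.
    text_emb = F.normalize(text_emb, dim=-1)
    struct_emb = F.normalize(struct_emb, dim=-1)
    # Similarity of every text against every structured datum in the batch.
    logits = text_emb @ struct_emb.T / temperature  # shape (B, B)
    # The matched pair for row i sits on the diagonal (column i).
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)
```

Training the encoder this way pulls matched structured-unstructured pairs together in the shared embedding space while pushing apart the mismatched pairs supplied by the rest of the batch.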
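
The Masked Entity Prediction objective relies on an entity-oriented mask strategy. Below is a minimal sketch of such masking, assuming entities are given as token-index spans; the span format, mask token, and masking probability are assumptions for illustration, not the authors' released code.

```python
# Hypothetical entity-oriented masking: mask whole entity spans (rather than
# random subword tokens) and keep the masked entities as prediction targets.
import random
from typing import List, Tuple

def mask_entities(tokens: List[str],
                  entity_spans: List[Tuple[int, int]],
                  mask_token: str = "<mask>",
                  mask_prob: float = 0.5) -> Tuple[List[str], List[str]]:
    # entity_spans are non-overlapping [start, end) token index ranges.
    masked, targets = [], []
    cursor = 0
    for start, end in sorted(entity_spans):
        masked.extend(tokens[cursor:start])  # copy non-entity tokens as-is
        if random.random() < mask_prob:
            masked.append(mask_token)        # replace the whole entity span
            targets.append(" ".join(tokens[start:end]))
        else:
            masked.extend(tokens[start:end])
        cursor = end
    masked.extend(tokens[cursor:])
    return masked, targets
```

The language model is then trained to fill in the masked entities, which encourages it to capture the structural semantics that entities carry in structured data.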

research · 04/25/2023
Unstructured and structured data: Can we have the best of both worlds with large language models?
This paper presents an opinion on the potential of using large language ...

research · 06/01/2021
Implicit Representations of Meaning in Neural Language Models
Does the effectiveness of neural language models derive entirely from ac...

research · 08/25/2022
A Compact Pretraining Approach for Neural Language Models
Domain adaptation for large neural language models (NLMs) is coupled wit...

research · 04/13/2021
Multi-Step Reasoning Over Unstructured Text with Beam Dense Retrieval
Complex question answering often requires finding a reasoning chain that...

research · 06/07/2022
Unsupervised Context Aware Sentence Representation Pretraining for Multi-lingual Dense Retrieval
Recent research demonstrates the effectiveness of using pretrained langu...

research · 11/03/2020
CMT in TREC-COVID Round 2: Mitigating the Generalization Gaps from Web to Special Domain Search
Neural rankers based on deep pretrained language models (LMs) have been ...

research · 08/06/2023
Embedding-based Retrieval with LLM for Effective Agriculture Information Extracting from Unstructured Data
Pest identification is a crucial aspect of pest control in agriculture. ...
