Understanding BERT Rankers Under Distillation

07/21/2020
by Luyu Gao, et al.

Deep language models such as BERT, pre-trained on large corpora, have given a huge performance boost to state-of-the-art information retrieval ranking systems. The knowledge embedded in such models allows them to pick up complex matching signals between passages and queries. However, their high computation cost during inference limits their deployment in real-world search scenarios. In this paper, we study whether and how the knowledge for search within BERT can be transferred to a smaller ranker through distillation. Our experiments demonstrate that a proper distillation procedure is crucial: it yields up to a nine-fold speedup while preserving state-of-the-art performance.
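The abstract does not spell out the distillation recipe. As a rough illustrative sketch of score-based distillation for a BERT cross-encoder ranker, the snippet below regresses a small student's relevance scores onto a frozen teacher's scores; the specific checkpoints (a bert-base-uncased teacher, a 4-layer BERT miniature student), the MSE objective, and the hyperparameters are assumptions for illustration, not the paper's exact procedure.

# Hypothetical sketch of score-based distillation for a BERT ranker.
# Model names, loss, and hyperparameters are illustrative assumptions.
import torch
from torch.nn.functional import mse_loss
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# Teacher: full-size BERT cross-encoder scoring (query, passage) pairs,
# assumed to be already fine-tuned for ranking.
teacher = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1).eval()
# Student: a smaller BERT variant that learns to mimic the teacher's scores.
student = AutoModelForSequenceClassification.from_pretrained(
    "google/bert_uncased_L-4_H-256_A-4", num_labels=1)

optimizer = torch.optim.AdamW(student.parameters(), lr=3e-5)

def distill_step(queries, passages):
    """One distillation step: regress student scores onto frozen teacher scores."""
    batch = tokenizer(queries, passages, padding=True, truncation=True,
                      max_length=256, return_tensors="pt")
    with torch.no_grad():
        teacher_scores = teacher(**batch).logits.squeeze(-1)
    student_scores = student(**batch).logits.squeeze(-1)
    loss = mse_loss(student_scores, teacher_scores)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

At inference time only the student is kept, which is where the speedup over the full-size BERT ranker would come from.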


