Simplified TinyBERT: Knowledge Distillation for Document Retrieval

09/16/2020
by   Xuanang Chen, et al.

Despite the effectiveness of utilizing BERT for document ranking, the computational cost of such approaches is non-negligible compared to other retrieval methods. To this end, this paper first empirically investigates the application of knowledge distillation models to the document ranking task. In addition, on top of the recent TinyBERT, two simplifications are proposed. Evaluation on the MS MARCO document re-ranking task confirms the effectiveness of the proposed simplifications.
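To give a flavor of distillation-based training for a re-ranker, below is a minimal sketch of a response-based knowledge distillation objective, in which a small student ranker is trained to match both a BERT teacher's relevance scores and the gold labels. The temperature, loss weighting, and binary-relevance setup are illustrative assumptions, not the specific Simplified TinyBERT objectives, which are detailed in the full paper.

```python
# Sketch of response-based knowledge distillation for a query-document
# re-ranker. Temperature, alpha, and the binary-relevance setup are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft loss (match the teacher's relevance distribution)
    with a hard loss (match the gold relevance labels)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example: 8 query-document pairs, binary relevance labels.
student_logits = torch.randn(8, 2, requires_grad=True)  # small student ranker
teacher_logits = torch.randn(8, 2)                       # frozen BERT teacher
labels = torch.randint(0, 2, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```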
