Table Search Using a Deep Contextualized Language Model

05/19/2020
by Zhiyu Chen, et al.

Pretrained contextualized language models such as BERT have achieved impressive results on various natural language processing benchmarks. Benefiting from multiple pretraining tasks and large-scale training corpora, pretrained models can capture complex syntactic word relations. In this paper, we apply the deep contextualized language model BERT to the task of ad hoc table retrieval. We investigate how to encode table content given table structure and BERT's input length limit. We also propose an approach that incorporates features from prior literature on table retrieval and jointly trains them with BERT. In experiments on public datasets, we show that our best approach outperforms the previous state-of-the-art method and BERT baselines by a large margin across different evaluation metrics.
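The key constraint the abstract mentions is BERT's input length limit: BERT accepts at most 512 subword tokens, so a table must be flattened into text and truncated before it can be paired with a query. The sketch below is a minimal illustration of that idea, not the paper's exact encoding; the linearize_table helper and the example query and table are hypothetical, and the Hugging Face transformers library stands in for the encoder.

```python
# Minimal sketch (assumed encoding, not the paper's method): pair a query
# with a linearized table and feed both to BERT within the 512-token limit.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def linearize_table(caption, headers, rows, max_rows=3):
    """Flatten a table into plain text: caption, header row, then a few
    data rows. Capping the number of rows is one simple way to keep the
    sequence within BERT's input length limit."""
    parts = [caption, " ".join(headers)]
    for row in rows[:max_rows]:
        parts.append(" ".join(str(cell) for cell in row))
    return " ".join(parts)

# Hypothetical query-table pair for illustration.
query = "world gdp by country"
table_text = linearize_table(
    caption="List of countries by GDP",
    headers=["Country", "GDP (USD)"],
    rows=[["United States", "21T"], ["China", "14T"]],
)

# BERT's sentence-pair input: [CLS] query [SEP] table [SEP], truncated to 512.
inputs = tokenizer(query, table_text, truncation=True,
                   max_length=512, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The [CLS] vector could then be fed to a scoring head that ranks tables.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```

In a retrieval setting, one would score each candidate table against the query this way and sort by the score; how the paper combines this with hand-crafted table-retrieval features is described in the full text.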

Related research

08/22/2021 · UzBERT: pretraining a BERT model for Uzbek
Pretrained language models based on the Transformer architecture have ac...

04/15/2019 · Contextualized Word Representations for Document Re-Ranking
Although considerable attention has been given to neural ranking archite...

07/17/2022 · Natural language processing for clusterization of genes according to their functions
There are hundreds of methods for analysis of data obtained in mRNA-sequ...

02/04/2019 · A Comprehensive Exploration on WikiSQL with Table-Aware Word Contextualization
WikiSQL is the task of mapping a natural language question to a SQL quer...

02/25/2021 · BERT-based Acronym Disambiguation with Multiple Training Strategies
Acronym disambiguation (AD) task aims to find the correct expansions of ...

07/14/2022 · Language Modelling with Pixels
Language models are defined over a finite set of inputs, which creates a...

01/02/2021 · Superbizarre Is Not Superb: Improving BERT's Interpretations of Complex Words with Derivational Morphology
How does the input segmentation of pretrained language models (PLMs) aff...
