Table2Vec: Neural Word and Entity Embeddings for Table Population and Retrieval

05/31/2019
by Li Deng, et al.

Tables contain valuable knowledge in a structured form. We employ neural language modeling approaches to embed tabular data into vector spaces. Specifically, we consider different table elements, such as captions, column headings, and cells, for training word and entity embeddings. These embeddings are then utilized in three particular table-related tasks: row population, column population, and table retrieval. We incorporate them into existing retrieval models as additional semantic similarity signals. Evaluation results show that table embeddings can significantly improve upon the performance of state-of-the-art baselines.
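As a rough illustration of the idea, the sketch below trains skip-gram embeddings over the entities in table rows and uses them as a semantic similarity signal for row population. The table data is hypothetical toy input and the use of gensim's Word2Vec is an assumption for illustration; the paper trains Table2Vec variants on a large Wikipedia table corpus and combines the embedding-based scores with existing retrieval models rather than using them alone.

```python
# Minimal sketch of the Table2Vec idea (assumptions: gensim >= 4.0, numpy,
# and a hypothetical toy table corpus in place of the Wikipedia tables
# used in the paper).
import numpy as np
from gensim.models import Word2Vec

# Each "sentence" is the sequence of entities appearing in one table row.
rows = [
    ["Norway", "Oslo", "Europe"],
    ["Sweden", "Stockholm", "Europe"],
    ["Japan", "Tokyo", "Asia"],
    ["Norway", "Bergen", "Europe"],
]

# Skip-gram (sg=1) embeddings over entity tokens.
model = Word2Vec(rows, vector_size=50, window=5, min_count=1, sg=1, epochs=50)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def score_candidate(candidate, seed_entities):
    """Average cosine similarity between a candidate entity and the
    seed entities already present in the table."""
    sims = [cosine(model.wv[candidate], model.wv[e])
            for e in seed_entities if e in model.wv and candidate in model.wv]
    return sum(sims) / len(sims) if sims else 0.0

# Row population: rank candidate entities for a table seeded with two entities.
seeds = ["Norway", "Sweden"]
candidates = ["Japan", "Europe", "Oslo"]
for c in sorted(candidates, key=lambda c: score_candidate(c, seeds), reverse=True):
    print(c, round(score_candidate(c, seeds), 3))
```

In this toy setting, entities that co-occur in similar rows end up close in the embedding space, so the ranking reflects how well a candidate fits the seed entities; in the paper this score is one of several signals fed into the retrieval model.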

