TransTab: Learning Transferable Tabular Transformers Across Tables

05/19/2022
by   Zifeng Wang, et al.
Tabular data (tables) are the most widely used data format in machine learning (ML). However, ML models often assume that the table structure remains fixed between training and testing. Before ML modeling, heavy data cleaning is required to merge disparate tables with different columns, and this preprocessing often incurs significant data waste (e.g., removing unmatched columns and samples). How can we learn ML models from multiple tables with partially overlapping columns? How can we incrementally update ML models as more columns become available over time? Can we leverage model pretraining on multiple distinct tables? How can we train an ML model that predicts on an unseen table? To answer these questions, we propose to relax the fixed-table-structure assumption by introducing a Transferable Tabular Transformer (TransTab). TransTab converts each sample (a row in the table) into a generalizable embedding vector and then applies stacked transformers for feature encoding. One methodological insight is to combine column descriptions and table cells as the raw input to a gated transformer model; the other is to introduce supervised and self-supervised pretraining to improve model performance. We compare TransTab with multiple baseline methods on diverse benchmark datasets and five oncology clinical trial datasets. Overall, TransTab ranks 1.00, 1.00, and 1.78 out of 12 methods in supervised learning, feature incremental learning, and transfer learning scenarios, respectively; the proposed pretraining also yields a 2.3% AUC lift on average over supervised learning.
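The core idea of the abstract, serializing column names together with cell values so that rows from tables with different schemas map into one shared token space, can be sketched roughly as below. This is an illustrative sketch only, not the authors' implementation; the function name `row_to_tokens` and the treatment of numeric cells are assumptions for exposition.

```python
# Illustrative sketch (hypothetical, not TransTab's actual code): serialize a
# table row into tokens that pair column names with cell values, so two tables
# with partially overlapping columns share a common input space.

def row_to_tokens(row, categorical, numerical):
    """Serialize one table row into a token sequence.

    categorical / numerical: lists of column names by type. Categorical cells
    are concatenated with their column description as text tokens; numeric
    cells are kept as (column, value) pairs, since a model could scale the
    column-name embedding by the cell value downstream.
    """
    tokens = []
    for col in categorical:
        # Column description + cell text carry schema semantics, which is
        # what makes the representation transferable across tables.
        tokens.extend(f"{col} {row[col]}".lower().split())
    for col in numerical:
        tokens.append((col.lower(), float(row[col])))
    return tokens

row = {"gender": "male", "stage": "II", "age": 63}
tokens = row_to_tokens(row, categorical=["gender", "stage"], numerical=["age"])
# tokens -> ['gender', 'male', 'stage', 'ii', ('age', 63.0)]
```

Because the serialization depends only on column names and values, an unseen table with a different but overlapping schema produces tokens in the same space, which is the property the transfer-learning and feature-incremental scenarios rely on.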

Related research

- XTab: Cross-table Pretraining for Tabular Transformers (05/10/2023)
- UniTabE: Pretraining a Unified Tabular Encoder for Heterogeneous Tabular Data (07/18/2023)
- Benchmarking Multimodal AutoML for Tabular Data with Text Fields (11/04/2021)
- TABBIE: Pretrained Representations of Tabular Data (05/06/2021)
- Compressing Tabular Data via Latent Variable Estimation (02/20/2023)
- HYTREL: Hypergraph-enhanced Tabular Data Representation Learning (07/14/2023)
- Aligning benchmark datasets for table structure recognition (03/01/2023)
