SubTab: Subsetting Features of Tabular Data for Self-Supervised Representation Learning

10/08/2021
by   Talip Ucar, et al.

Self-supervised learning has been shown to be very effective in learning useful representations, yet much of its success has been achieved in data types such as images, audio, and text. That success is mainly enabled by taking advantage of spatial, temporal, or semantic structure in the data through augmentation. However, such structure may not exist in tabular datasets commonly used in fields such as healthcare, making it difficult to design an effective augmentation method and hindering similar progress in the tabular data setting. In this paper, we introduce a new framework, Subsetting features of Tabular data (SubTab), that turns the task of learning from tabular data into a multi-view representation learning problem by dividing the input features into multiple subsets. We argue that reconstructing the data from a subset of its features, rather than from a corrupted version of it, in an autoencoder setting can better capture the underlying latent representation. In this framework, the joint representation can be expressed as the aggregate of the latent variables of the subsets at test time, which we refer to as collaborative inference. Our experiments show that SubTab achieves state-of-the-art (SOTA) performance of 98.31% on MNIST in the tabular setting, on par with CNN-based SOTA models, and surpasses existing baselines on three other real-world datasets by a significant margin.
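To make the two core ideas concrete, here is a minimal sketch of feature subsetting and test-time aggregation. The helper names (`make_subsets`, `collaborative_inference`), the overlap scheme, and mean aggregation are illustrative assumptions, not the paper's exact implementation; the per-subset encoder and reconstruction loss are omitted.

```python
import numpy as np


def make_subsets(x, n_subsets=4, overlap=0.5):
    """Split the feature columns of x (n_samples, n_features) into
    overlapping subsets, each forming one 'view' of the data.
    NOTE: a simplified sketch of SubTab-style subsetting, not the
    authors' exact scheme."""
    n_features = x.shape[1]
    subset_size = n_features // n_subsets
    n_overlap = int(overlap * subset_size)  # columns shared with the next subset
    subsets = []
    for i in range(n_subsets):
        start = i * subset_size
        stop = min(start + subset_size + n_overlap, n_features)
        subsets.append(x[:, start:stop])
    return subsets


def collaborative_inference(latents):
    """Aggregate per-subset latent codes into a joint representation at
    test time; mean aggregation is assumed here for illustration."""
    return np.mean(np.stack(latents, axis=0), axis=0)


# Each subset would be fed to the (shared) encoder; their latent codes
# are then aggregated into the joint representation.
x = np.random.rand(8, 16)               # 8 samples, 16 features
views = make_subsets(x, n_subsets=4)    # 4 overlapping feature subsets
```

In training, each subset is reconstructed back to the full feature vector, which is what distinguishes this from simply corrupting the input.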


research
06/07/2022

Decoupled Self-supervised Learning for Non-Homophilous Graphs

In this paper, we study the problem of conducting self-supervised learni...
research
09/05/2023

Non-Parametric Representation Learning with Kernels

Unsupervised and self-supervised representation learning has become popu...
research
03/28/2021

Representation Learning by Ranking under multiple tasks

In recent years, representation learning has become the research focus o...
research
06/13/2022

Virtual embeddings and self-consistency for self-supervised learning

Self-supervised Learning (SSL) has recently gained much attention due to...
research
06/08/2021

Self-Supervised Learning with Data Augmentations Provably Isolates Content from Style

Self-supervised representation learning has shown remarkable success in ...
research
06/27/2023

Enhancing Representation Learning on High-Dimensional, Small-Size Tabular Data: A Divide and Conquer Method with Ensembled VAEs

Variational Autoencoders and their many variants have displayed impressi...
research
11/25/2017

Predictive Learning: Using Future Representation Learning Variantial Autoencoder for Human Action Prediction

The unsupervised Pretraining method has been widely used in aiding human...
