Rethinking Data Augmentation for Tabular Data in Deep Learning

05/17/2023
by Soma Onishi, et al.

Tabular data is the most widely used data format in machine learning (ML). While tree-based methods still outperform deep-learning (DL) methods in supervised learning, recent literature reports that self-supervised learning with Transformer-based models can surpass tree-based methods. In the existing literature on self-supervised learning for tabular data, contrastive learning is the predominant approach, and data augmentation is essential for generating the different views it contrasts. However, data augmentation for tabular data has been difficult because of the unique structure and high complexity of tabular data. Moreover, existing methods propose three main components together: the model architecture, the self-supervised learning method, and the data augmentation. As a result, previous works have compared performance without considering these components separately, and it is unclear how each component affects actual performance. In this study, we focus on data augmentation to address these issues. We propose a novel data augmentation method, Mask Token Replacement (MTR), which replaces a portion of each tokenized column with the mask token; MTR takes advantage of the properties of the Transformer, which is becoming the predominant DL architecture for tabular data, to perform augmentation directly on each column embedding. Through experiments on 13 diverse public datasets in both supervised and self-supervised learning scenarios, we show that MTR achieves competitive performance against existing data augmentation methods and improves model performance. In addition, we discuss the specific scenarios in which MTR is most effective and identify the scope of its application. The code is available at https://github.com/somaonishi/MTR/.
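To make the idea concrete, below is a minimal PyTorch-style sketch of mask-token replacement applied to column embeddings. The class name, the mask_ratio parameter, and the feature-tokenizer/encoder interface are illustrative assumptions, not the authors' reference implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class MaskTokenReplacement(nn.Module):
    """Sketch of mask-token replacement over column embeddings.

    Hypothetical names and defaults; not the authors' reference code.
    """

    def __init__(self, d_embed: int, mask_ratio: float = 0.15):
        super().__init__()
        # Learned [MASK] embedding shared across all columns.
        self.mask_token = nn.Parameter(torch.empty(d_embed))
        nn.init.normal_(self.mask_token, std=0.02)
        self.mask_ratio = mask_ratio

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_columns, d_embed) produced by a feature tokenizer.
        if not self.training:
            return tokens
        batch, n_cols, _ = tokens.shape
        # Independently decide, per sample and per column, whether to mask.
        replace = torch.rand(batch, n_cols, device=tokens.device) < self.mask_ratio
        return torch.where(
            replace.unsqueeze(-1),              # (batch, n_columns, 1)
            self.mask_token.expand_as(tokens),  # broadcast [MASK] embedding
            tokens,
        )


# Usage sketch (feature_tokenizer and transformer_encoder are placeholders):
# x_tokens = feature_tokenizer(x_num, x_cat)            # (B, n_columns, d)
# x_aug = MaskTokenReplacement(d_embed=x_tokens.size(-1))(x_tokens)
# logits = transformer_encoder(x_aug)
```

Because the replacement acts on the embeddings produced by the tokenizer rather than on raw feature values, the same operation applies uniformly to numerical and categorical columns.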

Related research

- Learnable Model Augmentation Self-Supervised Learning for Sequential Recommendation (04/21/2022). Sequential Recommendation aims to predict the next item based on user be...
- Rethinking the Effect of Data Augmentation in Adversarial Contrastive Learning (03/02/2023). Recent works have shown that self-supervised learning can achieve remark...
- Tied-Augment: Controlling Representation Similarity Improves Data Augmentation (05/22/2023). Data augmentation methods have played an important role in the recent ad...
- Contrastive Learning and Data Augmentation in Traffic Classification Using a Flowpic Input Representation (09/18/2023). Over the last years we witnessed a renewed interest towards Traffic Clas...
- Jointly Learnable Data Augmentations for Self-Supervised GNNs (08/23/2021). Self-supervised Learning (SSL) aims at learning representations of objec...
- Measuring Visual Generalization in Continuous Control from Pixels (10/13/2020). Self-supervised learning and data augmentation have significantly reduce...
- On the Importance of Hyperparameters and Data Augmentation for Self-Supervised Learning (07/16/2022). Self-Supervised Learning (SSL) has become a very active area of Deep Lea...
