The GatedTabTransformer. An enhanced deep learning architecture for tabular modeling

01/01/2022
by Radostin Cholakov, et al.

There is an increasing interest in the application of deep learning architectures to tabular data. One of the state-of-the-art solutions is TabTransformer, which incorporates an attention mechanism to better track relationships between categorical features and then makes use of a standard MLP to output its final logits. In this paper we propose multiple modifications to the original TabTransformer that perform better on binary classification tasks for three separate datasets, with more than 1% AUROC gains. Inspired by gated MLP, linear projections are implemented in the MLP block and multiple activation functions are tested. We also evaluate the importance of specific hyperparameters during training.
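For intuition, the following is a minimal PyTorch sketch of a gMLP-style block with a spatial gating unit, i.e. the kind of gated MLP with linear projections that the abstract describes as a drop-in for TabTransformer's final MLP. This is an illustration only, not the authors' implementation; the class names, layer sizes, and initialization choices here are assumptions.

```python
# Illustrative sketch (not the paper's code) of a gMLP-style block that could
# replace the standard MLP head of TabTransformer. Dimensions are assumptions.
import torch
import torch.nn as nn


class SpatialGatingUnit(nn.Module):
    """Gates half of the channels with a linear projection across tokens."""

    def __init__(self, d_ffn: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_ffn // 2)
        # Linear projection applied across the token (feature) dimension.
        self.spatial_proj = nn.Linear(seq_len, seq_len)
        # Bias init near 1 so the gate starts close to identity (common gMLP trick).
        nn.init.constant_(self.spatial_proj.bias, 1.0)

    def forward(self, x):
        u, v = x.chunk(2, dim=-1)                      # split channels in two
        v = self.norm(v)
        v = self.spatial_proj(v.transpose(1, 2)).transpose(1, 2)
        return u * v                                   # multiplicative gating


class GatedMLPBlock(nn.Module):
    """One gMLP block: channel projection, activation, spatial gating, residual."""

    def __init__(self, d_model: int, d_ffn: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.proj_in = nn.Linear(d_model, d_ffn)
        self.act = nn.GELU()                           # other activations can be swapped in
        self.sgu = SpatialGatingUnit(d_ffn, seq_len)
        self.proj_out = nn.Linear(d_ffn // 2, d_model)

    def forward(self, x):                              # x: (batch, seq_len, d_model)
        residual = x
        x = self.proj_in(self.norm(x))
        x = self.act(x)
        x = self.sgu(x)
        x = self.proj_out(x)
        return x + residual


if __name__ == "__main__":
    # e.g. 8 categorical-feature tokens, each embedded into 32 dimensions
    x = torch.randn(4, 8, 32)
    block = GatedMLPBlock(d_model=32, d_ffn=64, seq_len=8)
    print(block(x).shape)                              # torch.Size([4, 8, 32])
```

In this sketch the gating multiplies one half of the hidden channels by a token-wise linear projection of the other half, which is what distinguishes a gated MLP from the plain feed-forward head in the original TabTransformer.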
