Perturbation of Deep Autoencoder Weights for Model Compression and Classification of Tabular Data

05/17/2022
by Manar Samad, et al.

Fully connected deep neural networks (DNNs) often include redundant weights, leading to overfitting and high memory requirements. Additionally, DNN performance on tabular data classification is often challenged by traditional machine learning models. In this paper, we propose periodic perturbation (prune and regrow) of DNN weights, especially at the self-supervised pretraining stage of deep autoencoders. The proposed weight perturbation strategy outperforms dropout learning in four out of six tabular data sets in downstream classification tasks. L1 or L2 regularization of weights at the same pretraining stage results in inferior classification performance compared to dropout or our weight perturbation routine. Unlike dropout learning, the proposed weight perturbation routine additionally achieves 15% or more weight sparsity, compressing the deep pretrained models. Our experiments reveal that a pretrained deep autoencoder with weight perturbation or dropout can outperform traditional machine learning in tabular data classification when a fully connected DNN fails miserably. However, traditional machine learning models appear superior to any deep model when a tabular data set contains uncorrelated variables. Therefore, the success of deep models can be attributed to the inevitable presence of correlated variables in real-world data sets.
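To make the prune-and-regrow idea concrete, the sketch below applies a magnitude-based prune step followed by random regrowth of a subset of zeroed weights, periodically during reconstruction (self-supervised) pretraining of a small autoencoder. This is a minimal illustration in PyTorch, not the authors' exact routine: the layer sizes, the prune_rate of 0.2, the perturbation interval, and the regrowth scheme are all assumptions made for the example.

import torch
import torch.nn as nn


class Autoencoder(nn.Module):
    """Small fully connected autoencoder for tabular inputs (illustrative sizes)."""
    def __init__(self, n_features: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                     nn.Linear(64, 16), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(16, 64), nn.ReLU(),
                                     nn.Linear(64, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))


@torch.no_grad()
def perturb_weights(model: nn.Module, prune_rate: float = 0.2) -> None:
    """Prune the smallest-magnitude weights of each Linear layer, then
    regrow (re-initialize) a fraction of the zeroed weights."""
    for module in model.modules():
        if not isinstance(module, nn.Linear):
            continue
        w = module.weight
        k = int(prune_rate * w.numel())
        if k == 0:
            continue
        # Prune: zero out the k weights with the smallest magnitude.
        threshold = w.abs().flatten().kthvalue(k).values
        w[w.abs() <= threshold] = 0.0
        # Regrow: re-initialize a random subset of the zeroed weights
        # with small random values so they can be trained again.
        zero_idx = (w == 0).nonzero(as_tuple=False)
        regrow = zero_idx[torch.randperm(len(zero_idx))[:k // 2]]
        w[regrow[:, 0], regrow[:, 1]] = 0.01 * torch.randn(len(regrow))


def pretrain(model, loader, epochs=50, perturb_every=5, lr=1e-3):
    """Reconstruction pretraining with periodic weight perturbation."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for epoch in range(epochs):
        for (x,) in loader:  # loader yields batches of feature vectors
            opt.zero_grad()
            loss_fn(model(x), x).backward()
            opt.step()
        if (epoch + 1) % perturb_every == 0:
            perturb_weights(model, prune_rate=0.2)

Because each perturbation step zeroes more weights than it regrows, the encoder becomes progressively sparser during pretraining, which is where the compression of the pretrained model would come from before the encoder is reused for downstream classification.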


Related research

04/21/2018 - Bridgeout: stochastic bridge regularization for deep neural networks
A major challenge in training deep neural networks is overfitting, i.e. ...

07/09/2021 - Dropout Regularization for Self-Supervised Learning of Transformer Encoder Speech Representation
Predicting the altered acoustic frames is an effective way of self-super...

02/14/2016 - Surprising properties of dropout in deep networks
We analyze dropout in deep networks with rectified linear units and the ...

01/15/2022 - Training Fair Deep Neural Networks by Balancing Influence
Most fair machine learning methods either highly rely on the sensitive i...

07/16/2020 - Learning perturbation sets for robust machine learning
Although much progress has been made towards robust deep learning, a sig...

06/09/2023 - Weight Freezing: A Regularization Approach for Fully Connected Layers with an Application in EEG Classification
In the realm of EEG decoding, enhancing the performance of artificial ne...

05/07/2022 - Impact of L1 Batch Normalization on Analog Noise Resistant Property of Deep Learning Models
Analog hardware has become a popular choice for machine learning on reso...
