Padding Module: Learning the Padding in Deep Neural Networks

01/11/2023
by Fahad Alrasheedi, et al.

Over the last decades, many studies have been dedicated to improving the performance of neural networks, for example, through network architectures, initialization schemes, and activation functions. However, the importance and effects of learnable padding methods in deep learning remain relatively unexplored. To close this gap, this paper proposes a novel trainable Padding Module that can be placed in a deep learning model. The Padding Module can optimize itself without requiring or influencing the model's entire loss function. To train itself, the Padding Module constructs a ground truth and a predictor from its inputs by leveraging the underlying structure in the input data for supervision. As a result, the Padding Module learns automatically to pad pixels to the borders of its input images or feature maps. The padded contents are realistic extensions of the input data and simultaneously facilitate the deep learning model's downstream task. Experiments have shown that the proposed Padding Module outperforms state-of-the-art competitors and baseline methods. For example, the Padding Module achieves 1.23% higher classification accuracy than zero padding when tested on VGG16 and ResNet50.
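The abstract outlines a self-supervised scheme: the module constructs a ground truth and a predictor from its own input and learns to extend borders realistically, independently of the downstream loss. Below is a minimal, hypothetical PyTorch sketch of such a trainable padding layer, assuming a small convolution predicts a one-pixel ring around each feature map and supervision comes from cropping the input's own outer ring; the names (PaddingModule, self_supervised_loss) are illustrative, not the authors' implementation.

```python
# Hypothetical sketch of a trainable padding layer (not the paper's exact design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PaddingModule(nn.Module):
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Predictor: refines a replicated border ring into learned padding values.
        self.predictor = nn.Conv2d(channels, channels, kernel_size,
                                   padding=kernel_size // 2)

    def pad(self, x: torch.Tensor) -> torch.Tensor:
        """Pad x by one pixel on each side with predicted (non-zero) values."""
        padded = F.pad(x, (1, 1, 1, 1), mode="replicate")
        refined = self.predictor(padded)
        # Keep the original interior; only the outer ring comes from the predictor.
        out = refined.clone()
        out[:, :, 1:-1, 1:-1] = x
        return out

    def self_supervised_loss(self, x: torch.Tensor) -> torch.Tensor:
        """Build ground truth from the input itself: crop the outer ring,
        re-pad the interior, and compare the predicted ring to the real one."""
        interior = x[:, :, 1:-1, 1:-1]      # input with its border removed
        reconstructed = self.pad(interior)  # interior plus predicted ring
        # Interior positions match exactly, so only the ring contributes gradient.
        return F.mse_loss(reconstructed, x)

# Usage: the padding loss can be optimized on its own, without touching the
# downstream model's loss function.
if __name__ == "__main__":
    pad_module = PaddingModule(channels=64)
    feature_map = torch.randn(8, 64, 32, 32)
    loss = pad_module.self_supervised_loss(feature_map)
    loss.backward()
    padded = pad_module.pad(feature_map)  # shape: (8, 64, 34, 34)
```

This sketch only illustrates the idea of supervising padding from the input's own structure; the paper's actual predictor and ground-truth construction may differ.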


