Deep Residual Compensation Convolutional Network without Backpropagation

01/27/2023
by Mubarakah Alotaibi, et al.

PCANet and its variants have achieved good accuracy on classification tasks. However, despite the importance of network depth for classification accuracy, these networks have been trained with at most nine layers. In this paper, we introduce a residual compensation convolutional network, the first PCANet-like network trained with hundreds of layers while improving classification accuracy. The proposed network consists of several convolutional layers, each followed by post-processing steps and a classifier. To correct classification errors and substantially increase the network's depth, each layer is trained with new labels derived from the residual information of all preceding layers. This learning mechanism traverses the network's layers in a single forward pass, with no backpropagation or gradient computation. Experiments on four classification benchmarks (MNIST, CIFAR-10, CIFAR-100, and TinyImageNet) show that the deep network outperforms all existing PCANet-like networks and is competitive with several traditional gradient-based models.
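As a rough illustration of the mechanism described in the abstract, the sketch below stacks a few unsupervised layers and fits each layer's classifier to the residual left by all preceding layers, in a single forward pass with no gradient computation. The dense PCA projection, the ReLU nonlinearity, the ridge-regression classifier, and every name, shape, and hyper-parameter here are simplifying assumptions made for illustration only; the paper's actual network uses convolutional PCA-style filters, post-processing steps, and far more layers.

```python
# Minimal sketch of residual-compensation training without backpropagation.
# Everything below (layer transform, classifier, shapes, hyper-parameters)
# is an illustrative assumption, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def pca_transform(X, k):
    """One unsupervised layer: project onto the top-k principal components of X."""
    Xc = X - X.mean(axis=0, keepdims=True)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                       # (d, k) PCA "filters", learned without labels
    return np.maximum(Xc @ W, 0.0)     # simple nonlinearity between layers (assumption)

def fit_ridge(F, T, lam=1e-2):
    """Closed-form ridge regression from features F to (residual) targets T."""
    d = F.shape[1]
    return np.linalg.solve(F.T @ F + lam * np.eye(d), F.T @ T)

# Toy data: 500 samples, 64-dimensional inputs, 10 classes as one-hot targets.
X = rng.normal(size=(500, 64))
y = rng.integers(0, 10, size=500)
Y = np.eye(10)[y]

residual = Y.copy()              # the first layer is trained on the labels themselves
prediction = np.zeros_like(Y)
H = X
for layer in range(5):           # the paper uses hundreds of layers; 5 here for brevity
    H = pca_transform(H, k=32)   # unsupervised feature layer, no gradients
    B = fit_ridge(H, residual)   # per-layer classifier fitted to the current residual
    prediction += H @ B          # accumulate this layer's correction
    residual = Y - prediction    # the next layer learns whatever is still unexplained

acc = (prediction.argmax(axis=1) == y).mean()
print(f"training accuracy after 5 residual-compensation layers: {acc:.2f}")
```

The essential point of the sketch is that `residual = Y - prediction` supplies the training targets for the next layer, so depth can grow layer by layer without ever backpropagating errors.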


