Enhancing ResNet Image Classification Performance by using Parameterized Hypercomplex Multiplication

01/11/2023
by   Nazmul Shahadat, et al.

Recently, many deep networks have introduced hypercomplex and related calculations into their architectures. For convolutional classification networks, these enhancements have been applied to the convolution operations in the frontend to improve accuracy and/or reduce parameter requirements while maintaining accuracy. Although these enhancements have been applied to the convolutional frontend, it has not been studied whether adding hypercomplex calculations improves performance when applied to the densely connected backend. This paper studies ResNet architectures and incorporates parameterized hypercomplex multiplication (PHM) into the backend of residual, quaternion, and vectormap convolutional neural networks to assess the effect. We show that PHM does improve classification accuracy on several image datasets, including small, low-resolution CIFAR 10/100 and large, high-resolution ImageNet and ASL, and can achieve state-of-the-art accuracy for hypercomplex networks.
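The core idea of a PHM layer is to build the weight matrix of a dense layer as a sum of Kronecker products of small learned matrices, which cuts the parameter count roughly by a factor of n (the hypercomplex dimension, e.g. n=4 for quaternions). The sketch below is a minimal NumPy forward pass, not the paper's implementation; all names, shapes, and sizes are illustrative assumptions.

```python
import numpy as np

def phm_linear(x, A, B):
    """Forward pass of a parameterized hypercomplex multiplication (PHM) layer.

    The weight matrix is constructed as W = sum_i kron(A[i], B[i]), so a
    (d_out x d_in) layer needs about 1/n of the parameters of an ordinary
    dense layer for large d_in, d_out.
      A: (n, n, n)                -- small learned "multiplication rule" matrices
      B: (n, d_out/n, d_in/n)    -- learned per-component weight blocks
      x: (batch, d_in)
    """
    # Each kron(A[i], B[i]) has shape (d_out, d_in); summing over i gives W.
    W = sum(np.kron(A[i], B[i]) for i in range(A.shape[0]))
    return x @ W.T

# Tiny usage example with hypothetical sizes (n = 4 mimics quaternions).
rng = np.random.default_rng(0)
n, d_in, d_out = 4, 8, 12
A = rng.standard_normal((n, n, n))
B = rng.standard_normal((n, d_out // n, d_in // n))
x = rng.standard_normal((2, d_in))
y = phm_linear(x, A, B)  # shape (2, 12)
```

In a ResNet backend this would replace the final fully connected classification layer; the frontend convolutions are left unchanged.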


Related research

- 06/15/2023: High-Resolution Convolutional Neural Networks on Homomorphically Encrypted Data via Sharding Ciphertexts. Recently, Deep Convolutional Neural Networks (DCNNs) including the ResNe...
- 05/05/2022: Biologically inspired deep residual networks for computer vision applications. Deep neural network has been ensured as a key technology in the field of...
- 04/23/2019: DenseNet Models for Tiny ImageNet Classification. In this paper, we present two image classification models on the Tiny Im...
- 09/15/2020: ResNet-like Architecture with Low Hardware Requirements. One of the most computationally intensive parts in modern recognition sy...
- 06/17/2021: ShuffleBlock: Shuffle to Regularize Deep Convolutional Neural Networks. Deep neural networks have enormous representational power which leads th...
- 01/11/2023: Deep Axial Hypercomplex Networks. Over the past decade, deep hypercomplex-inspired networks have enhanced ...
- 03/07/2017: Sharing Residual Units Through Collective Tensor Factorization in Deep Neural Networks. Residual units are widely used for alleviating optimization difficulties...
