Deep Axial Hypercomplex Networks

01/11/2023
by   Nazmul Shahadat, et al.

Over the past decade, deep hypercomplex-inspired networks have enhanced feature extraction for image classification by enabling weight sharing across input channels. Recent work improves representational capability with hypercomplex-inspired networks, but at a high computational cost. This paper reduces that cost by factorizing a quaternion 2D convolutional module into two consecutive vectormap 1D convolutional modules. We also use fully connected layers based on 5D parameterized hypercomplex multiplication. Combining both yields our proposed hypercomplex module, a novel building block that can be assembled into deep axial-hypercomplex networks (DANs) for image classification. We conduct experiments on the CIFAR benchmarks, SVHN, and the Tiny ImageNet dataset and achieve better performance with fewer trainable parameters and FLOPs. Our proposed model achieves almost 2% higher accuracy on the Tiny ImageNet dataset with six times fewer parameters than the real-valued ResNets. It also shows state-of-the-art performance on the CIFAR benchmarks in hypercomplex space.
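As a heavily simplified illustration of the two ideas in the abstract, the sketch below shows in plain NumPy (a) why replacing one 2D convolution with two consecutive 1D convolutions reproduces any separable 2D kernel while cutting per-kernel parameters from k² to 2k, and (b) the parameterized hypercomplex multiplication (PHM) weight construction W = Σᵢ kron(Aᵢ, Sᵢ). This is a single-channel, assumption-laden sketch, not the authors' quaternion/vectormap implementation; all function names here are illustrative.

```python
import numpy as np

def conv1d_along_axis(x, w, axis):
    """Naive 'same'-padded 1D correlation of a 2D map along one axis (odd kernel length)."""
    k = len(w)
    pad = k // 2
    x_p = np.pad(x, [(pad, pad) if a == axis else (0, 0) for a in range(x.ndim)])
    out = np.zeros(x.shape, dtype=float)
    for i in range(k):
        # shifted slice of the padded input, weighted by the i-th kernel tap
        out += w[i] * np.take(x_p, range(i, i + x.shape[axis]), axis=axis)
    return out

def axial_conv2d(x, w_h, w_w):
    """Axial factorization: a height-axis 1D pass followed by a width-axis 1D pass."""
    return conv1d_along_axis(conv1d_along_axis(x, w_h, axis=0), w_w, axis=1)

def conv2d_same(x, K):
    """Naive 'same'-padded 2D correlation, used only to check the factorization."""
    kh, kw = K.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros(x.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += K[i, j] * xp[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def phm_weight(A, S):
    """PHM weight W = sum_i kron(A_i, S_i).

    A: (n, n, n) hypercomplex 'rule' matrices; S: (n, d_out//n, d_in//n) shared
    blocks. The result is a (d_out, d_in) weight built from far fewer parameters.
    """
    return sum(np.kron(Ai, Si) for Ai, Si in zip(A, S))
```

For a separable kernel K = outer(w_h, w_w), the two 1D passes reproduce the 2D convolution exactly with 2k instead of k² weights; for n = 5 (the 5D PHM of the paper), the PHM construction stores roughly 1/n of a dense layer's parameters.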


research
11/25/2017

Gradually Updated Neural Networks for Large-Scale Image Recognition

We present a simple yet effective neural network architecture for image ...
research
12/14/2022

Fully complex-valued deep learning model for visual perception

Deep learning models operating in the complex domain are used due to the...
research
02/04/2021

A Deeper Look into Convolutions via Pruning

Convolutional neural networks (CNNs) are able to attain better visual re...
research
10/04/2021

Adding Quaternion Representations to Attention Networks for Classification

This paper introduces a novel modification to axial-attention networks t...
research
01/11/2023

Enhancing ResNet Image Classification Performance by using Parameterized Hypercomplex Multiplication

Recently, many deep networks have introduced hypercomplex and related ca...
research
08/28/2023

Entropy-based Guidance of Deep Neural Networks for Accelerated Convergence and Improved Performance

Neural networks have dramatically increased our capacity to learn from l...
research
11/10/2022

MGiaD: Multigrid in all dimensions. Efficiency and robustness by coarsening in resolution and channel dimensions

Current state-of-the-art deep neural networks for image classification a...
