Adding Quaternion Representations to Attention Networks for Classification

10/04/2021
by Nazmul Shahadat, et al.

This paper introduces a modification to axial-attention networks that improves their image classification accuracy: supplementing the axial-attention modules with quaternion input representations. We chose axial-attention networks because they factor 2D attention operations into two consecutive 1D operations (similar to separable convolution) and are thus less resource intensive than non-axial attention networks. We chose a quaternion encoder because it shares weights across four real-valued input channels, and this weight sharing has been shown to produce more interlinked, interwoven output representations. We hypothesize that an attention module is more effective when given these interlinked representations as input. Our experiments support this hypothesis: classification accuracy improves over standard axial-attention networks, which we attribute to the attention modules receiving richer input representations to work with.
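The following is a minimal sketch (not the authors' implementation) of how the two ideas could be composed: a quaternion convolution that reuses four real-valued weight tensors via the Hamilton product, feeding an attention block that attends along the height axis and then the width axis. The class names `QuaternionConv2d` and `AxialAttention` are hypothetical, and `nn.MultiheadAttention` stands in for the paper's axial-attention modules (which also use positional encodings and residual connections).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuaternionConv2d(nn.Module):
    """Quaternion convolution: four real-valued weight tensors are shared
    across the four input components via the Hamilton product, interlinking
    the output channels."""

    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0):
        super().__init__()
        assert in_channels % 4 == 0 and out_channels % 4 == 0
        c_in, c_out = in_channels // 4, out_channels // 4
        shape = (c_out, c_in, kernel_size, kernel_size)
        self.r = nn.Parameter(torch.randn(shape) * 0.02)
        self.i = nn.Parameter(torch.randn(shape) * 0.02)
        self.j = nn.Parameter(torch.randn(shape) * 0.02)
        self.k = nn.Parameter(torch.randn(shape) * 0.02)
        self.stride, self.padding = stride, padding

    def forward(self, x):
        r, i, j, k = self.r, self.i, self.j, self.k
        # Hamilton-product weight layout: every real weight appears in four
        # positions, so all four input components shape every output channel.
        w = torch.cat([
            torch.cat([r, -i, -j, -k], dim=1),
            torch.cat([i,  r, -k,  j], dim=1),
            torch.cat([j,  k,  r, -i], dim=1),
            torch.cat([k, -j,  i,  r], dim=1),
        ], dim=0)
        return F.conv2d(x, w, stride=self.stride, padding=self.padding)


class AxialAttention(nn.Module):
    """2D self-attention factored into two consecutive 1D passes
    (height, then width), analogous to separable convolution."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.h_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.w_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # x: (batch, channels, height, width)
        b, c, h, w = x.shape
        # Attend along the height axis, one column at a time.
        xh = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        xh, _ = self.h_attn(xh, xh, xh)
        xh = xh.reshape(b, w, h, c).permute(0, 3, 2, 1)
        # Attend along the width axis, one row at a time.
        xw = xh.permute(0, 2, 3, 1).reshape(b * h, w, c)
        xw, _ = self.w_attn(xw, xw, xw)
        return xw.reshape(b, h, w, c).permute(0, 3, 1, 2)


# Hypothetical composition: quaternion stem feeding an axial-attention block.
x = torch.randn(2, 4, 32, 32)              # e.g. RGB plus a fourth channel
stem = QuaternionConv2d(4, 64, kernel_size=3, padding=1)
block = AxialAttention(dim=64, heads=4)
features = block(stem(x))                   # shape: (2, 64, 32, 32)
```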

