Self-attention based BiLSTM-CNN classifier for the prediction of ischemic and non-ischemic cardiomyopathy

07/24/2019
by Kavita Dubey, et al.

Approximately 26 million individuals worldwide suffer from heart failure, according to the global annual report. Despite its high inter-rater variability, endomyocardial biopsy (EMB) is still regarded as the gold standard for assessing heart failure. We therefore propose and implement a new unified architecture, consisting of convolutional layers, a bidirectional LSTM (BiLSTM), and a self-attention mechanism, to predict ischemic and non-ischemic cardiomyopathy from histopathological images. The proposed model relies on self-attention that implicitly focuses on the information output by the hidden layers of the BiLSTM. Our results demonstrate that this framework has high learning capacity and improves classification performance.
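The abstract outlines a pipeline of convolutional feature extraction, a BiLSTM over those features, and self-attention pooling before classification. A minimal sketch of such an architecture in PyTorch is shown below; the layer sizes, the row-wise sequencing of the CNN feature map, and the additive attention formulation are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class BiLSTMAttentionCNN(nn.Module):
    """Sketch of a CNN -> BiLSTM -> self-attention classifier (assumed sizes)."""

    def __init__(self, num_classes=2, hidden=64):
        super().__init__()
        # Convolutional feature extractor over the input image
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Each row of the CNN feature map is treated as one timestep
        self.bilstm = nn.LSTM(input_size=32, hidden_size=hidden,
                              bidirectional=True, batch_first=True)
        # Additive self-attention over the BiLSTM hidden states
        self.attn = nn.Linear(2 * hidden, 1)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                        # x: (B, 3, H, W)
        f = self.cnn(x)                          # (B, 32, H/4, W/4)
        seq = f.mean(dim=3).permute(0, 2, 1)     # (B, H/4, 32): rows as timesteps
        h, _ = self.bilstm(seq)                  # (B, T, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)   # (B, T, 1) attention weights
        ctx = (w * h).sum(dim=1)                 # (B, 2*hidden) weighted context
        return self.fc(ctx)                      # (B, num_classes) class logits


model = BiLSTMAttentionCNN()
logits = model(torch.randn(2, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 2])
```

The attention layer scores every BiLSTM timestep, and the softmax-weighted sum of hidden states forms a single context vector for the classifier, which is the "implicit focus" on the BiLSTM outputs that the abstract describes.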


