A Novel Perspective to Look At Attention: Bi-level Attention-based Explainable Topic Modeling for News Classification

03/14/2022
by   Dairui Liu, et al.

Many recent deep learning-based solutions have widely adopted attention mechanisms across NLP tasks. However, the inherent characteristics of deep learning models and the flexibility of the attention mechanism increase model complexity, which makes the models difficult to explain. To address this challenge, we propose a novel, practical framework that uses a two-tier attention architecture to decouple the explanation from the decision-making process. We apply it to a news article classification task. Experiments on two large-scale news corpora demonstrate that the proposed model achieves performance competitive with many state-of-the-art alternatives and illustrate its appropriateness from an explainability perspective.
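The abstract describes a two-tier attention design: a lower tier attends over words to build intermediate (e.g., topic) representations, and an upper tier attends over those representations to make the classification decision, with the attention weights at each tier serving as the explanation. The following is a minimal NumPy sketch of that general idea, not the authors' exact model; the query vectors `w_word` and `w_topic` and the random inputs are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Additive-style attention pooling (simplified to a dot-product query).

    H: (n, d) matrix of item vectors; w: (d,) learned query vector.
    Returns the attention-weighted sum (d,) and the weights (n,).
    """
    scores = softmax(H @ w)
    return scores @ H, scores

rng = np.random.default_rng(0)
d, n_words, n_topics = 8, 5, 3

# Tier 1: word-level attention within each topic yields topic vectors.
word_vecs = rng.normal(size=(n_topics, n_words, d))  # hypothetical embeddings
w_word = rng.normal(size=d)
pooled = [attention_pool(W, w_word) for W in word_vecs]
topic_vecs = np.stack([v for v, _ in pooled])        # (n_topics, d)
word_attn = [a for _, a in pooled]                   # per-topic word weights

# Tier 2: topic-level attention yields the document vector fed to the classifier.
w_topic = rng.normal(size=d)
doc_vec, topic_attn = attention_pool(topic_vecs, w_topic)

# topic_attn explains which topic drove the prediction; word_attn explains
# each topic through its most heavily weighted words.
```

Because the classifier only sees `doc_vec`, the decision path factors cleanly: topic-level weights explain the prediction, while word-level weights explain the topics, which is one way to realize the decoupling the paper argues for.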

Related research

- 09/19/2023: QXAI: Explainable AI Framework for Quantitative Analysis in Patient Monitoring Systems. Artificial Intelligence techniques can be used to classify a patient's p...
- 02/28/2023: Multi-Layer Attention-Based Explainability via Transformers for Tabular Data. We propose a graph-oriented attention-based explainability method for ta...
- 01/20/2019: Deep Features Analysis with Attention Networks. Deep neural network models have recently draw lots of attention, as it c...
- 05/19/2020: Staying True to Your Word: (How) Can Attention Become Explanation? The attention mechanism has quickly become ubiquitous in NLP. In additio...
- 04/26/2021: Attention vs non-attention for a Shapley-based explanation method. The field of explainable AI has recently seen an explosion in the number...
- 02/17/2019: Attention-Based Prototypical Learning Towards Interpretable, Confident and Robust Deep Neural Networks. We propose a new framework for prototypical learning that bases decision...
- 01/09/2022: λ-Scaled-Attention: A Novel Fast Attention Mechanism for Efficient Modeling of Protein Sequences. Attention-based deep networks have been successfully applied on textual ...
