Self-interpretable Convolutional Neural Networks for Text Classification

05/18/2021
by   Wei Zhao, et al.

Deep learning models for natural language processing (NLP) are inherently complex and often viewed as black boxes. This paper develops an approach for interpreting convolutional neural networks (CNNs) for text classification by exploiting the local linear models inherent in ReLU-DNNs. The CNN combines word embeddings through convolutional layers, filters them using max-pooling, and classifies with a ReLU-DNN. To obtain an overall self-interpretable model, the system of local linear models from the ReLU-DNN is mapped back through the max-pool filter to the appropriate n-grams. Our results on experimental datasets demonstrate that the proposed technique produces parsimonious models that are self-interpretable and perform comparably to a more complex CNN model. We also study how the complexity of the convolutional layers and the classification layers affects model performance.
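The pipeline the abstract describes (embeddings, n-gram convolution, ReLU, max-pooling, linear classification, and attribution back to the winning n-grams) can be sketched as follows. This is a minimal NumPy illustration under assumed toy dimensions, not the authors' code; all names and sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (assumptions, not from the paper)
vocab, emb_dim, n_filters, k, seq_len, n_classes = 50, 8, 4, 3, 10, 2

E = rng.normal(size=(vocab, emb_dim))          # word-embedding table
W = rng.normal(size=(n_filters, k * emb_dim))  # conv filters over k-grams
b = rng.normal(size=n_filters)
V = rng.normal(size=(n_classes, n_filters))    # linear classification head
c = rng.normal(size=n_classes)

def forward(tokens):
    X = E[tokens]                                # (seq_len, emb_dim)
    # convolution: each filter scores every sliding k-gram window
    windows = np.stack([X[i:i + k].ravel()
                        for i in range(len(tokens) - k + 1)])
    conv = windows @ W.T + b                     # (n_windows, n_filters)
    acts = np.maximum(conv, 0.0)                 # ReLU activation
    pooled = acts.max(axis=0)                    # max-pool over positions
    winners = acts.argmax(axis=0)                # which k-gram each filter kept
    return pooled @ V.T + c, winners

tokens = rng.integers(0, vocab, size=seq_len)
logits, winners = forward(tokens)

# Interpretation step: with the ReLU pattern and pooled positions fixed,
# the logits are linear in the embeddings of the winning k-grams, so each
# filter's contribution maps back to a specific n-gram in the input.
print("logits:", logits)
print("k-gram window chosen by each filter:", winners)
```

Because every ReLU region of the classifier is exactly linear, each prediction decomposes into per-filter contributions, and the max-pool argmax ties each contribution to one n-gram; this is the mechanism that makes the model self-interpretable.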

Related research

05/03/2019 · Effectiveness of Self Normalizing Neural Networks for Text Classification
Self Normalizing Neural Networks(SNN) proposed on Feed Forward Neural Ne...

08/30/2019 · Sequential Learning of Convolutional Features for Effective Text Classification
Text classification has been one of the major problems in natural langua...

10/14/2019 · Interpretable Text Classification Using CNN and Max-pooling
Deep neural networks have been widely used in text classification. Howev...

11/16/2020 · Learning Regular Expressions for Interpretable Medical Text Classification Using a Pool-based Simulated Annealing and Word-vector Models
In this paper, we propose a rule-based engine composed of high quality a...

11/08/2020 · Unwrapping The Black Box of Deep ReLU Networks: Interpretability, Diagnostics, and Simplification
The deep neural networks (DNNs) have achieved great success in learning ...

03/11/2022 · Exploiting Low-Rank Tensor-Train Deep Neural Networks Based on Riemannian Gradient Descent With Illustrations of Speech Processing
This work focuses on designing low complexity hybrid tensor networks by ...

05/23/2021 · Precise Approximation of Convolutional Neural Networks for Homomorphically Encrypted Data
Homomorphic encryption is one of the representative solutions to privacy...
