
Distilling a Deep Neural Network into a Takagi-Sugeno-Kang Fuzzy Inference System

by   Xiangming Gu, et al.

Deep neural networks (DNNs) have achieved great success in classification tasks. However, they act as black boxes: it is difficult to see how they reach a particular decision. To this end, we propose to distill the knowledge from a DNN into a fuzzy inference system (FIS), which in this paper is of Takagi-Sugeno-Kang (TSK) type. The resulting model expresses the knowledge acquired by the DNN as fuzzy rules, making a particular decision much easier to explain. Knowledge distillation (KD) is applied to create a TSK-type FIS that generalizes better than one trained directly on the training data, as verified by the experiments in this paper. To further improve performance, we modify the baseline KD method and obtain good results.
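The two ingredients named in the abstract can be sketched concretely. Below is a minimal illustration, not the paper's actual formulation: a zero-order TSK inference step (Gaussian memberships, product T-norm, firing-strength-weighted rule outputs) and the standard soft-target distillation loss of Hinton et al. as a stand-in for the baseline KD method; all function names and parameter shapes are assumptions for this sketch.

```python
import numpy as np

def gauss_mf(x, c, s):
    # Gaussian membership function evaluated elementwise.
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def tsk_infer(x, centers, sigmas, consequents):
    """Zero-order TSK inference for a single input vector x.

    centers, sigmas: (n_rules, n_inputs) antecedent parameters
    consequents:     (n_rules, n_outputs) constant rule outputs
    """
    # Firing strength of each rule = product of memberships (product T-norm).
    w = np.prod(gauss_mf(x, centers, sigmas), axis=1)
    w = w / (w.sum() + 1e-12)          # normalize firing strengths
    # Output = firing-strength-weighted average of rule consequents.
    return w @ consequents

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: cross-entropy between the teacher's and student's
    # temperature-softened distributions (scaled by T^2, as is conventional).
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = -np.mean(np.sum(p_t * np.log(p_s + 1e-12), axis=-1)) * T * T
    # Hard-target term: ordinary cross-entropy with the true labels.
    p = softmax(student_logits)
    hard = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))
    return alpha * soft + (1 - alpha) * hard
```

In a distillation setup of this kind, the TSK antecedent and consequent parameters would be trained to minimize `kd_loss` against the DNN's logits, so each rule remains individually inspectable after training.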




Distilling a Neural Network Into a Soft Decision Tree

Deep neural networks have proved to be a very effective way to perform c...

Explaining Knowledge Distillation by Quantifying the Knowledge

This paper presents a method to interpret the success of knowledge disti...

Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification

Compared to traditional learning from scratch, knowledge distillation so...

Self-supervised Knowledge Distillation Using Singular Value Decomposition

To solve deep neural network (DNN)'s huge training dataset and its high ...

A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks

Deep neural networks (DNNs) often rely on massive labelled data for trai...

Distilling Deep RL Models Into Interpretable Neuro-Fuzzy Systems

Deep Reinforcement Learning uses a deep neural network to encode a polic...

Neural Compatibility Modeling with Attentive Knowledge Distillation

Recently, the booming fashion sector and its huge potential benefits hav...