BearingPGA-Net: A Lightweight and Deployable Bearing Fault Diagnosis Network via Decoupled Knowledge Distillation and FPGA Acceleration

07/31/2023
by   Jing-Xiao Liao, et al.

Deep learning has achieved remarkable success in bearing fault diagnosis. However, this success has come at the cost of larger models and heavier computation, which are difficult to transfer to industrial settings that demand high speed, strong portability, and low power consumption. In this paper, we propose a lightweight and deployable model for bearing fault diagnosis, referred to as BearingPGA-Net, to address these challenges. First, aided by a well-trained large model, we train BearingPGA-Net via decoupled knowledge distillation. Despite its small size, our model demonstrates excellent fault diagnosis performance compared with other lightweight state-of-the-art methods. Second, we design an FPGA acceleration scheme for BearingPGA-Net in Verilog. The scheme applies customized quantization and designs dedicated programmable logic for each layer of BearingPGA-Net on the FPGA, with an emphasis on parallel computing and module reuse to increase computational speed. To the best of our knowledge, this is the first instance of deploying a CNN-based bearing fault diagnosis model on an FPGA. Experimental results show that our deployment achieves over 200 times faster diagnosis than a CPU, with less than a 0.4% drop in F1, Recall, and Precision scores on our independently collected bearing dataset. Our code is available at <https://github.com/asdvfghg/BearingPGA-Net>.
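The abstract names two techniques that lend themselves to short sketches. First, decoupled knowledge distillation (DKD) splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently, so the student can be pushed harder on the "dark knowledge" in the non-target classes. The PyTorch-style sketch below illustrates that decomposition only; the function name, the alpha/beta/temperature values, and the target-masking trick are illustrative assumptions, not the settings used to train BearingPGA-Net.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target,
             alpha=1.0, beta=8.0, temperature=4.0):
    """Decoupled knowledge distillation loss (alpha * TCKD + beta * NCKD).

    Minimal sketch; hyperparameter defaults are illustrative, not the
    paper's reported settings.
    """
    t = temperature
    # Masks for the target class and the non-target classes.
    gt_mask = torch.zeros_like(student_logits).scatter_(1, target.unsqueeze(1), 1).bool()
    other_mask = ~gt_mask

    p_s = F.softmax(student_logits / t, dim=1)
    p_t = F.softmax(teacher_logits / t, dim=1)

    # Target-class KD (TCKD): KL over the binary split [p_target, 1 - p_target].
    p_s_bin = torch.stack([(p_s * gt_mask).sum(1), (p_s * other_mask).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt_mask).sum(1), (p_t * other_mask).sum(1)], dim=1)
    tckd = F.kl_div(torch.log(p_s_bin + 1e-8), p_t_bin, reduction="batchmean") * t ** 2

    # Non-target-class KD (NCKD): KL over the non-target classes only,
    # obtained by pushing the target logit to -inf before the softmax.
    log_p_s_nt = F.log_softmax(student_logits / t - 1000.0 * gt_mask, dim=1)
    p_t_nt = F.softmax(teacher_logits / t - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(log_p_s_nt, p_t_nt, reduction="batchmean") * t ** 2

    return alpha * tckd + beta * nckd
```

In practice the distillation term is typically combined with a standard cross-entropy loss on the ground-truth labels of the lightweight student.

Second, the FPGA deployment relies on customized per-layer quantization. The exact number format is not given in the abstract, so the helper below only sketches generic signed fixed-point quantization with a configurable word and fraction length, as a stand-in for whatever scheme the paper actually uses.

```python
def to_fixed_point(x, total_bits=16, frac_bits=8):
    """Quantize a float tensor to signed fixed point (illustrative format,
    not the paper's exact scheme)."""
    scale = 2 ** frac_bits
    qmin = -(2 ** (total_bits - 1))
    qmax = 2 ** (total_bits - 1) - 1
    q = torch.clamp(torch.round(x * scale), qmin, qmax)
    return q / scale  # dequantized view; the integer q is what the FPGA would store
```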
