Adversarially robust and explainable model compression with on-device personalization for NLP applications

01/10/2021
by   Yao Qiang, et al.
Wayne State University
On-device Deep Neural Networks (DNNs) have recently gained attention due to the increasing computing power of mobile devices and the growing number of applications in Computer Vision (CV), Natural Language Processing (NLP), and the Internet of Things (IoT). Unfortunately, the efficient convolutional neural network (CNN) architectures designed for CV tasks are not directly applicable to NLP tasks, and the existing tiny Recurrent Neural Network (RNN) architectures have been designed primarily for IoT applications. In NLP, although model compression has seen initial success in on-device text classification, at least three major challenges remain to be addressed: adversarial robustness, explainability, and personalization. Here we attempt to tackle these challenges by designing a new training scheme for model compression and adversarial robustness that jointly optimizes an explainable feature mapping objective, a knowledge distillation objective, and an adversarial robustness objective. The resulting compressed model is then personalized via fine-tuning on private on-device training data. We perform extensive experiments comparing our approach with both compact RNN (e.g., FastGRNN) and compressed RNN (e.g., PRADO) architectures in both natural and adversarial NLP test settings.
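The training scheme described above combines three loss terms into one objective. The following is a minimal sketch of such a combination, not the paper's actual formulation: the temperature `T`, the mixing weights `alpha`/`beta`/`gamma`, the linear feature map `W`, and the placeholder adversarial term are all hypothetical illustrations.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence from the student's softened distribution to the teacher's."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def feature_mapping_loss(student_feat, teacher_feat, W):
    """MSE between linearly mapped student features and teacher features.

    W is a hypothetical learned projection aligning the compressed model's
    feature space with the teacher's, supporting explainable comparisons.
    """
    diff = np.asarray(student_feat) @ W - np.asarray(teacher_feat)
    return float(np.mean(diff ** 2))

def combined_objective(l_kd, l_fm, l_adv, alpha=0.5, beta=0.3, gamma=0.2):
    """Weighted sum of distillation, feature-mapping, and adversarial terms.

    l_adv stands in for a loss computed on adversarially perturbed inputs;
    the weights are illustrative, not taken from the paper.
    """
    return alpha * l_kd + beta * l_fm + gamma * l_adv
```

In practice each term would be computed per mini-batch inside the compressed model's training loop, with the adversarial term evaluated on perturbed copies of the batch; the sketch only shows how the three objectives compose.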
