A Mixture of Expert Approach for Low-Cost Customization of Deep Neural Networks

10/31/2018
by Boyu Zhang, et al.

The ability to customize a trained Deep Neural Network (DNN) locally using user-specific data may greatly enhance user experiences, reduce development costs, and protect users' privacy. In this work, we propose a novel Mixture of Experts (MOE) approach to accomplish this goal. The architecture comprises a Global Expert (GE), a Local Expert (LE), and a Gating Network (GN). The GE is a trained DNN developed on a large training dataset representative of many potential users. After deployment on an embedded edge device, the GE will encounter customized, user-specific data (e.g., accented speech) and its performance may suffer. This problem may be alleviated by training a local DNN (the Local Expert, LE) on a small customized training dataset to correct the errors made by the GE. A Gating Network (GN) is then trained to determine whether an incoming input should be handled by the GE or the LE. Since the customized dataset is in general very small, the cost of training the LE and GN is much lower than that of retraining the GE. Training the LE and GN can therefore be performed on the local device, properly protecting the privacy of the customized training data. In this work, we developed a prototype MOE architecture for a handwritten alphanumeric character recognition task, using EMNIST as the generic dataset, LeNet5 as the GE, and the handwriting of 10 users as the customized dataset. We show that with the LE and GN, classification accuracy on the customized dataset is significantly enhanced with almost no degradation of accuracy on the generic dataset. In terms of energy and network size, the overhead of the LE and GN is around 2.5%.
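The routing logic described in the abstract lends itself to a compact sketch. Below is a minimal PyTorch illustration of the GE/LE/GN arrangement; the class and attribute names, the hard 0.5 routing threshold, and the assumption that the GN emits one routing probability per sample are our own illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    """Routes each input to either a frozen Global Expert (GE)
    or a small, locally trained Local Expert (LE)."""

    def __init__(self, global_expert: nn.Module, local_expert: nn.Module,
                 gating_network: nn.Module):
        super().__init__()
        self.ge = global_expert          # pre-trained on the large generic dataset
        self.le = local_expert           # trained on-device on the small customized dataset
        self.gn = gating_network         # predicts, per input, whether the LE should handle it
        for p in self.ge.parameters():   # GE is never retrained; only LE and GN learn locally
            p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Assumption: gn(x) outputs one value per sample in [0, 1], the
        # probability that the input is "customized" and belongs to the LE.
        route_to_le = self.gn(x).view(-1, 1)
        use_le = (route_to_le > 0.5).float()
        # Hard routing: each sample is classified by exactly one expert.
        # (For clarity both experts run here; an edge deployment would
        # evaluate only the selected expert to save energy.)
        return use_le * self.le(x) + (1.0 - use_le) * self.ge(x)
```

Because the GE stays frozen, on-device training touches only the much smaller LE and GN, which is what keeps both the customization cost and the privacy exposure low.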

research · 02/11/2021
Speech enhancement with mixture-of-deep-experts with clean clustering pre-training
In this study we present a mixture of deep experts (MoDE) neural-network...

research · 02/01/2017
Visual Saliency Prediction Using a Mixture of Deep Neural Networks
Visual saliency models have recently begun to incorporate deep learning ...

research · 01/31/2018
Fusarium Damaged Kernels Detection Using Transfer Learning on Deep Neural Network Architecture
The present work shows the application of transfer learning for a pre-tr...

research · 09/12/2017
Small-footprint Keyword Spotting Using Deep Neural Network and Connectionist Temporal Classifier
Mainly for the sake of solving the lack of keyword-specific data, we pro...

research · 11/23/2021
A Customized NoC Architecture to Enable Highly Localized Computing-On-the-Move DNN Dataflow
The ever-increasing computation complexity of fast-growing Deep Neural Ne...

research · 08/09/2018
Training De-Confusion: An Interactive, Network-Supported Visual Analysis System for Resolving Errors in Image Classification Training Data
Convolutional neural networks gain more and more popularity in image cla...

research · 05/18/2020
Modeling extra-deep EM logs using a deep neural network
Modern geosteering is heavily dependent on real-time interpretation of d...
