Deep Mixture of Experts via Shallow Embedding

06/05/2018
by Xin Wang et al.

Larger networks generally have greater representational power at the cost of increased computational complexity. Sparsifying such networks has been an active area of research, but prior work has generally been limited to static regularization or dynamic approaches that rely on reinforcement learning. We explore a mixture of experts (MoE) approach to deep dynamic routing, which activates certain experts in the network on a per-example basis. Our novel DeepMoE architecture increases the representational power of standard convolutional networks by adaptively sparsifying and recalibrating channel-wise features in each convolutional layer. We employ a multi-headed sparse gating network to determine the selection and scaling of channels for each input, leveraging exponentially many combinations of experts within a single convolutional network. Our proposed architecture is evaluated on several benchmark datasets and tasks, and we show that DeepMoEs achieve higher accuracy with lower computation than standard convolutional networks.
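The abstract does not include an implementation, but the gating mechanism it describes can be sketched in code. Below is a minimal PyTorch sketch, assuming a shallow embedding network produces a per-example embedding that each gating head maps to sparse, non-negative channel gates; the class names (SparseGate, GatedConvLayer) and all hyperparameters are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseGate(nn.Module):
    """One head of a multi-headed gating network: maps a shallow embedding
    of the input to non-negative, sparse per-channel gates (hypothetical)."""

    def __init__(self, embed_dim, num_channels):
        super().__init__()
        self.fc = nn.Linear(embed_dim, num_channels)

    def forward(self, embedding):
        # ReLU keeps the gates non-negative and drives many of them to exactly
        # zero, so unselected channel "experts" are skipped for this example.
        return F.relu(self.fc(embedding))


class GatedConvLayer(nn.Module):
    """A convolutional layer whose output channels are selected and scaled
    per example by a sparse gate (a sketch, not the paper's implementation)."""

    def __init__(self, in_ch, out_ch, embed_dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
        self.gate = SparseGate(embed_dim, out_ch)

    def forward(self, x, embedding):
        g = self.gate(embedding)                   # (batch, out_ch), sparse gates
        y = self.conv(x)                           # (batch, out_ch, H, W)
        return y * g.unsqueeze(-1).unsqueeze(-1)   # channel-wise select + rescale


if __name__ == "__main__":
    # A tiny conv + global pooling stands in for the shallow embedding network.
    x = torch.randn(4, 3, 32, 32)
    embed_net = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.AdaptiveAvgPool2d(1), nn.Flatten()
    )
    embedding = embed_net(x)                       # (4, 16) shallow embedding
    layer = GatedConvLayer(in_ch=3, out_ch=32, embed_dim=16)
    out = layer(x, embedding)
    print(out.shape)                               # torch.Size([4, 32, 32, 32])
```

In a full network of this kind, the same shallow embedding would feed one gating head per convolutional layer, so the set of active channels can differ both across layers and across examples.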

Related research

11/19/2015 - Mediated Experts for Deep Convolutional Networks
11/26/2017 - SkipNet: Learning Dynamic Routing in Convolutional Networks
04/10/2019 - Soft Conditional Computation
05/04/2022 - Optimizing Mixture of Experts using Dynamic Recompilations
05/07/2021 - SpeechMoE: Scaling to Large Acoustic Models with Dynamic Routing Mixture of Experts
01/06/2023 - AdaEnsemble: Learning Adaptively Sparse Structured Ensemble Network for Click-Through Rate Prediction
04/22/2019 - Inner-Imaging Convolutional Networks
