Biased Mixtures Of Experts: Enabling Computer Vision Inference Under Data Transfer Limitations

08/21/2020
by   Alhabib Abbas, et al.

We propose a novel mixture-of-experts class to optimize computer vision models in accordance with data transfer limitations at test time. Our approach postulates that the minimum amount of data allowing for highly accurate results can vary across different partitions of the input space. Therefore, we consider mixtures where experts require different amounts of data, and train a sparse gating function to divide the input space among the experts. By appropriate hyperparameter selection, our approach is able to bias mixtures of experts towards selecting specific experts over others. In this way, we show that the data transfer optimization between visual sensing and processing can be solved as a convex optimization problem. To demonstrate the relation between data availability and performance, we evaluate biased mixtures on a range of mainstream computer vision problems, namely: (i) single-shot detection, (ii) image super-resolution, and (iii) real-time video action classification. For all cases, where experts constitute modified baselines meeting different limits on allowed data utilization, biased mixtures significantly outperform previous work optimized to meet the same constraints on available data.
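The biasing mechanism described above lends itself to a compact illustration. Below is a minimal PyTorch sketch of a sparse gate whose routing is penalized by each expert's data cost; the class name BiasedMoEGate, the expert_costs argument, and the bias_strength hyperparameter are illustrative assumptions, not the paper's actual interface, and the paper's convex formulation of the transfer budget is not reproduced here.

```python
import torch
import torch.nn as nn


class BiasedMoEGate(nn.Module):
    """Minimal sketch of a sparse gate biased toward low-cost experts.

    Hypothetical interface: `expert_costs` holds a per-expert data cost
    (e.g. bytes transferred from the sensor per input), and
    `bias_strength` is the hyperparameter that trades accuracy against
    data transfer, loosely mirroring the biasing knob the abstract
    describes.
    """

    def __init__(self, in_features, num_experts, expert_costs, bias_strength=1.0):
        super().__init__()
        self.gate = nn.Linear(in_features, num_experts)
        # Fixed penalty subtracted from each expert's logit: as
        # bias_strength grows, higher-cost experts are selected less often.
        self.register_buffer(
            "cost_bias",
            bias_strength * torch.as_tensor(expert_costs, dtype=torch.float32),
        )

    def forward(self, features):
        # Bias the routing logits by each expert's data cost.
        logits = self.gate(features) - self.cost_bias
        probs = torch.softmax(logits, dim=-1)
        # Sparse top-1 routing: each input is sent to a single expert,
        # which determines how much data is fetched for that input.
        expert_idx = probs.argmax(dim=-1)
        return expert_idx, probs


# Usage: three experts with increasing data budgets (units arbitrary).
gate = BiasedMoEGate(
    in_features=128, num_experts=3,
    expert_costs=[1.0, 4.0, 16.0], bias_strength=0.5,
)
idx, probs = gate(torch.randn(8, 128))  # idx[i] picks input i's expert
```

Subtracting a fixed cost penalty from the gating logits is one simple way to realize the bias: the gate still learns input-dependent routing, while raising bias_strength shifts traffic toward cheaper experts and so reduces expected data transfer at some cost in accuracy.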


