PLUME: Polyhedral Learning Using Mixture of Experts

04/22/2019
by Kulin Shah, et al.

In this paper, we propose a novel mixture-of-experts architecture for learning polyhedral classifiers. We learn the parameters of the classifier using an expectation-maximization algorithm, and we derive generalization bounds for the proposed approach. Through an extensive simulation study, we show that the proposed method performs comparably to other state-of-the-art approaches.
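For intuition, a polyhedral classifier labels a point positive exactly when it lies inside an intersection of halfspaces, i.e. when every linear "expert" scores it positively. The sketch below is a minimal hard-assignment (hard-EM) illustration of this idea, not the paper's exact algorithm (which uses a soft expectation step and a gating network); the function names, the single-gradient-step M-step, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def predict(X, W, b):
    # Polyhedral classifier: x is positive iff it lies inside the
    # intersection of K halfspaces, i.e. min_k (w_k . x + b_k) > 0.
    return (X @ W.T + b).min(axis=1) > 0

def fit_polyhedral(X, y, K=3, iters=20, lr=0.1, seed=0):
    # Hard-EM sketch (illustrative, not the paper's algorithm):
    # E-step assigns each negative point to the hyperplane that
    # currently rejects it most strongly; M-step takes one logistic-
    # regression gradient step per hyperplane on the positives plus
    # that hyperplane's assigned negatives.
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(K, X.shape[1]))
    b = np.zeros(K)
    pos, neg = X[y == 1], X[y == 0]
    for _ in range(iters):
        # E-step: hyperplane with the smallest (most negative) score
        # takes responsibility for each negative example.
        assign = (neg @ W.T + b).argmin(axis=1)
        for k in range(K):
            Xk = np.vstack([pos, neg[assign == k]])
            yk = np.concatenate(
                [np.ones(len(pos)), np.zeros((assign == k).sum())]
            )
            # M-step: one gradient step on the mean logistic loss.
            s = 1.0 / (1.0 + np.exp(-(Xk @ W[k] + b[k])))
            W[k] -= lr * ((s - yk) @ Xk) / len(yk)
            b[k] -= lr * (s - yk).mean()
    return W, b
```

With K = 1 the procedure reduces to plain logistic regression; larger K lets the learned halfspaces carve out a polyhedral positive region, with the hard assignment playing the role of the gating network's expectation step.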


