Functional mixture-of-experts for classification

02/28/2022
by Nhat-Thien Pham et al.

We develop a mixtures-of-experts (ME) approach to multiclass classification in which the predictors are univariate functions. The model consists of a ME architecture whose gating network and expert networks are both built on multinomial logistic activation functions with functional inputs. We perform regularized maximum likelihood estimation in which the coefficient functions are subject to interpretable sparsity constraints on targeted derivatives, and we develop an EM-Lasso-like algorithm to compute the regularized MLE. We evaluate the proposed approach on simulated and real data.
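As a rough illustration of the model structure described above (not the authors' implementation), the sketch below projects functional predictors onto a fixed basis and combines a softmax gating network with multinomial-logistic experts to produce mixture class probabilities, then evaluates a penalized log-likelihood. The Fourier basis, the random parameter values, the penalty strength lam, and the plain L1 penalty (standing in for the paper's derivative-targeted sparsity penalty) are all assumptions made for the example.

# Minimal numpy sketch of a functional mixture-of-experts classifier.
# Assumptions (not from the paper): Fourier basis projection of the curves,
# random parameters in place of fitted ones, and a plain L1 penalty in place
# of the paper's derivative-based sparsity penalty.
import numpy as np

rng = np.random.default_rng(0)

n, T = 50, 100          # number of curves, grid points per curve
K, G = 3, 2             # number of classes, number of experts
p = 7                   # number of basis functions (odd: constant + sin/cos pairs)

t = np.linspace(0.0, 1.0, T)

# Fourier design matrix Phi (T x p): constant column plus sine/cosine pairs.
Phi = np.ones((T, p))
for j in range(1, (p - 1) // 2 + 1):
    Phi[:, 2 * j - 1] = np.sin(2 * np.pi * j * t)
    Phi[:, 2 * j] = np.cos(2 * np.pi * j * t)

# Synthetic functional predictors X (n x T), summarized by basis scores
# B (n x p) obtained by least-squares projection of each curve onto Phi.
X = rng.normal(size=(n, T))
B, *_ = np.linalg.lstsq(Phi, X.T, rcond=None)
B = B.T

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Gating coefficients W (G x p) and expert coefficients V (G x K x p);
# random here, where the paper would estimate them by penalized MLE.
W = rng.normal(size=(G, p))
V = rng.normal(size=(G, K, p))

gates = softmax(B @ W.T, axis=1)                           # n x G: P(expert g | x_i)
experts = softmax(np.einsum('np,gkp->ngk', B, V), axis=2)  # n x G x K: P(y=k | x_i, g)

# Mixture class probabilities: P(y=k | x) = sum_g gate_g(x) * expert_gk(x).
probs = np.einsum('ng,ngk->nk', gates, experts)

y = rng.integers(0, K, size=n)   # fake labels, for illustration only
lam = 0.1                        # penalty strength (assumed value)
loglik = np.log(probs[np.arange(n), y]).sum()
penalty = lam * (np.abs(W).sum() + np.abs(V).sum())
print("penalized negative log-likelihood:", -(loglik - penalty))

An EM-Lasso-style fit, as in the paper, would alternate responsibilities for the experts (E-step) with penalized multinomial-logistic updates of W and V (M-step); the sketch only evaluates the objective at fixed parameters.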

Related research

09/12/2019: Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
Mixtures-of-Experts models and their maximum likelihood estimation (MLE)...

02/04/2022: Functional Mixtures-of-Experts
We consider the statistical analysis of heterogeneous data for clusterin...

10/29/2018: Regularized Maximum Likelihood Estimation and Feature Selection in Mixtures-of-Experts Models
Mixture of Experts (MoE) are successful models for modeling heterogeneou...

07/14/2019: Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models
Mixtures-of-Experts (MoE) are conditional mixture models that have shown...

03/19/2023: Mixture of segmentation for heterogeneous functional data
In this paper we consider functional data with heterogeneity in time and...

12/04/2020: Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models
Mixture of experts (MoE) models are widely applied for conditional proba...
