Regularized Maximum Likelihood Estimation and Feature Selection in Mixtures-of-Experts Models

10/29/2018
by Faicel Chamroukhi, et al.

Mixtures-of-Experts (MoE) are successful models for heterogeneous data in many statistical learning problems, including regression, clustering, and classification. They are generally fitted by maximum likelihood estimation via the well-known EM algorithm, yet their application to high-dimensional problems remains challenging. We consider the problem of fitting and feature selection in MoE models, and propose a regularized maximum likelihood estimation approach that encourages sparse solutions for heterogeneous regression models with potentially high-dimensional predictors. Unlike state-of-the-art regularized MLE for MoE, the proposed approach does not require an approximation of the penalty function. We develop two hybrid EM algorithms: an Expectation-Majorization-Maximization (EM/MM) algorithm, and an EM algorithm with a coordinate-ascent M-step. Both algorithms yield sparse solutions automatically, without thresholding, and avoid matrix inversion by using univariate parameter updates. An experimental study shows the good performance of the algorithms in terms of recovering the actual sparse solutions, parameter estimation, and clustering of heterogeneous regression data.
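To make the algorithmic idea concrete, below is a minimal sketch, assuming a Lasso (ℓ_1) penalty on the expert coefficients; it is not the authors' reference implementation, and all names (fit_moe_lasso, lam, n_em_iter, n_cd_iter) are illustrative. It shows how a coordinate-ascent M-step with soft-thresholding produces exact zeros without post-hoc thresholding and uses only univariate updates, so no matrix inversion is needed. The gating network is simplified to constant mixing weights to keep the example short.

```python
# Hedged sketch of a penalized EM for a K-expert Gaussian regression mixture,
# NOT the paper's reference code; penalty scaling is one plausible choice.
import numpy as np

def soft_threshold(z, t):
    """Closed-form univariate Lasso update: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fit_moe_lasso(X, y, K=2, lam=0.1, n_em_iter=50, n_cd_iter=10, seed=0):
    """Penalized EM: the M-step runs coordinate ascent with soft-thresholding,
    so zeros in the coefficients appear exactly and every update is scalar."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = rng.normal(scale=0.1, size=(K, p))  # expert regression coefficients
    sigma2 = np.ones(K)                        # expert noise variances
    pi = np.full(K, 1.0 / K)                   # mixing weights (simplified gating)
    for _ in range(n_em_iter):
        # E-step: posterior responsibility of each expert for each observation.
        dens = np.empty((n, K))
        for k in range(K):
            resid = y - X @ beta[k]
            dens[:, k] = pi[k] * np.exp(-0.5 * resid**2 / sigma2[k]) \
                         / np.sqrt(2.0 * np.pi * sigma2[k])
        tau = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: coordinate ascent on the penalized expected log-likelihood.
        for k in range(K):
            w = tau[:, k]
            for _ in range(n_cd_iter):
                for j in range(p):
                    # Partial residual with coordinate j removed.
                    r = y - X @ beta[k] + X[:, j] * beta[k, j]
                    num = np.sum(w * X[:, j] * r)
                    den = np.sum(w * X[:, j] ** 2) + 1e-12
                    beta[k, j] = soft_threshold(num, lam * sigma2[k]) / den
            resid = y - X @ beta[k]
            sigma2[k] = np.sum(w * resid**2) / (w.sum() + 1e-12)
            pi[k] = w.mean()
    return beta, sigma2, pi

if __name__ == "__main__":
    # Synthetic heterogeneous data: two regimes with different sparse supports.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    b1 = np.r_[2.0, -2.0, np.zeros(8)]
    b2 = np.r_[0.0, 0.0, 3.0, np.zeros(7)]
    z = rng.random(200) < 0.5
    y = np.where(z, X @ b1, X @ b2) + 0.1 * rng.normal(size=200)
    beta, sigma2, pi = fit_moe_lasso(X, y, K=2, lam=5.0)
    print(np.round(beta, 2))  # most entries should be exactly zero
```

Because each coordinate update is the scalar soft-thresholding solution of a weighted univariate Lasso problem, sparsity falls out of the optimization itself, which mirrors the "sparse solutions without thresholding" and "univariate parameter updates" points in the abstract.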

research 07/14/2019
Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models
Mixtures-of-Experts (MoE) are conditional mixture models that have shown...

research 09/12/2019
Regularized Estimation and Feature Selection in Mixtures of Gaussian-Gated Experts Models
Mixtures-of-Experts models and their maximum likelihood estimation (MLE)...

research 03/27/2019
Stable prediction with radiomics data
Motivation: Radiomics refers to the high-throughput mining of quantitati...

research 03/24/2021
Statistical Integration of Heterogeneous Data with PO2PLS
The availability of multi-omics data has revolutionized the life science...

research 02/28/2022
Functional mixture-of-experts for classification
We develop a mixtures-of-experts (ME) approach to the multiclass classif...

research 08/05/2015
A MAP approach for ℓ_q-norm regularized sparse parameter estimation using the EM algorithm
In this paper, Bayesian parameter estimation through the consideration o...

research 10/14/2009
L_0 regularized estimation for nonlinear models that have sparse underlying linear structures
We study the estimation of β for the nonlinear model y = f(Xβ) + ϵ when ...
