Randomized Mixture Models for Probability Density Approximation and Estimation

04/23/2018
by Hien D. Nguyen, et al.

Randomized neural networks (NNs) are an interesting alternative to the conventional NNs that are more commonly used for data modeling. The random vector functional-link (RVFL) network is an established and theoretically well-grounded randomized learning model. A key theoretical result for RVFL networks is that they provide universal approximation of continuous maps, on average, almost surely. We specialize and modify this result, showing that RVFL networks can provide functional approximations that converge in Kullback-Leibler divergence when the target function is a probability density function. Building on the approximation results, we demonstrate that RVFL networks lead to a simple randomized mixture model (MM) construction for density estimation from random data. An expectation-maximization (EM) algorithm is derived for the maximum likelihood estimation of our randomized MM. The EM algorithm is proved to be globally convergent, and the maximum likelihood estimator is proved to be consistent. A set of simulation studies provides empirical evidence for our approximation and density estimation results.
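To make the randomized-MM idea concrete, here is a minimal sketch (not the paper's exact construction) of the general recipe the abstract describes: draw the component locations at random and freeze them, in the spirit of RVFL networks, then use EM to fit only the mixing weights by maximum likelihood. The component count `K`, the Gaussian kernel, and the fixed bandwidth `sigma` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sample from an "unknown" target density (a bimodal Gaussian mixture).
data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(1, 1.0, 700)])

# Randomized mixture: component means are drawn at random and never
# re-estimated (the RVFL-style randomization); only mixing weights are learned.
K = 20
means = rng.uniform(data.min(), data.max(), K)   # random, then frozen
sigma = 0.5                                      # fixed bandwidth (assumption)

# Gaussian component densities at each data point: shape (n, K).
dens = np.exp(-0.5 * ((data[:, None] - means[None, :]) / sigma) ** 2)
dens /= sigma * np.sqrt(2 * np.pi)

# EM over the mixing weights only; the M-step is a closed-form average
# of the E-step responsibilities, so each iteration is a single pass.
weights = np.full(K, 1.0 / K)
for _ in range(200):
    resp = dens * weights                         # E-step: unnormalized responsibilities
    resp /= resp.sum(axis=1, keepdims=True)       # normalize per observation
    weights = resp.mean(axis=0)                   # M-step: update mixing weights

def pdf(x):
    """Estimated density at a query point x."""
    d = np.exp(-0.5 * ((x - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(d @ weights)
```

Because the means are fixed, the log-likelihood is concave in the weights, so this restricted EM avoids the local-optima issues of fully free mixture fitting; that simplicity is the practical appeal of the randomized construction.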


