Convergence Rates for Gaussian Mixtures of Experts

07/09/2019
by Nhat Ho, et al.

We provide a theoretical treatment of over-specified Gaussian mixtures of experts with covariate-free gating networks. We establish the convergence rates of the maximum likelihood estimation (MLE) for these models. Our proof technique is based on a novel notion of algebraic independence of the expert functions. Drawing on optimal transport theory, we establish a connection between the algebraic independence and a certain class of partial differential equations (PDEs). Exploiting this connection allows us to derive convergence rates and minimax lower bounds for parameter estimation.
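To make the setting concrete, the sketch below simulates data from a toy mixture of linear experts with gating weights that do not depend on the covariate (the covariate-free gating described above) and fits an over-specified model (more components than the truth) with a plain EM loop. This is an illustration of the model class only, not the paper's proof technique; all constants, the linear expert form, and the EM implementation are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance (our choice, not the paper's): a true k0 = 2
# mixture of linear experts y = a_i * x + b_i + noise, where the gating
# weights are constants that do not depend on x (covariate-free gating).
n = 500
x = rng.uniform(-1.0, 1.0, n)
z = rng.random(n) < 0.6                        # latent expert labels
y = np.where(z, 2.0 * x + 1.0, -x) + 0.3 * rng.standard_normal(n)

def expert_logjoint(pi, a, b, sigma):
    """n x k matrix of log pi_i + log N(y | a_i x + b_i, sigma_i^2)."""
    mu = np.outer(x, a) + b
    logphi = (-0.5 * np.log(2.0 * np.pi * sigma**2)
              - (y[:, None] - mu) ** 2 / (2.0 * sigma**2))
    return np.log(pi) + logphi

# Over-specified fit: k = 3 components for a k0 = 2 truth.
k = 3
pi = np.full(k, 1.0 / k)
a = rng.standard_normal(k)
b = rng.standard_normal(k)
sigma = np.ones(k)

loglik_trace = []
for _ in range(50):
    # E-step: posterior responsibility of each expert for each point.
    lw = expert_logjoint(pi, a, b, sigma)
    norm = np.logaddexp.reduce(lw, axis=1)
    loglik_trace.append(norm.sum())
    r = np.exp(lw - norm[:, None])             # n x k responsibilities

    # M-step: update gating weights, then a weighted least-squares
    # fit and a weighted variance per expert.
    pi = r.mean(axis=0)
    X = np.column_stack([x, np.ones(n)])
    for i in range(k):
        w = r[:, i]
        # Small ridge term: a purely numerical safeguard against a
        # component's responsibilities collapsing to zero.
        A = X.T @ (w[:, None] * X) + 1e-8 * np.eye(2)
        beta = np.linalg.solve(A, X.T @ (w * y))
        a[i], b[i] = beta
        resid = y - X @ beta
        sigma[i] = max(np.sqrt((w * resid**2).sum() / w.sum()), 1e-6)
```

The log-likelihood trace is non-decreasing along EM iterations (up to the numerical safeguards); the over-specification shows up as redundant fitted components whose parameters are only weakly identified, which is exactly the regime in which the convergence rates of the abstract apply.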


Related research

09/09/2016 · Singularity structures and impacts on parameter estimation in finite mixtures of distributions
Singularities of a statistical model are the elements of the model's par...

10/10/2011 · Convergence Rates for Mixture-of-Experts
In mixtures-of-experts (ME) model, where a number of submodels (experts)...

06/01/2020 · Uniform Convergence Rates for Maximum Likelihood Estimation under Two-Component Gaussian Mixture Models
We derive uniform convergence rates for the maximum likelihood estimator...

05/29/2023 · Optimal approximation of infinite-dimensional holomorphic functions
Over the last decade, approximating functions in infinite dimensions fro...

07/07/2021 · Probabilistic partition of unity networks: clustering based deep approximation
Partition of unity networks (POU-Nets) have been shown capable of realiz...

04/11/2022 · Local convergence rates of the least squares estimator with applications to transfer learning
Convergence properties of empirical risk minimizers can be conveniently ...

09/03/2023 · Distribution learning via neural differential equations: a nonparametric statistical perspective
Ordinary differential equations (ODEs), via their induced flow maps, pro...
