Convergence Rates for Gaussian Mixtures of Experts

07/09/2019
by Nhat Ho, et al.

We provide a theoretical treatment of over-specified Gaussian mixtures of experts with covariate-free gating networks. We establish the convergence rates of the maximum likelihood estimation (MLE) for these models. Our proof technique is based on a novel notion of algebraic independence of the expert functions. Drawing on optimal transport theory, we establish a connection between the algebraic independence and a certain class of partial differential equations (PDEs). Exploiting this connection allows us to derive convergence rates and minimax lower bounds for parameter estimation.
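
To make the model class concrete, below is a minimal, self-contained sketch (not code from the paper; all names and values are illustrative) of the setting the abstract describes: the gating weights pi_i are covariate-free constants rather than functions of the input x, each expert is a Gaussian linear regression, and the fitted model is over-specified, using k = 3 experts for data generated by 2. EM is a standard way to compute the MLE numerically in this setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n points from a 2-expert Gaussian mixture of experts whose
# gating weights are covariate-free constants (0.4, 0.6), not functions of x.
n = 2000
x = rng.uniform(-2.0, 2.0, size=n)
z = rng.choice(2, size=n, p=[0.4, 0.6])          # latent expert assignments
true_ab = np.array([[1.5, 0.0], [-1.0, 0.5]])    # (slope, intercept) per expert
y = true_ab[z, 0] * x + true_ab[z, 1] + 0.3 * rng.standard_normal(n)

# Fit an over-specified model with k = 3 > 2 experts; EM monotonically
# increases the likelihood, so its fixed point approximates the MLE.
k = 3
pi = np.full(k, 1.0 / k)
a, b = rng.standard_normal(k), rng.standard_normal(k)
sig2 = np.ones(k)

for _ in range(300):
    # E-step: posterior responsibility of each expert for each point.
    mu = a * x[:, None] + b                      # (n, k) expert means
    logp = np.log(pi) - 0.5 * (np.log(2 * np.pi * sig2)
                               + (y[:, None] - mu) ** 2 / sig2)
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)

    # M-step: gating weights, then weighted least squares per expert.
    nk = r.sum(axis=0) + 1e-12                   # guard against empty experts
    pi = nk / n
    for i in range(k):
        w = np.sqrt(r[:, i])
        X = np.column_stack([x, np.ones(n)])
        (a[i], b[i]), *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)
        sig2[i] = r[:, i] @ (y - a[i] * x - b[i]) ** 2 / nk[i]

print("fitted gating weights:", np.round(pi, 3))
print("fitted slopes:", np.round(a, 3))
print("fitted intercepts:", np.round(b, 3))
```

In runs like this, the redundant third expert tends to duplicate a true component or shrink in weight; that over-specified regime is exactly where the paper's convergence rates and minimax lower bounds apply.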

Related Research

09/09/2016
Singularity structures and impacts on parameter estimation in finite mixtures of distributions
Singularities of a statistical model are the elements of the model's par...

10/10/2011
Convergence Rates for Mixture-of-Experts
In the mixture-of-experts (ME) model, where a number of submodels (experts)...

06/01/2020
Uniform Convergence Rates for Maximum Likelihood Estimation under Two-Component Gaussian Mixture Models
We derive uniform convergence rates for the maximum likelihood estimator...

05/29/2023
Optimal approximation of infinite-dimensional holomorphic functions
Over the last decade, approximating functions in infinite dimensions fro...

02/28/2019
Learning rates for Gaussian mixtures under group invariance
We study the pointwise maximum likelihood estimation rates for a class o...

07/07/2021
Probabilistic partition of unity networks: clustering based deep approximation
Partition of unity networks (POU-Nets) have been shown capable of realiz...