
Mixture of Experts Distributional Regression: Implementation Using Robust Estimation with Adaptive First-order Methods

by David Rügamer, et al.
WU (Vienna University of Economics and Business)
Universität München

In this work, we propose an efficient implementation of mixture of experts distributional regression models that achieves robust estimation via stochastic first-order optimization techniques with adaptive learning rate schedulers. We take advantage of the flexibility and scalability of neural network software and implement the proposed framework in mixdistreg, an R software package that allows for the definition of mixtures of many different distribution families, estimation in high-dimensional and large-sample settings, and robust optimization based on TensorFlow. Numerical experiments with simulated and real-world data show that our optimization is as reliable as estimation via classical approaches in many different settings, and that it yields results in complicated scenarios where classical approaches consistently fail.
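To illustrate the general strategy the abstract describes — fitting a mixture model by minimizing its negative log-likelihood with an adaptive first-order method rather than classical EM — the sketch below fits a two-component Gaussian mixture using a hand-rolled Adam optimizer with a simple 1/√t learning-rate decay. This is a hypothetical minimal example, not code from the mixdistreg package; all function names, initializations, and hyperparameters are illustrative assumptions.

```python
import numpy as np

def fit_gaussian_mixture(x, n_steps=2000, lr=0.05):
    """Fit a 2-component Gaussian mixture by gradient descent on the
    negative log-likelihood, using Adam (an adaptive first-order method)
    with a 1/sqrt(t) learning-rate decay. Illustrative sketch only."""
    logits = np.zeros(2)                              # mixture weights via softmax
    mu = np.array([x.mean() - 1.0, x.mean() + 1.0])   # crude mean initialization
    log_sigma = np.zeros(2)                           # scales on the log scale
    params = [logits, mu, log_sigma]
    m = [np.zeros(2) for _ in params]                 # Adam first moments
    v = [np.zeros(2) for _ in params]                 # Adam second moments
    b1, b2, eps = 0.9, 0.999, 1e-8
    for t in range(1, n_steps + 1):
        pi = np.exp(logits - logits.max()); pi /= pi.sum()
        sigma = np.exp(log_sigma)
        # log N(x_i | mu_k, sigma_k) for each point i and component k
        z = (x[:, None] - mu[None, :]) / sigma[None, :]
        log_phi = -0.5 * z**2 - log_sigma[None, :] - 0.5 * np.log(2 * np.pi)
        log_joint = np.log(pi)[None, :] + log_phi
        log_mix = np.logaddexp(log_joint[:, 0], log_joint[:, 1])
        r = np.exp(log_joint - log_mix[:, None])      # responsibilities
        grads = [
            -(r - pi[None, :]).sum(axis=0),           # d NLL / d logits
            -(r * z / sigma[None, :]).sum(axis=0),    # d NLL / d mu
            -(r * (z**2 - 1.0)).sum(axis=0),          # d NLL / d log_sigma
        ]
        step = lr / np.sqrt(t)                        # simple decay scheduler
        for p, g, mi, vi in zip(params, grads, m, v):
            mi[:] = b1 * mi + (1 - b1) * g
            vi[:] = b2 * vi + (1 - b2) * g**2
            p -= step * (mi / (1 - b1**t)) / (np.sqrt(vi / (1 - b2**t)) + eps)
    pi = np.exp(logits - logits.max()); pi /= pi.sum()
    return pi, mu, np.exp(log_sigma)

# Toy data: a 50/50 mixture of N(-2, 0.5^2) and N(3, 0.5^2)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 1000), rng.normal(3, 0.5, 1000)])
pi, mu, sigma = fit_gaussian_mixture(x)
```

Computing the mixture density in log space with `logaddexp`, and parameterizing weights and scales through softmax and log transforms, keeps the optimization unconstrained and numerically stable — the same design idea that lets neural-network software such as TensorFlow handle these models at scale.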



