Mixture of Experts Distributional Regression: Implementation Using Robust Estimation with Adaptive First-order Methods

11/17/2022
by David Rügamer et al.
WU (Vienna University of Economics and Business)
Universität München

In this work, we propose an efficient implementation of mixture-of-experts distributional regression models that achieves robust estimation through stochastic first-order optimization with adaptive learning rate schedulers. We take advantage of the flexibility and scalability of neural network software and implement the proposed framework in mixdistreg, an R software package that allows mixtures of many different distribution families to be defined, supports estimation in high-dimensional and large-sample settings, and provides robust optimization based on TensorFlow. Numerical experiments on simulated and real-world data show that optimization is as reliable as estimation via classical approaches in many different settings, and that results can be obtained in complicated scenarios where classical approaches consistently fail.
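The package's exact interface is not shown on this page, so the base-R sketch below is only a rough, self-contained illustration of the estimation idea: fitting a two-component Gaussian mixture-of-experts regression by minimizing its negative log-likelihood with a stochastic adaptive first-order method (a hand-rolled Adam update on random mini-batches). All names (nll, num_grad, adam), the finite-difference gradients, and the exponential learning rate decay are illustrative assumptions, not mixdistreg's API.

```r
set.seed(1)

## Simulated data: two latent regimes with different regression lines
n <- 2000
x <- runif(n, -2, 2)
z <- rbinom(n, 1, 0.4)                      # latent component membership
y <- ifelse(z == 1, 1 + 2 * x, -1 - x) + rnorm(n, sd = 0.3)
X <- cbind(1, x)                            # design matrix with intercept

softplus <- function(u) log1p(exp(u))       # maps R to (0, Inf)
sigmoid  <- function(u) 1 / (1 + exp(-u))   # maps R to (0, 1)

## Mini-batch negative log-likelihood of the two-component mixture.
## theta = (beta1[1:2], beta2[1:2], raw scale 1, raw scale 2, mixing logit)
nll <- function(theta, idx = seq_len(n)) {
  mu1 <- drop(X[idx, , drop = FALSE] %*% theta[1:2])
  mu2 <- drop(X[idx, , drop = FALSE] %*% theta[3:4])
  s1  <- softplus(theta[5]); s2 <- softplus(theta[6])
  p   <- sigmoid(theta[7])
  l1  <- dnorm(y[idx], mu1, s1, log = TRUE) + log(p)
  l2  <- dnorm(y[idx], mu2, s2, log = TRUE) + log1p(-p)
  m   <- pmax(l1, l2)                       # log-sum-exp for stability
  -mean(m + log(exp(l1 - m) + exp(l2 - m)))
}

## Central finite differences; fine for a handful of parameters
num_grad <- function(f, theta, eps = 1e-5) {
  vapply(seq_along(theta), function(i) {
    e <- replace(numeric(length(theta)), i, eps)
    (f(theta + e) - f(theta - e)) / (2 * eps)
  }, numeric(1))
}

## Adam, an adaptive stochastic first-order optimizer, with a simple
## exponential learning rate decay standing in for a scheduler
adam <- function(theta, steps = 3000, batch = 256, lr = 0.05,
                 decay = 0.999, b1 = 0.9, b2 = 0.999, eps = 1e-8) {
  m <- v <- numeric(length(theta))
  for (t in seq_len(steps)) {
    idx <- sample(n, batch)                 # stochastic mini-batch
    g   <- num_grad(function(th) nll(th, idx), theta)
    m   <- b1 * m + (1 - b1) * g
    v   <- b2 * v + (1 - b2) * g^2
    theta <- theta - lr * decay^t * (m / (1 - b1^t)) /
             (sqrt(v / (1 - b2^t)) + eps)
  }
  theta
}

theta_hat <- adam(rnorm(7, sd = 0.1))
round(theta_hat, 2)  # both experts' coefficients, raw scales, mixing logit
```

Note that mixture likelihoods are invariant to relabeling the components, so the recovered experts may come out in either order. A TensorFlow-based implementation, as described in the abstract, would use automatic differentiation rather than finite differences, which is what makes this kind of estimation feasible in high-dimensional settings.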


Related research

10/14/2020
Neural Mixture Distributional Regression
We present neural mixture distributional regression (NMDR), a holistic f...

09/22/2020
An l_1-oracle inequality for the Lasso in mixture-of-experts regression models
Mixture-of-experts (MoE) models are a popular framework for modeling het...

12/02/2013
Families of Parsimonious Finite Mixtures of Regression Models
Finite mixtures of regression models offer a flexible framework for inve...

01/29/2023
Imbalanced Mixed Linear Regression
We consider the problem of mixed linear regression (MLR), where each obs...

10/20/2020
Distributed Learning of Finite Gaussian Mixtures
Advances in information technology have led to extremely large datasets ...

05/10/2013
Calibrated Multivariate Regression with Application to Neural Semantic Basis Discovery
We propose a calibrated multivariate regression method named CMR for fit...