Deep Conditional Transformation Models
Learning the cumulative distribution function (CDF) of an outcome variable conditional on a set of features remains challenging, especially in high-dimensional settings. Conditional transformation models provide a semi-parametric approach that allows a large class of conditional CDFs to be modeled without an explicit parametric distribution assumption and with only a few parameters. Existing estimation approaches within the class of transformation models are, however, either limited in their complexity and applicability to unstructured data sources such as images or text, or can incorporate complex effects of different features but lack interpretability. We close this gap by introducing the class of deep conditional transformation models, which unifies existing approaches and allows learning both interpretable (non-)linear model terms and more complex predictors in one holistic neural network. To this end, we propose a novel network architecture, provide details on different model definitions, and derive suitable constraints as well as network regularization terms. We demonstrate the efficacy of our approach through numerical experiments and applications.
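To make the underlying idea concrete, the following is a minimal sketch (not the authors' implementation) of a conditional transformation model fitted by maximum likelihood: the conditional CDF is modeled as F(y | x) = F_Z(h(y | x)), where h is monotone in y. The sketch assumes a Bernstein-polynomial basis in the outcome, a standard logistic reference distribution F_Z, and a small feed-forward network mapping features to increasing basis coefficients; the names DeepCTM and bernstein_basis, the network size, and all hyperparameters are illustrative choices, not part of the paper.

```python
import math
import torch
import torch.nn as nn

def bernstein_basis(t, degree):
    # Bernstein basis polynomials B_{k,degree}(t) for t in [0, 1]; returns shape (n, degree + 1)
    k = torch.arange(degree + 1, dtype=t.dtype)
    binom = torch.tensor([math.comb(degree, i) for i in range(degree + 1)], dtype=t.dtype)
    t = t.unsqueeze(-1)
    return binom * t.pow(k) * (1.0 - t).pow(degree - k)

class DeepCTM(nn.Module):
    """Illustrative conditional transformation model:
    F(y | x) = F_Z(h(y | x)), h(y | x) = sum_k theta_k(x) B_k(y_scaled),
    with theta_k(x) forced to be increasing in k so that h is monotone in y."""
    def __init__(self, n_features, degree=10, y_min=0.0, y_max=1.0):
        super().__init__()
        self.degree, self.y_min, self.y_max = degree, y_min, y_max
        # small network mapping features to unconstrained coefficients (hypothetical architecture)
        self.net = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(), nn.Linear(32, degree + 1))

    def coefficients(self, x):
        gamma = self.net(x)
        # monotonicity constraint: theta_0 = gamma_0, theta_k = theta_{k-1} + softplus(gamma_k)
        increments = torch.cumsum(nn.functional.softplus(gamma[:, 1:]), dim=1)
        return torch.cat([gamma[:, :1], gamma[:, :1] + increments], dim=1)

    def log_likelihood(self, x, y):
        t = (y - self.y_min) / (self.y_max - self.y_min)           # rescale outcome to [0, 1]
        theta = self.coefficients(x)                                # (n, degree + 1)
        h = (theta * bernstein_basis(t, self.degree)).sum(dim=1)    # h(y | x)
        # derivative h'(y | x) via the Bernstein derivative formula, with chain rule for rescaling
        dtheta = self.degree * (theta[:, 1:] - theta[:, :-1])
        h_prime = (dtheta * bernstein_basis(t, self.degree - 1)).sum(dim=1) / (self.y_max - self.y_min)
        # change-of-variables log-likelihood with a standard logistic reference density f_Z
        log_fz = -h - 2.0 * nn.functional.softplus(-h)
        return log_fz + torch.log(h_prime + 1e-12)

# usage sketch on synthetic data: fit by maximizing the conditional log-likelihood
model = DeepCTM(n_features=3, y_min=-3.0, y_max=3.0)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(256, 3), torch.randn(256).clamp(-3.0, 3.0)
for _ in range(200):
    loss = -model.log_likelihood(x, y).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

The Bernstein basis and logistic reference are one common way to obtain a smooth, monotone transformation with a tractable likelihood; the paper's actual architecture additionally combines interpretable structured terms with deep predictors, which this sketch does not attempt to reproduce.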