
Interpretable Mixture Density Estimation by use of Differentiable Tree-module
In order to develop reliable services using machine learning, it is important to understand the uncertainty of the model's outputs. The probability distribution that a prediction target follows often has a complex shape, and a mixture distribution is commonly assumed as the distribution that this uncertainty follows. Since the output of mixture density estimation is complicated, its interpretability becomes important when such models are deployed in real services. In this paper, we propose a method for mixture density estimation that utilizes an interpretable tree structure. Furthermore, a fast inference procedure based on a time-invariant information cache achieves both high speed and interpretability.
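The abstract does not spell out the architecture, but the general idea of combining a tree structure with mixture density estimation can be illustrated with a minimal sketch: a soft binary decision tree routes an input to its leaves, the resulting leaf probabilities serve as mixture weights, and each leaf holds the parameters of one Gaussian component. All function and parameter names below (`soft_tree_weights`, `split_w`, `split_b`, etc.) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gaussian_pdf(y, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at y (vectorized over components).
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def soft_tree_weights(x, split_w, split_b):
    """Mixture weights from a soft binary tree of depth d.

    split_w, split_b: parameters of the 2^d - 1 internal nodes in
    heap order (root = 0, children of i are 2i+1 and 2i+2).
    Each internal node routes the input left with probability
    sigmoid(w @ x + b); a leaf's weight is the product of the
    routing probabilities along its root-to-leaf path, so the
    weights are non-negative and sum to one by construction.
    """
    n_internal = len(split_b)
    depth = int(np.log2(n_internal + 1))
    n_leaves = n_internal + 1
    weights = np.ones(n_leaves)
    for leaf in range(n_leaves):
        node = 0
        for level in range(depth):
            # Bit `level` of the leaf index (MSB first) picks left (0) or right (1).
            go_left = ((leaf >> (depth - 1 - level)) & 1) == 0
            p_left = sigmoid(split_w[node] @ x + split_b[node])
            weights[leaf] *= p_left if go_left else (1.0 - p_left)
            node = 2 * node + (1 if go_left else 2)
    return weights

def tree_mixture_density(y, x, split_w, split_b, mu, sigma):
    # p(y | x) = sum_k w_k(x) * N(y; mu_k, sigma_k^2)
    w = soft_tree_weights(x, split_w, split_b)
    return float(np.sum(w * gaussian_pdf(y, mu, sigma)))
```

Because each mixture weight is a product of per-node routing probabilities, the path through the tree can be inspected to see which splits drove a given prediction, which is one plausible route to the interpretability the paper aims for.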