
Stochastic Logic Programs: Sampling, Inference and Applications
Algorithms for exact and approximate inference in stochastic logic programs (SLPs) are presented, based, respectively, on variable elimination and importance sampling. We then show how SLPs can be used to represent prior distributions for machine learning, using (i) logic programs and (ii) Bayes net structures as examples. Drawing on existing work in statistics, we apply the Metropolis-Hastings algorithm to construct a Markov chain which samples from the posterior distribution. A Prolog implementation of this is described. We also discuss the possibility of constructing explicit representations of the posterior.
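To illustrate the posterior-sampling step the abstract refers to, here is a minimal, generic random-walk Metropolis-Hastings sketch in Python. It is not the paper's Prolog implementation over SLPs; the log-density, proposal scale, and example target (a standard normal) are all assumptions chosen for a self-contained demonstration of the accept/reject mechanics.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_target: unnormalized log-density of the target (e.g. a posterior).
    Returns a list of n_samples draws from the chain.
    """
    rng = random.Random(seed)
    x = x0
    logp = log_target(x)
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal, so the acceptance ratio
        # reduces to target(x_new) / target(x).
        x_new = x + rng.gauss(0.0, step)
        logp_new = log_target(x_new)
        # Accept with probability min(1, exp(logp_new - logp)).
        if math.log(rng.random()) < logp_new - logp:
            x, logp = x_new, logp_new
        samples.append(x)
    return samples

# Hypothetical example target: a standard normal "posterior".
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
```

Because the proposal is symmetric, the Hastings correction term cancels; for an asymmetric proposal the ratio of proposal densities would have to be included in the acceptance probability.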