Parameter Learning of Logic Programs for Symbolic-Statistical Modeling

06/09/2011
by T. Sato, et al.

We propose a logical/mathematical framework for statistical parameter learning of parameterized logic programs, i.e., definite clause programs containing probabilistic facts with a parameterized distribution. It extends the traditional least Herbrand model semantics in logic programming to distribution semantics, a possible-world semantics equipped with a probability distribution that is unconditionally applicable to arbitrary logic programs, including those for HMMs, PCFGs, and Bayesian networks. We also propose a new EM algorithm, the graphical EM algorithm, which runs for a class of parameterized logic programs representing sequential decision processes in which each decision is exclusive and independent. It operates on a new data structure called support graphs, which describe the logical relationship between observations and their explanations, and learns parameters by computing inside and outside probabilities generalized to logic programs. Complexity analysis shows that, when combined with OLDT search for all explanations of observations, the graphical EM algorithm, despite its generality, has the same time complexity as the EM algorithms developed independently in each research field: the Baum-Welch algorithm for HMMs, the Inside-Outside algorithm for PCFGs, and the algorithm for singly connected Bayesian networks. Learning experiments with PCFGs using two corpora of moderate size indicate that the graphical EM algorithm can significantly outperform the Inside-Outside algorithm.
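
To make the EM setting concrete, the following is a minimal Python sketch of parameter learning when each observation's explanations are enumerated flatly, under the paper's assumption that decisions (switch outcomes) are exclusive and independent. The switch name "coin", the toy data, and the `em` helper are illustrative inventions, not the paper's code; the actual graphical EM algorithm avoids this flat enumeration by sharing subexplanations in a support graph and computing inside/outside probabilities by dynamic programming.

```python
from collections import defaultdict

# Toy corpus: each observation maps to its explanations, where an
# explanation is a list of (switch, outcome) choices assumed to be
# exclusive and independent. The example models two flips of one
# biased coin; all names and counts here are illustrative only.
OBSERVATIONS = {
    "two_heads": [[("coin", "h"), ("coin", "h")]],
    "mixed":     [[("coin", "h"), ("coin", "t")],
                  [("coin", "t"), ("coin", "h")]],  # two explanations
}
COUNTS = {"two_heads": 3, "mixed": 7}  # observation frequencies

def em(observations, counts, outcomes, iters=50):
    # Initialize each switch's distribution uniformly.
    theta = {sw: {o: 1.0 / len(os) for o in os} for sw, os in outcomes.items()}
    for _ in range(iters):
        expected = defaultdict(float)  # E-step: expected outcome counts
        for obs, expls in observations.items():
            # Probability of an explanation = product of its choices
            # (the "inside" probability in this flat, unshared case).
            probs = []
            for e in expls:
                p = 1.0
                for sw, o in e:
                    p *= theta[sw][o]
                probs.append(p)
            z = sum(probs)  # P(obs) = sum over exclusive explanations
            for e, p in zip(expls, probs):
                w = counts[obs] * p / z  # posterior weight of explanation e
                for sw, o in e:
                    expected[(sw, o)] += w
        # M-step: renormalize expected counts per switch.
        for sw, os in outcomes.items():
            total = sum(expected[(sw, o)] for o in os)
            for o in os:
                theta[sw][o] = expected[(sw, o)] / total
    return theta

print(em(OBSERVATIONS, COUNTS, {"coin": ["h", "t"]}))
# converges to {'coin': {'h': 0.65, 't': 0.35}}, the MLE for this data
```

The key difference in the paper's approach is that enumerating explanations flatly can blow up exponentially; a support graph factors common subexplanations so that the same expected counts are obtained by inside/outside dynamic programming, which is what yields time complexity matching Baum-Welch and Inside-Outside.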

