Learning Probabilistic Logic Programs in Continuous Domains

07/15/2018
by Stefanie Speichert, et al.

The field of statistical relational learning aims at unifying logic and probability to reason and learn from data. Perhaps the most successful paradigm in the field is probabilistic logic programming: the enabling of stochastic primitives in logic programming, which is now increasingly seen to provide a declarative background to complex machine learning applications. While many systems offer inference capabilities, the more significant challenge is that of learning meaningful and interpretable symbolic representations from data. In that regard, inductive logic programming and related techniques have paved much of the way for the last few decades. Unfortunately, a major limitation of this exciting landscape is that much of the work is limited to finite-domain discrete probability distributions. Recently, a handful of systems have been extended to represent and perform inference with continuous distributions. The problem, of course, is that classical solutions for inference are either restricted to well-known parametric families (e.g., Gaussians) or resort to sampling strategies that provide correct answers only in the limit. When it comes to learning, moreover, inducing representations remains entirely open, other than "data-fitting" solutions that force-fit points to the aforementioned parametric families. In this paper, we take the first steps towards inducing probabilistic logic programs for continuous and mixed discrete-continuous data, without being pigeonholed into a fixed set of distribution families. Our key insight is to leverage techniques from piecewise polynomial function approximation theory, yielding a principled way to learn and compositionally construct density functions. We test the framework and discuss the learned representations.
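
The key technical idea, learning densities as piecewise polynomials rather than committing to a fixed parametric family, can be illustrated with a small self-contained sketch. The code below is a hypothetical Python/NumPy illustration, not the authors' algorithm: the equal-width partitioning, the polynomial degree, the histogram-based fitting, the normalisation step, and the function names (fit_piecewise_polynomial_density, density) are all assumptions made for exposition.

```python
# A minimal, hypothetical sketch of the core idea: approximating an unknown
# univariate density with piecewise polynomials learned from samples.
# Partitioning scheme, degree, and normalisation are illustrative choices only.
import numpy as np

def fit_piecewise_polynomial_density(samples, n_pieces=4, degree=3, bins_per_piece=20):
    """Fit one low-degree polynomial per interval, then renormalise so the
    resulting piecewise function integrates to one over the data range."""
    lo, hi = samples.min(), samples.max()
    edges = np.linspace(lo, hi, n_pieces + 1)
    pieces = []
    for a, b in zip(edges[:-1], edges[1:]):
        # Empirical density inside this piece, estimated from a fine histogram.
        hist, sub_edges = np.histogram(samples, bins=bins_per_piece, range=(a, b))
        widths = np.diff(sub_edges)
        dens = hist / (len(samples) * widths)          # empirical density values
        mids = 0.5 * (sub_edges[:-1] + sub_edges[1:])  # bin midpoints
        coeffs = np.polyfit(mids, dens, deg=degree)    # least-squares polynomial fit
        pieces.append((a, b, coeffs))
    # Renormalise: divide by the total integral of the piecewise polynomial.
    total = sum(np.polyval(np.polyint(c), b) - np.polyval(np.polyint(c), a)
                for a, b, c in pieces)
    return [(a, b, c / total) for a, b, c in pieces]

def density(x, pieces):
    """Evaluate the learned piecewise-polynomial density at a point x."""
    for a, b, coeffs in pieces:
        if a <= x <= b:
            return max(np.polyval(coeffs, x), 0.0)     # clip small negative fit artefacts
    return 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=0.5, size=5000)   # synthetic continuous data
    pw = fit_piecewise_polynomial_density(data, n_pieces=6, degree=3)
    print(density(2.0, pw))   # roughly the Gaussian peak value, ~0.80
```

In the paper's setting, densities learned in this spirit would serve as primitives that a probabilistic logic program composes and reasons over; here they are merely fitted to and evaluated on synthetic continuous data.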


