Evidential Decision Theory via Partial Markov Categories

01/30/2023
by Elena Di Lavore, et al.

We introduce partial Markov categories. In the same way that Markov categories encode stochastic processes, partial Markov categories encode stochastic processes with constraints, observations and updates. In particular, we prove a synthetic Bayes theorem; we use it to define a syntactic partial theory of observations on any Markov category, whose normalisations can be computed in the original Markov category. Finally, we formalise Evidential Decision Theory in terms of partial Markov categories, and provide implemented examples.
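To make the Bayes-theorem claim concrete, here is an illustrative sketch (not the paper's implementation, and using hypothetical helper names) of its finite, concrete analogue: Bayesian inversion of a stochastic channel, together with a simple evidential choice rule that ranks actions by conditional expected utility.

```python
# Illustrative sketch, assuming finite discrete distributions encoded
# as dicts. `bayes_invert` and `edt_choice` are hypothetical names,
# not functions from the paper's accompanying code.

def bayes_invert(prior, channel):
    """Given a prior p(x) and a channel p(y|x), return the pushforward
    marginal p(y) and the Bayesian inverse p(x|y)."""
    joint, marginal = {}, {}
    for x, px in prior.items():
        for y, pyx in channel[x].items():
            joint[(x, y)] = joint.get((x, y), 0.0) + px * pyx
            marginal[y] = marginal.get(y, 0.0) + px * pyx
    # Normalisation is partial: p(x|y) is only defined where p(y) > 0,
    # which is the point where partiality enters the categorical story.
    posterior = {
        y: {x: joint.get((x, y), 0.0) / py for x in prior}
        for y, py in marginal.items() if py > 0
    }
    return marginal, posterior

def edt_choice(actions, outcome_channel, utility):
    """Evidential rule: pick the action a maximising E[U | A = a],
    i.e. the action whose observation is the best news."""
    def cond_expected_utility(a):
        return sum(p * utility[o] for o, p in outcome_channel[a].items())
    return max(actions, key=cond_expected_utility)
```

Conditioning here is an operation on distributions rather than a morphism; the categorical treatment in the paper instead makes observation and normalisation first-class constructions inside a partial Markov category.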


