Propositional Knowledge Representation in Restricted Boltzmann Machines

05/31/2017
by Son N. Tran, et al.

Representing symbolic knowledge in a connectionist network is a key element in integrating scalable learning with sound reasoning. Most previous studies focus on discriminative neural networks, which require an unnecessary separation of input and output variables. Recent developments in generative neural networks such as restricted Boltzmann machines (RBMs) have shown a capability to learn semantic abstractions directly from data, holding promise for general symbolic learning and reasoning. Previous work on Penalty logic established a link between propositional logic and symmetric connectionist networks; however, it is not applicable to RBMs. This paper proposes a novel method for representing propositional formulas in RBMs and stacks of RBMs, in which Gibbs sampling can be seen as maximising satisfiability. It also shows a promising use of RBMs for learning symbolic knowledge through maximum likelihood estimation.
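As a rough illustration of the idea, the sketch below (not the authors' code) encodes a small DNF formula into an RBM, one hidden unit per conjunct, so that the minimum energy is reached exactly at the satisfying assignments; Gibbs sampling on that RBM then behaves like a stochastic search for models. The epsilon constant, the temperature, and the XOR example are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: encode a propositional DNF formula into an RBM so that
# low energy corresponds to satisfying assignments, then use Gibbs sampling
# as a search for models. One hidden unit per conjunct; weight +1 for a
# positive literal, -1 for a negative one; hidden bias = eps - (#positives).
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Formula in DNF: (x1 AND NOT x2) OR (NOT x1 AND x2)  ==  x1 XOR x2.
# Each conjunct maps variable index -> required truth value.
conjuncts = [{0: 1, 1: 0}, {0: 0, 1: 1}]
n_visible, n_hidden, eps = 2, len(conjuncts), 0.5

W = np.zeros((n_visible, n_hidden))
c = np.zeros(n_hidden)
for j, conj in enumerate(conjuncts):
    for i, val in conj.items():
        W[i, j] = 1.0 if val == 1 else -1.0
    c[j] = eps - sum(v == 1 for v in conj.values())

def energy_min_h(x):
    """Energy minimised over hidden units: -eps per satisfied conjunct."""
    pre = x @ W + c
    return -np.sum(np.maximum(pre, 0.0))

# Exhaustive check: energy is -eps exactly at satisfying assignments.
for bits in itertools.product([0, 1], repeat=n_visible):
    x = np.array(bits, dtype=float)
    sat = any(all(x[i] == v for i, v in conj.items()) for conj in conjuncts)
    print(bits, "energy:", energy_min_h(x), "satisfies:", sat)

# Gibbs sampling: alternate h ~ p(h|x) and x ~ p(x|h). At low temperature
# the chain spends most of its time on satisfying assignments.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

T = 0.2  # temperature; lower sharpens the distribution around models
x = rng.integers(0, 2, n_visible).astype(float)
counts = {}
for step in range(5000):
    h = (rng.random(n_hidden) < sigmoid((x @ W + c) / T)).astype(float)
    x = (rng.random(n_visible) < sigmoid((W @ h) / T)).astype(float)
    counts[tuple(x)] = counts.get(tuple(x), 0) + 1
print("visited:", counts)  # mass should concentrate on (0,1) and (1,0)
```

Run as-is, the exhaustive check prints energy -eps only for the two XOR models, and the sampling counts concentrate on (0, 1) and (1, 0). The reverse direction the abstract mentions, learning such weights from data by maximum likelihood (e.g. via contrastive divergence), would start from random W and c rather than the hand-built ones above.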


Related research

12/10/2021: Logical Boltzmann Machines
The idea of representing symbolic knowledge in connectionist systems has...

06/06/2017: Unsupervised Neural-Symbolic Integration
Symbolic has long been considered a language of human intelligence wh...

01/20/2023: Generative Logic with Time: Beyond Logical Consistency and Statistical Possibility
This paper gives a theory of inference to logically reason symbolic know...

06/11/2023: Resolution for Constrained Pseudo-Propositional Logic
This work shows how propositional resolution can be generalized to obta...

04/11/2016: Symbolic Knowledge Extraction using Łukasiewicz Logics
This work describes a methodology that combines logic-based systems and ...

10/01/2014: Deep Tempering
Restricted Boltzmann Machines (RBMs) are one of the fundamental building...
