Knowledge Extraction and Knowledge Integration governed by Łukasiewicz Logics

04/11/2016
by Carlos Leandro, et al.

The development of machine learning in particular, and of artificial intelligence in general, has been strongly conditioned by the lack of an appropriate interface layer between deduction, abduction and induction. In this work we extend traditional algebraic specification methods in this direction. Here we assume that such an interface for AI emerges from an adequate neural-symbolic integration. This integration is made for a universe of discourse described on a topos governed by a many-valued Łukasiewicz logic. Sentences are integrated in a symbolic knowledge base describing the problem domain, codified using a graphic-based language wherein every logic connective is defined by a neuron in an artificial network. This allows the integration of first-order formulas into a network architecture as background knowledge, and simplifies symbolic rule extraction from trained networks. To train such neural networks we modified the Levenberg-Marquardt algorithm, restricting the dissemination of knowledge in the network structure using soft crystallization. This procedure reduces neural network plasticity without drastically damaging learning performance, allowing the emergence of symbolic patterns. It makes the descriptive power of the produced neural networks similar to that of the Łukasiewicz logic language, reducing the information lost in translation between symbolic and connectionist structures. We tested this method on the extraction of knowledge from specified structures. For this we introduce the notion of fuzzy state automata, and we use automata behaviour to infer their structure. We use this type of automaton to generate models for relations specified as symbolic background knowledge.
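The claim that every Łukasiewicz connective can be defined by a neuron rests on the fact that these connectives are piecewise-linear maps on [0, 1]. Below is a minimal Python sketch, not taken from the paper, illustrating how the standard connectives are realized by single neurons with integer weights and a clipped-identity activation; the function and variable names are our own illustrative assumptions.

```python
# Minimal sketch: Łukasiewicz connectives as single neurons with
# integer weights and a clipped-identity activation (illustrative only).

def phi(z):
    """Clipped identity: truncates a linear combination to [0, 1]."""
    return min(1.0, max(0.0, z))

def neuron(weights, bias):
    """Return a neuron computing phi(w . x + b) on inputs in [0, 1]."""
    def fire(*inputs):
        return phi(sum(w * x for w, x in zip(weights, inputs)) + bias)
    return fire

# Standard Łukasiewicz connectives:
neg     = neuron([-1.0], 1.0)        # ¬x    = 1 - x
conj    = neuron([1.0, 1.0], -1.0)   # x ⊗ y = max(0, x + y - 1)
disj    = neuron([1.0, 1.0], 0.0)    # x ⊕ y = min(1, x + y)
implies = neuron([-1.0, 1.0], 1.0)   # x → y = min(1, 1 - x + y)

if __name__ == "__main__":
    x, y = 0.7, 0.4
    print(round(conj(x, y), 3))      # 0.1
    print(round(implies(x, y), 3))   # 0.7
```

Because each connective is expressed with a fixed integer weight pattern, a trained network whose weights settle near such patterns can be read back as a Łukasiewicz formula, which is the intuition behind the rule extraction described in the abstract.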

