Neural Consciousness Flow

05/30/2019
by Xiaoran Xu, et al.

The ability to reason beyond data fitting is essential if deep learning systems are to take a leap forward towards artificial general intelligence. Much effort has been made to model neural-based reasoning as an iterative decision-making process built on recurrent networks and reinforcement learning. Instead, inspired by the consciousness prior proposed by Yoshua Bengio, we explore reasoning through the notion of attentive awareness from a cognitive perspective, and formulate it as attentive message passing on graphs, called neural consciousness flow (NeuCFlow). Aiming to bridge the gap between deep learning systems and reasoning, we propose an attentive computation framework with a three-layer architecture, consisting of an unconsciousness flow layer, a consciousness flow layer, and an attention flow layer. We implement the NeuCFlow model with graph neural networks (GNNs) and conditional transition matrices. Our attentive computation greatly reduces the complexity of vanilla GNN-based methods, enabling the model to run on large-scale graphs. We validate our model on knowledge graph reasoning by solving a series of knowledge base completion (KBC) tasks. The experimental results show that NeuCFlow significantly outperforms previous state-of-the-art KBC methods, including embedding-based and path-based approaches. The reproducible code can be found via the link below.
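As a rough illustration of the framework described in the abstract, the following is a minimal, self-contained sketch of attentive message passing with a query-conditioned attention transition and top-k pruning. Everything in it (the toy graph, the `TOP_K` constant, the score function, and the update rules) is an illustrative assumption for exposition, not the authors' actual NeuCFlow implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy directed graph: 6 nodes, edge list of (head, tail) pairs.
N, D = 6, 8
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (2, 5), (5, 4)]
out_edges = {u: [v for (h, v) in edges if h == u] for u in range(N)}

H = rng.normal(size=(N, D))                   # node states
W_msg = rng.normal(size=(D, D)) / np.sqrt(D)  # message transform
w_att = rng.normal(size=(3 * D,))             # scores an edge from (head, tail, query)
query = rng.normal(size=(D,))                 # conditions the transition matrix

TOP_K, STEPS = 2, 3  # illustrative pruning size and number of reasoning steps

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Attention over nodes; starts concentrated on a source entity (node 0).
a = np.zeros(N)
a[0] = 1.0

for _ in range(STEPS):
    # Attention flow layer: redistribute attention mass along out-edges,
    # with transition probabilities conditioned on node states and the query.
    new_a = np.zeros(N)
    for u in np.flatnonzero(a):
        vs = out_edges[u]
        if not vs:
            new_a[u] += a[u]  # no out-edges: attention mass stays put
            continue
        scores = np.array(
            [w_att @ np.concatenate([H[u], H[v], query]) for v in vs]
        )
        for v, pv in zip(vs, softmax(scores)):
            new_a[v] += a[u] * pv
    # Prune to the TOP_K most-attended nodes; restricting computation to this
    # small subgraph is where the complexity reduction over a vanilla
    # full-graph GNN step would come from.
    keep = np.argsort(new_a)[-TOP_K:]
    mask = np.zeros(N)
    mask[keep] = 1.0
    a = new_a * mask
    a = a / a.sum()

    # Consciousness flow layer: messages pass only along edges leaving
    # attended nodes, weighted by their attention mass.
    M = np.zeros_like(H)
    for (u, v) in edges:
        if a[u] > 0:
            M[v] += a[u] * (H[u] @ W_msg)
    # Unconsciousness flow layer stand-in: a cheap local update of all states.
    H = np.tanh(H + M)

print("final attention:", np.round(a, 3))
```

Running the sketch prints the final attention distribution over nodes; the point is that each step touches only the currently attended subgraph, rather than every edge in the graph.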


Related research

09/25/2019 · Dynamically Pruned Message Passing Networks for Large-Scale Knowledge Graph Reasoning
We propose Dynamically Pruned Message Passing Networks (DPMPN) for large...

05/17/2023 · River of No Return: Graph Percolation Embeddings for Efficient Knowledge Graph Reasoning
We study Graph Neural Networks (GNNs)-based embedding techniques for kno...

07/14/2020 · Attentive Graph Neural Networks for Few-Shot Learning
Graph Neural Networks (GNN) has demonstrated the superior performance in...

09/29/2020 · Direct Multi-hop Attention based Graph Neural Network
Introducing self-attention mechanism in graph neural networks (GNNs) ach...

11/01/2018 · Modeling Attention Flow on Graphs
Real-world scenarios demand reasoning about process, more than final out...

06/19/2020 · Abstract Diagrammatic Reasoning with Multiplex Graph Networks
Abstract reasoning, particularly in the visual domain, is a complex huma...

05/10/2023 · ANALOGYKB: Unlocking Analogical Reasoning of Language Models with A Million-scale Knowledge Base
Analogical reasoning is a fundamental cognitive ability of humans. Howev...
