Semantically-Regularized Logic Graph Embeddings

09/03/2019
by Yaqi Xie et al.

In this work, we aim to leverage prior knowledge encoded as logical rules to improve the performance of deep models. We propose a logic graph embedding network that projects d-DNNF formulae (and assignments) onto a manifold via an augmented Graph Convolutional Network (GCN). To generate semantically faithful embeddings, we introduce techniques that recognize node heterogeneity, together with a semantic regularization scheme that incorporates structural constraints into the embedding. Experiments show that our approach improves the performance of models trained for model checking and visual relation prediction.
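To make the idea concrete, the sketch below shows one way such a pipeline could look: a GCN layer with a separate weight matrix per node type (AND/OR/leaf) over a d-DNNF graph, plus an illustrative regularizer that pushes the neighbors of AND nodes toward mutually orthogonal embeddings as one possible structural constraint. This is a minimal illustration, not the authors' implementation; the node-type encoding, toy formula, and specific penalty are assumptions made for exposition.

```python
# Minimal sketch (not the authors' implementation): a heterogeneity-aware GCN
# layer over a d-DNNF graph, plus an illustrative "semantic" regularizer.
# Node-type ids, the toy graph, and the orthogonality penalty are assumptions.
import torch
import torch.nn as nn

AND, OR, LEAF = 0, 1, 2  # assumed node-type ids

class HeteroGCNLayer(nn.Module):
    """One GCN layer with a separate weight matrix per node type."""
    def __init__(self, in_dim, out_dim, num_types=3):
        super().__init__()
        self.weights = nn.ModuleList(
            [nn.Linear(in_dim, out_dim, bias=False) for _ in range(num_types)]
        )

    def forward(self, x, adj, node_types):
        # Symmetric normalization of the adjacency matrix (with self-loops).
        a = adj + torch.eye(adj.size(0))
        deg_inv_sqrt = a.sum(dim=1).clamp(min=1e-12).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(1) * a * deg_inv_sqrt.unsqueeze(0)
        # Apply the type-specific transform to each node, then aggregate.
        h = torch.stack([self.weights[t](x[i]) for i, t in enumerate(node_types)])
        return torch.relu(a_norm @ h)

def and_orthogonality_penalty(emb, adj, node_types):
    """Illustrative regularizer: push the neighbors (children, in this toy
    graph) of AND nodes toward mutually orthogonal embeddings."""
    loss = emb.new_zeros(())
    for i, t in enumerate(node_types):
        if t != AND:
            continue
        children = (adj[i] > 0).nonzero(as_tuple=True)[0]
        c = torch.nn.functional.normalize(emb[children], dim=1)
        gram = c @ c.t()
        off_diag = gram - torch.diag(torch.diag(gram))
        loss = loss + off_diag.pow(2).sum()
    return loss

if __name__ == "__main__":
    # Toy d-DNNF graph: AND node 0 connected to OR node 1 and leaves 2, 3.
    adj = torch.tensor([[0, 1, 1, 1],
                        [1, 0, 0, 0],
                        [1, 0, 0, 0],
                        [1, 0, 0, 0]], dtype=torch.float)
    node_types = [AND, OR, LEAF, LEAF]
    x = torch.randn(4, 8)            # initial node features
    layer = HeteroGCNLayer(8, 16)
    emb = layer(x, adj, node_types)
    reg = and_orthogonality_penalty(emb, adj, node_types)
    print(emb.shape, reg.item())
```

In a full training loop, a penalty of this kind would typically be added to the task loss with a weighting coefficient, so the embedding is shaped by both the downstream objective and the structural constraint.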

Related research

09/03/2019  Embedding Symbolic Knowledge into Deep Networks
In this work, we aim to leverage prior symbolic knowledge to improve the...

03/12/2018  Probabilistic and Regularized Graph Convolutional Networks
This paper explores the recently proposed Graph Convolutional Network ar...

11/11/2018  End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion
Knowledge graph embedding has been an active research topic for knowledg...

06/16/2021  Data Augmentation for Graph Convolutional Network on Semi-Supervised Classification
Data augmentation aims to generate new and synthetic features from the o...

08/27/2022  Spatial Relation Graph and Graph Convolutional Network for Object Goal Navigation
This paper describes a framework for the object-goal navigation task, wh...

02/24/2021  Graphfool: Targeted Label Adversarial Attack on Graph Embedding
Deep learning is effective in graph analysis. It is widely applied in ma...

03/13/2020  Graph Convolutional Topic Model for Data Streams
Learning hidden topics in data streams has been paid a great deal of att...