HEAT: Hyperedge Attention Networks

01/28/2022
by Dobrik Georgiev, et al.

Learning from structured data is a core machine learning task. Commonly, such data is represented as graphs, which normally consider only (typed) binary relationships between pairs of nodes. This is a substantial limitation for many domains with highly structured data. One important such domain is source code, where hypergraph-based representations can better capture the semantically rich and structured nature of code. In this work, we present HEAT, a neural model capable of representing typed and qualified hypergraphs, where each hyperedge explicitly qualifies how participating nodes contribute. It can be viewed as a generalization of both message passing neural networks and Transformers. We evaluate HEAT on knowledge base completion and on bug detection and repair using a novel hypergraph representation of programs. In both settings, it outperforms strong baselines, indicating its power and generality.
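To make the core idea concrete, the sketch below shows one hypergraph-attention step in NumPy: each hyperedge carries, for every participating node, a qualifier describing *how* that node participates, and attention is computed among the participants of each hyperedge. This is a minimal illustration of the concept described above, not the authors' exact formulation; the function name `heat_layer`, the additive qualifier embeddings, and the mean aggregation across hyperedges are assumptions made for this sketch.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def heat_layer(node_feats, hyperedges, W_q, W_k, W_v, qual_embed):
    """One qualified-hypergraph attention step (illustrative sketch).

    node_feats : (n, d) array of node features.
    hyperedges : list of hyperedges; each hyperedge is a list of
                 (node_index, qualifier_index) pairs, recording how
                 each node participates in that hyperedge.
    W_q, W_k, W_v : (d, d) projection matrices for attention.
    qual_embed : (num_qualifiers, d) embeddings of participation roles.
    """
    n, d = node_feats.shape
    out = np.zeros_like(node_feats)
    count = np.zeros(n)
    for edge in hyperedges:
        idx = [i for i, _ in edge]
        # Add the qualifier embedding to each participant's features, so
        # the same node can contribute differently in different roles.
        h = node_feats[idx] + qual_embed[[q for _, q in edge]]
        Q, K, V = h @ W_q, h @ W_k, h @ W_v
        # Scaled dot-product attention restricted to this hyperedge.
        att = softmax(Q @ K.T / np.sqrt(d))
        msg = att @ V  # one message per participant
        for row, i in enumerate(idx):
            out[i] += msg[row]
            count[i] += 1
    # Average messages over all hyperedges a node participates in;
    # nodes in no hyperedge keep a zero update.
    return out / np.maximum(count, 1)[:, None]
```

With binary hyperedges this reduces to attention-weighted message passing between node pairs, and with a single hyperedge over all nodes it resembles a Transformer layer, which is the sense in which the model generalizes both.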


