
Higher-Order Attention Networks

by Mustafa Hajij, et al.

This paper introduces higher-order attention networks (HOANs), a novel class of attention-based neural networks defined on a generalized higher-order domain called a combinatorial complex (CC). Similar to hypergraphs, CCs admit arbitrary set-like relations between a collection of abstract entities. Simultaneously, CCs permit the construction of hierarchical higher-order relations analogous to those supported by cell complexes. Thus, CCs effectively generalize both hypergraphs and cell complexes and combine their desirable characteristics. By exploiting the rich combinatorial nature of CCs, HOANs define a new class of message-passing attention-based networks that unifies higher-order neural networks. Our evaluation on tasks related to mesh shape analysis and graph learning demonstrates that HOANs attain competitive, and in some examples superior, predictive performance in comparison to state-of-the-art neural networks.
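To make the idea concrete, here is a minimal, hypothetical sketch of one attention-based message-passing step over a combinatorial complex. Cells are modeled as frozensets of abstract entities, a cell's neighborhood is taken to be all cells sharing at least one entity with it (a simple incidence-style choice), and attention scores are plain dot products of feature vectors. All names and design choices below are illustrative assumptions, not the paper's actual HOAN formulation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def attention_step(cells, feats):
    """One attention-weighted message-passing step on a combinatorial complex.

    cells: list of frozensets (cells of the complex)
    feats: dict mapping each cell to its feature vector (list of floats)
    Returns a dict of updated feature vectors.
    """
    new_feats = {}
    for c in cells:
        # Incidence-style neighborhood: cells sharing at least one entity.
        nbrs = [d for d in cells if d != c and c & d]
        if not nbrs:
            new_feats[c] = list(feats[c])  # isolated cell: keep features
            continue
        # Unnormalized attention scores: dot products with each neighbor.
        scores = [sum(a * b for a, b in zip(feats[c], feats[d])) for d in nbrs]
        alphas = softmax(scores)
        # Aggregate neighbor features, weighted by attention coefficients.
        dim = len(feats[c])
        new_feats[c] = [
            sum(a * feats[d][i] for a, d in zip(alphas, nbrs))
            for i in range(dim)
        ]
    return new_feats
```

Because cells are arbitrary sets, the same routine covers graph edges, hyperedges, and higher-rank cells uniformly; a real implementation would add learned projections and rank-aware neighborhood functions on top of this skeleton.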



