Graph Mixer Networks

01/29/2023
by Ahmet Sarıgün, et al.

In recent years, the attention mechanism has demonstrated superior performance on a variety of tasks, leading to the emergence of GAT and Graph Transformer models that use it to extract relational information from graph-structured data. However, the high computational cost of the Transformer block, as seen in Vision Transformers, has motivated alternative architectures such as MLP-Mixers, which improve performance on image tasks while reducing computational cost. Although Transformers are effective on graph-based tasks, their computational efficiency remains a concern, and the logic behind MLP-Mixers, which addresses this issue for images, can be applied to graph-structured data as well. In this paper, we propose the Graph Mixer Network (GMN), also referred to as Graph Nasreddin Nets (GNasNets), a framework that adapts the principles of MLP-Mixers to graph-structured data. Using a PNA model with multiple aggregators as its foundation, the proposed GMN demonstrates improved performance compared to Graph Transformers. The source code is publicly available at https://github.com/asarigun/GraphMixerNetworks.
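The abstract does not spell out the layer design, but a mixer-style block for graphs can be sketched in plain numpy under the following assumptions: the MLP-Mixer "token mixing" step is replaced by PNA-style multi-aggregator neighborhood pooling (mean, sum, max), followed by a standard channel-mixing MLP with residual connections. All function and parameter names below (`graph_mixer_layer`, `w_proj`, etc.) are hypothetical, not taken from the released code.

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    # Two-layer channel-mixing MLP (tanh used here for simplicity).
    h = np.tanh(x @ w1 + b1)
    return h @ w2 + b2

def graph_mixer_layer(X, A, params):
    """One hypothetical graph-mixer block.

    X: (N, d) node features; A: (N, N) binary adjacency matrix.
    1) "Token mixing" across nodes via multiple aggregators
       (PNA-style: mean, sum, max over neighbors), then a linear
       projection back to d with a residual connection.
    2) "Channel mixing" via a per-node MLP with a residual.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    mean_agg = (A @ X) / deg                      # mean over neighbors
    sum_agg = A @ X                               # sum over neighbors
    # max over neighbors: mask non-neighbors with -inf, then reduce
    masked = np.where(A[:, :, None] > 0, X[None, :, :], -np.inf)
    max_agg = masked.max(axis=1)
    max_agg = np.where(np.isfinite(max_agg), max_agg, 0.0)  # isolated nodes
    mixed = np.concatenate([mean_agg, sum_agg, max_agg], axis=1)  # (N, 3d)
    X = X + mixed @ params["w_proj"]              # node-mixing step + residual
    X = X + mlp(X, params["w1"], params["b1"], params["w2"], params["b2"])
    return X
```

Stacking several such blocks and a readout gives a mixer-style graph network without any attention computation, which is the efficiency argument the abstract makes.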

