
Attributed Graph Clustering via Adaptive Graph Convolution
Attributed graph clustering is challenging as it requires joint modellin...

Variational Coembedding Learning for Attributed Network Clustering
Recent works for attributed network clustering utilize graph convolution...

Self-supervised Contrastive Attributed Graph Clustering
Attributed graph clustering, which learns node representation from node ...

Attributed Graph Learning with 2D Graph Convolution
Graph convolutional neural networks have demonstrated promising performa...

Effective and Scalable Clustering on Massive Attributed Graphs
Given a graph G where each node is associated with a set of attributes, ...

SSFG: Stochastically Scaling Features and Gradients for Regularizing Graph Convolution Networks
Graph convolutional networks have been successfully applied in various g...

A Generic Framework for Interesting Subspace Cluster Detection in Multi-attributed Networks
Detection of interesting (e.g., coherent or anomalous) clusters has been...
Smoothness Sensor: Adaptive Smoothness-Transition Graph Convolutions for Attributed Graph Clustering
Clustering techniques attempt to group objects with similar properties into a cluster. Clustering the nodes of an attributed graph, in which each node is associated with a set of feature attributes, has attracted significant attention. Graph convolutional networks (GCNs) represent an effective approach for integrating the two complementary factors of node attributes and structural information for attributed graph clustering. However, the oversmoothing of GCNs produces indistinguishable node representations, so that the nodes in a graph tend to be grouped into fewer clusters, which poses a challenge due to the resulting performance drop. In this study, we propose a smoothness sensor for attributed graph clustering based on adaptive smoothness-transition graph convolutions, which senses the smoothness of a graph and adaptively terminates the current convolution once the smoothness is saturated, thereby preventing oversmoothing. Furthermore, as an alternative to graph-level smoothness, a novel fine-grained node-wise assessment of smoothness is proposed, in which smoothness is computed according to the neighborhood conditions of a given node at a certain order of graph convolution. In addition, a self-supervision criterion is designed that considers both the tightness within clusters and the separation between clusters to guide the training of the whole neural network. Experiments show that the proposed methods significantly outperform 12 other state-of-the-art baselines in terms of three different metrics across four benchmark datasets. An extensive study further reveals the reasons for their effectiveness and efficiency.
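To make the idea of adaptively terminating graph convolution concrete, the sketch below illustrates one plausible reading of the mechanism: propagate node features order by order with a normalized adjacency matrix, measure a per-node smoothness proxy (here, cosine similarity between a node's representation and the mean of its neighbors' representations, an assumption for illustration, not the paper's exact smoothness measure), and stop propagating for a node once its smoothness gain saturates. The function names and the saturation threshold `tol` are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def nodewise_smoothness(H, A):
    """Per-node smoothness proxy: cosine similarity between a node's
    representation and the mean representation of its neighbors
    (higher = smoother). Isolated nodes are treated as fully smooth."""
    n = A.shape[0]
    scores = np.zeros(n)
    for i in range(n):
        nbrs = np.nonzero(A[i])[0]
        if len(nbrs) == 0:
            scores[i] = 1.0
            continue
        mean_nbr = H[nbrs].mean(axis=0)
        num = H[i] @ mean_nbr
        den = np.linalg.norm(H[i]) * np.linalg.norm(mean_nbr) + 1e-12
        scores[i] = num / den
    return scores

def adaptive_propagate(X, A, max_order=10, tol=1e-3):
    """Apply graph convolutions order by order; freeze a node's representation
    once its smoothness gain falls below tol (saturation), so over-smoothed
    nodes stop propagating while others continue."""
    S = normalize_adj(A)
    H = X.copy()
    prev = nodewise_smoothness(H, A)
    active = np.ones(A.shape[0], dtype=bool)  # nodes still propagating
    for _ in range(max_order):
        H_new = S @ H
        H = np.where(active[:, None], H_new, H)  # only active nodes update
        cur = nodewise_smoothness(H, A)
        active &= (cur - prev) > tol
        prev = cur
        if not active.any():
            break
    return H
```

In a full clustering pipeline, the resulting representations would feed a clustering head trained with a self-supervision loss rewarding within-cluster tightness and between-cluster separation; the sketch covers only the adaptive propagation step.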