Transformer-Graph Neural Network with Global-Local Attention for Multimodal Rumour Detection with Knowledge Distillation
The spread of misinformation has become a critical issue in online conversations, and rumour detection is an important research topic in social media analysis. Most existing methods, based on Convolutional Neural Networks (CNNs) and Graph Neural Networks (GNNs), do not exploit the relationship between the global and local information of a conversation. In this paper, we propose a Transformer-Graph Neural Network (TGNN) that fuses local information with the global representation through an attention mechanism. We then extend the proposed TGNN to multimodal rumour detection by modelling the latent relationship between multimodal features and node features, forming a more comprehensive graph representation. To verify the effectiveness of our method for multimodal rumour detection, we extend the existing PHEME-2016, PHEME-2018, and Weibo data sets by collecting available and relevant images for training the proposed framework. To improve the performance of single-modal rumour detection, i.e., detection based on text input only, a teacher-student framework is employed to distil knowledge from the multimodal model into the single-modal model. Experimental results on the PHEME-2016, PHEME-2018, and Weibo data sets show that the proposed TGNN achieves state-of-the-art performance and generalization ability.
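The abstract does not spell out the distillation objective. As a point of reference only, a common teacher-student formulation (not necessarily the one used in the paper) trains the text-only student on a weighted sum of a temperature-softened KL term against the multimodal teacher's predictions and an ordinary cross-entropy term on the hard labels. A minimal NumPy sketch, where the temperature `T` and mixing weight `alpha` are illustrative hyperparameters:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target distillation loss (a standard formulation, assumed here):
    alpha-weighted KL(teacher || student) at temperature T, plus
    (1 - alpha)-weighted cross-entropy on the hard labels."""
    p_t = softmax(teacher_logits, T)   # teacher's soft targets
    p_s = softmax(student_logits, T)   # student's soft predictions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    p_hard = softmax(student_logits)   # T = 1 for the hard-label term
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    # T**2 rescales the soft term so its gradient magnitude is comparable
    # to the cross-entropy term when T > 1.
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

With `alpha = 1` and identical teacher and student logits the KL term vanishes and the loss is zero; in practice the multimodal teacher's logits supply the extra signal that the text-only student cannot compute from its own input.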