FedACK: Federated Adversarial Contrastive Knowledge Distillation for Cross-Lingual and Cross-Model Social Bot Detection

03/10/2023
by Yingguang Yang, et al.

Social bot detection is of paramount importance to the resilience and security of online social platforms. State-of-the-art detection models are siloed and have largely overlooked the variety of data characteristics found across multiple cross-lingual platforms. Meanwhile, the heterogeneity of data distributions and model architectures makes it challenging to devise an efficient cross-platform and cross-model detection framework. In this paper, we propose FedACK, a new federated adversarial contrastive knowledge distillation framework for social bot detection. We devise a GAN-based federated knowledge distillation mechanism for efficiently transferring knowledge of the data distribution among clients. In particular, a global generator is used to extract the knowledge of the global data distribution and distill it into each client's local model. We leverage a local discriminator to enable customized model design and a local generator for data augmentation with hard-to-decide samples. Local training is conducted as multi-stage adversarial and contrastive learning to enforce consistent feature spaces among clients and to constrain the optimization direction of local models, reducing the divergence between local and global models. Experiments demonstrate that FedACK outperforms state-of-the-art approaches in terms of accuracy, communication efficiency, and feature space consistency.
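To make the distillation and contrastive objectives described above concrete, the sketch below gives a minimal, hypothetical PyTorch rendering of one client update: a global generator synthesizes class-conditioned feature vectors that are distilled into the client's classifier head, and a supervised-contrastive term encourages a consistent feature space. The module names, dimensions, and loss weights are assumptions for illustration, not the authors' implementation, and the adversarial training of the local generator and discriminator is omitted.

```python
# Illustrative sketch of a FedACK-style client round (assumed architecture and hyperparameters).
import torch
import torch.nn as nn
import torch.nn.functional as F

LATENT, FEAT, N_CLASSES = 32, 64, 2  # assumed sizes

class Generator(nn.Module):
    """Global generator: maps (noise, label) to a synthetic feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT + N_CLASSES, 128), nn.ReLU(),
            nn.Linear(128, FEAT))

    def forward(self, z, y):
        y_onehot = F.one_hot(y, N_CLASSES).float()
        return self.net(torch.cat([z, y_onehot], dim=1))

class LocalModel(nn.Module):
    """Client model: feature encoder plus classifier head (architecture may differ per client)."""
    def __init__(self, in_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, FEAT), nn.ReLU())
        self.head = nn.Linear(FEAT, N_CLASSES)

    def forward(self, x):
        f = self.encoder(x)
        return f, self.head(f)

def client_update(local, global_gen, x, y, steps=1, tau=0.5, lam_kd=1.0, lam_con=0.5):
    """One client round: supervised loss + distillation from the global generator
    + a supervised-contrastive term for feature-space consistency."""
    opt = torch.optim.Adam(local.parameters(), lr=1e-3)
    for _ in range(steps):
        feat, logits = local(x)
        loss = F.cross_entropy(logits, y)

        # Knowledge distillation: the classifier head should handle features drawn
        # from the (generator-approximated) global distribution as well as local ones.
        z = torch.randn(len(y), LATENT)
        with torch.no_grad():
            synth_feat = global_gen(z, y)
        loss = loss + lam_kd * F.cross_entropy(local.head(synth_feat), y)

        # Supervised contrastive loss: pull same-class features together across the batch.
        f = F.normalize(feat, dim=1)
        sim = f @ f.t() / tau
        mask = (y.unsqueeze(0) == y.unsqueeze(1)).float() - torch.eye(len(y))
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
        loss = loss + lam_con * (-(mask * log_prob).sum() / mask.sum().clamp(min=1))

        opt.zero_grad()
        loss.backward()
        opt.step()
    return local

# One communication round over toy clients (random tensors stand in for account features).
global_gen = Generator()
clients = [LocalModel() for _ in range(3)]
for c in clients:
    x, y = torch.randn(32, 16), torch.randint(0, N_CLASSES, (32,))
    client_update(c, global_gen, x, y)
```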



