Deep Class Incremental Learning from Decentralized Data

03/11/2022
by   Xiaohan Zhang, et al.

In this paper, we focus on a new and challenging decentralized machine learning paradigm in which data arrive continually and are stored across multiple repositories. We initiate the study of data decentralized class-incremental learning (DCIL) by making the following contributions. First, we formulate the DCIL problem and develop the experimental protocol. Second, we introduce a paradigm for constructing basic decentralized counterparts of typical (centralized) class-incremental learning approaches, thereby establishing a benchmark for the DCIL study. Third, we propose a Decentralized Composite knowledge Incremental Distillation framework (DCID) that continually transfers knowledge from historical models and multiple local sites to the general model. DCID consists of three main components: local class-incremental learning, collaborative knowledge distillation among local models, and aggregated knowledge distillation from the local models to the general one. We comprehensively investigate the DCID framework using different implementations of these three components. Extensive experimental results demonstrate the effectiveness of DCID. The code for the baseline methods and the proposed framework will be released at https://github.com/zxxxxh/DCIL.
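
The three components can be read as three distillation losses layered on top of standard cross-entropy training. Below is a minimal, illustrative PyTorch sketch of one such composite update; it is not the authors' released implementation. All names (kd_loss, dcid_step, lam_old, lam_peer, lam_agg) are hypothetical, standard soft-target distillation is assumed for each term, and a single shared batch stands in for the per-site data that a real decentralized setting would use.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Soft-target distillation (Hinton et al., 2015): KL divergence between
    # temperature-softened teacher and student distributions.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def dcid_step(x, y, local_models, old_models, general_model,
              lam_old=1.0, lam_peer=0.5, lam_agg=1.0):
    # Component 1 (local CIL): cross-entropy on current classes plus
    # distillation from each site's frozen historical model.
    # Component 2 (collaborative KD): each local model also matches the
    # averaged (detached) predictions of its peers (assumes >= 2 sites).
    # Component 3 (aggregated KD): the general model distills from the
    # ensemble of local models.
    local_logits = [m(x) for m in local_models]
    with torch.no_grad():
        old_logits = [m(x) for m in old_models]  # frozen teachers

    loss_local = x.new_zeros(())
    for i, logits in enumerate(local_logits):
        peers = [l.detach() for j, l in enumerate(local_logits) if j != i]
        peer_teacher = torch.stack(peers).mean(dim=0)
        loss_local = (loss_local
                      + F.cross_entropy(logits, y)
                      + lam_old * kd_loss(logits, old_logits[i])
                      + lam_peer * kd_loss(logits, peer_teacher))

    ensemble = torch.stack([l.detach() for l in local_logits]).mean(dim=0)
    loss_agg = lam_agg * kd_loss(general_model(x), ensemble)
    return loss_local, loss_agg

# Toy usage: two sites, frozen historical models, random data (shapes only).
torch.manual_seed(0)
make = lambda: nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
local_models = [make(), make()]
old_models = [make().eval() for _ in range(2)]
for m in old_models:
    for p in m.parameters():
        p.requires_grad_(False)
general_model = make()
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss_local, loss_agg = dcid_step(x, y, local_models, old_models, general_model)
(loss_local + loss_agg).backward()
```

Detaching the peer and ensemble teachers keeps each loss term flowing gradients only into its own student, so the local models and the general model can be optimized jointly or in separate steps.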
