Communication-Efficient and Decentralized Multi-Task Boosting while Learning the Collaboration Graph
We study the decentralized machine learning scenario where many users collaborate to learn personalized models based on (i) their local datasets and (ii) a similarity graph over the users' learning tasks. Our approach trains nonlinear classifiers via multi-task boosting, without exchanging personal data and with low communication cost. When background knowledge about task similarities is not available, we propose to jointly learn the personalized models and a sparse collaboration graph through an alternating optimization procedure. We analyze the convergence rate, memory consumption, and communication complexity of our decentralized algorithms, and demonstrate the benefits of our approach over competing techniques on synthetic and real datasets.
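To make the alternating optimization idea concrete, here is a minimal, hypothetical sketch in Python: it alternates between (a) updating each user's personalized model from its local data plus graph-weighted neighbor models, and (b) re-estimating a sparse collaboration graph from pairwise model similarity. The linear models, squared loss, similarity kernel, and thresholding rule are illustrative assumptions for readability, not the paper's exact boosting-based algorithm.

```python
import numpy as np

def update_models(X, y, W, Theta, lam=0.1, lr=0.05, steps=20):
    """Fix the graph W; take local gradient steps with graph-based regularization."""
    n_users = len(X)
    for _ in range(steps):
        for i in range(n_users):
            # local squared-loss gradient on user i's data
            grad = X[i].T @ (X[i] @ Theta[i] - y[i]) / len(y[i])
            # pull the model toward the graph-weighted average of neighbor models
            neighbor_avg = sum(W[i, j] * Theta[j] for j in range(n_users))
            grad += lam * (Theta[i] - neighbor_avg)
            Theta[i] -= lr * grad
    return Theta

def update_graph(Theta, tau=0.5):
    """Fix the models; re-estimate a sparse graph from pairwise model similarity."""
    n_users = len(Theta)
    W = np.zeros((n_users, n_users))
    for i in range(n_users):
        for j in range(n_users):
            if i != j:
                sim = np.exp(-np.linalg.norm(Theta[i] - Theta[j]) ** 2)
                W[i, j] = sim if sim > tau else 0.0   # hard threshold -> sparsity
        if W[i].sum() > 0:
            W[i] /= W[i].sum()                        # row-normalize edge weights
    return W

# toy setup: 3 users with 5-dimensional linear tasks
rng = np.random.default_rng(0)
X = [rng.normal(size=(30, 5)) for _ in range(3)]
true = [rng.normal(size=5) for _ in range(3)]
y = [x @ t + 0.1 * rng.normal(size=30) for x, t in zip(X, true)]

Theta = [np.zeros(5) for _ in range(3)]
W = np.full((3, 3), 1 / 2)
np.fill_diagonal(W, 0)                                # start from a uniform graph

for _ in range(10):                                   # alternating optimization
    Theta = update_models(X, y, W, Theta)
    W = update_graph(Theta)
```

In a truly decentralized deployment, the model-update step would be carried out by each user exchanging only model parameters with its current neighbors, and the graph-update step would rely on pairwise similarity scores rather than raw data, which keeps personal datasets local.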