Related papers:
- Neural Architecture Search over Decentralized Data
- FedNAS: Federated Deep Learning via Neural Architecture Search
- Direct Federated Neural Architecture Search
- Real-time Federated Evolutionary Neural Architecture Search
- From Federated Learning to Federated Neural Architecture Search: A Survey
- HW-NAS-Bench: Hardware-Aware Neural Architecture Search Benchmark
- Differentially-private Federated Neural Architecture Search
FDNAS: Improving Data Privacy and Model Diversity in AutoML
To prevent the leakage of private information while enabling automated machine intelligence, there is an emerging trend to integrate Federated Learning (FL) and Neural Architecture Search (NAS). Although promising, the combined difficulties of both fields make algorithm development challenging. In particular, how to efficiently search for an optimal neural architecture directly from massive non-IID client data in a federated manner remains an open problem. To tackle this challenge, leveraging advances in proxy-less NAS, we propose a Federated Direct Neural Architecture Search (FDNAS) framework that enables hardware-aware NAS over the decentralized non-IID data of clients. To further adapt to the diverse data distributions of clients, inspired by meta-learning, we propose a Cluster Federated Direct Neural Architecture Search (CFDNAS) framework that achieves client-aware NAS, in the sense that each client learns a deep learning model tailored to its particular data distribution. Extensive experiments on real-world non-IID datasets show state-of-the-art accuracy-efficiency trade-offs across various client hardware and data distributions. Our code will be released publicly upon paper acceptance.
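The abstract describes searching architectures directly from decentralized non-IID data: clients locally update both operation weights and architecture parameters (as in proxy-less, gradient-based NAS), and a server aggregates them FedAvg-style. The following is a minimal illustrative sketch of one such federated round, not the authors' implementation; all names, the two-candidate-op mixed model, and the toy data are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over architecture logits.
    e = np.exp(x - x.max())
    return e / e.sum()

def local_update(w, alpha, X, y, lr=0.1):
    # Toy "supernet": output is a softmax-weighted mix of two linear
    # candidate ops (ProxylessNAS-style relaxation, greatly simplified).
    p = softmax(alpha)
    pred = p[0] * (X @ w[0]) + p[1] * (X @ w[1])
    err = pred - y
    # Gradients of mean-squared error w.r.t. operation weights.
    gw = np.stack([p[0] * X.T @ err, p[1] * X.T @ err]) / len(y)
    # Gradient w.r.t. architecture logits, chained through the softmax.
    out = np.stack([X @ w[0], X @ w[1]])          # each op's raw output
    g_out = (out * err).mean(axis=1)              # dL/d(op contribution)
    galpha = p * (g_out - (p * g_out).sum())
    return w - lr * gw, alpha - lr * galpha

def federated_round(w, alpha, clients):
    # FedAvg-style aggregation of both weights and architecture parameters.
    ws, alphas = zip(*(local_update(w, alpha, X, y) for X, y in clients))
    return np.mean(ws, axis=0), np.mean(alphas, axis=0)

# Two clients with deliberately different (non-IID) data distributions.
d = 3
clients = [(rng.normal(size=(20, d)), rng.normal(size=20)),
           (rng.normal(2.0, 1.0, size=(20, d)), rng.normal(1.0, 1.0, size=20))]
w = np.zeros((2, d))      # weights of the two candidate ops
alpha = np.zeros(2)       # architecture logits, shared across clients
for _ in range(5):
    w, alpha = federated_round(w, alpha, clients)
```

After enough rounds, the aggregated `alpha` shifts toward the candidate op that fits the federation's data best; CFDNAS, by contrast, would maintain such parameters per cluster of similar clients rather than one global copy.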