Not Just Cloud Privacy: Protecting Client Privacy in Teacher-Student Learning

by Lichao Sun et al.

Ensuring the privacy of sensitive data used to train modern machine learning models is of paramount importance in many areas of practice. One popular recent approach to addressing these concerns uses differential privacy via a "teacher-student" model, wherein the teacher provides the student with useful but noisy information, ideally allowing the student model to perform well on a given task. However, these studies only address the privacy concerns of the teacher, by assuming the student owns a public but unlabelled dataset. In real life, the student also has privacy concerns about its unlabelled data and therefore requires privacy protection for any data sent to the teacher. In this work, we redesign the privacy-preserving "teacher-student" model by adopting both private arbitrary masking and local differential privacy, which protect the sensitive information in each student sample. However, traditionally trained teacher models are not robust to such perturbed data. We use adversarial learning techniques to improve the teacher's robustness to perturbed samples, so that it can return good feedback without access to the full private information of each student sample. The experimental results demonstrate the effectiveness of our new privacy-preserving "teacher-student" model.
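As a rough illustration of the client-side step described above, the sketch below combines random feature masking with a Laplace-noise local perturbation before a sample would be sent to the teacher. All names and parameters here are hypothetical; the paper's actual masking scheme and LDP mechanism may differ.

```python
import numpy as np

def perturb_sample(x, mask_rate=0.5, epsilon=1.0, sensitivity=1.0, rng=None):
    """Illustrative client-side perturbation (not the paper's exact mechanism):
    randomly mask a fraction of features, and add Laplace noise to the rest
    so the teacher never sees the raw private sample."""
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(x.shape) >= mask_rate              # features surviving the mask
    noise = rng.laplace(0.0, sensitivity / epsilon, x.shape)  # Laplace scale b = sensitivity / epsilon
    return np.where(keep, x + noise, 0.0)                # masked features are zeroed out

# Example: perturb one 8-dimensional sample with a fixed seed
x = np.ones(8)
x_priv = perturb_sample(x, mask_rate=0.25, epsilon=2.0,
                        rng=np.random.default_rng(0))
```

A smaller `epsilon` yields larger noise (stronger privacy, noisier teacher feedback), which is why the teacher-side adversarial training described in the abstract is needed to keep its feedback useful.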


Differentially Private Deep Learning with Smooth Sensitivity

Ensuring the privacy of sensitive data used to train modern machine lear...

Private Knowledge Transfer via Model Distillation with Generative Adversarial Networks

The deployment of deep learning applications has to address the growing ...

SEDML: Securely and Efficiently Harnessing Distributed Knowledge in Machine Learning

Training high-performing deep learning models requires a rich amount of d...

Image-Hashing-Based Anomaly Detection for Privacy-Preserving Online Proctoring

Online proctoring has become a necessity in online teaching. Video-based...

Mitigating Unintended Memorization in Language Models via Alternating Teaching

Recent research has shown that language models have a tendency to memori...

On Sharing Models Instead of Data using Mimic learning for Smart Health Applications

Electronic health records (EHR) systems contain vast amounts of medical ...

An Ensemble Teacher-Student Learning Approach with Poisson Sub-sampling to Differential Privacy Preserving Speech Recognition

We propose an ensemble learning framework with Poisson sub-sampling to e...
