Bridging the Gap: Differentially Private Equivariant Deep Learning for Medical Image Analysis

09/09/2022
by Florian A. Hölzl, et al.

Machine learning with formal privacy-preserving techniques such as Differential Privacy (DP) allows valuable insights to be derived from sensitive medical imaging data while promising to protect patient privacy, but it usually comes at the cost of a sharp privacy-utility trade-off. In this work, we propose to use steerable equivariant convolutional networks for medical image analysis with DP. Their improved feature quality and parameter efficiency yield remarkable accuracy gains, narrowing the privacy-utility gap.
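To make the general recipe concrete, the sketch below combines a small C8-steerable convolutional classifier (built with the e2cnn library) with a manual DP-SGD step that clips per-example gradients and adds Gaussian noise. This is an illustrative assumption of how the two ingredients fit together, not the authors' architecture, hyperparameters, or training code; the class name SteerableClassifier, the function dp_sgd_step, the 32x32 single-channel input size, and all layer sizes are made up for the example.

import torch
from torch import nn
from e2cnn import gspaces
from e2cnn import nn as enn

class SteerableClassifier(nn.Module):
    # Illustrative C8-steerable CNN (rotations by 45 degrees) with a linear head.
    # Assumes 32x32 single-channel inputs; not the configuration used in the paper.
    def __init__(self, n_classes=2):
        super().__init__()
        self.r2_act = gspaces.Rot2dOnR2(N=8)
        self.in_type = enn.FieldType(self.r2_act, [self.r2_act.trivial_repr])
        hid = enn.FieldType(self.r2_act, 8 * [self.r2_act.regular_repr])  # 8 regular fields = 64 channels
        self.features = enn.SequentialModule(
            enn.R2Conv(self.in_type, hid, kernel_size=5, padding=2),
            enn.ReLU(hid),
            enn.PointwiseAvgPool(hid, kernel_size=4),  # 32x32 -> 8x8
        )
        self.head = nn.Linear(64 * 8 * 8, n_classes)

    def forward(self, x):
        x = enn.GeometricTensor(x, self.in_type)   # wrap the plain tensor with its field type
        x = self.features(x).tensor                # unwrap back to a regular torch.Tensor
        return self.head(x.flatten(1))

def dp_sgd_step(model, loss_fn, xb, yb, optimizer,
                max_grad_norm=1.0, noise_multiplier=1.0):
    # One DP-SGD step: per-example gradient clipping followed by Gaussian noise.
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x_i, y_i in zip(xb, yb):
        optimizer.zero_grad()
        loss = loss_fn(model(x_i.unsqueeze(0)), y_i.unsqueeze(0))
        loss.backward()
        norm = torch.sqrt(sum((p.grad ** 2).sum() for p in params))
        scale = torch.clamp(max_grad_norm / (norm + 1e-6), max=1.0)
        for s, p in zip(summed, params):
            s += p.grad * scale                    # clip each example's gradient, then accumulate
    for s, p in zip(summed, params):
        noise = torch.randn_like(s) * noise_multiplier * max_grad_norm
        p.grad = (s + noise) / len(xb)             # noisy average gradient
    optimizer.step()

# Example usage (xb: batch of (1, 32, 32) scans, yb: integer labels):
# model = SteerableClassifier(n_classes=2)
# opt = torch.optim.SGD(model.parameters(), lr=0.1)
# dp_sgd_step(model, nn.functional.cross_entropy, xb, yb, opt)

The per-example loop above trades speed for simplicity; in practice one would track the (epsilon, delta) privacy budget with an accountant and use vectorized per-sample gradients, but the clipping-plus-noise structure is the part that interacts with the equivariant layers' parameter efficiency.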
