CryptoNN: Training Neural Networks over Encrypted Data

04/15/2019
by Runhua Xu, et al.

Emerging neural network based machine learning techniques such as deep learning and its variants have shown tremendous potential in many application domains. However, they raise serious privacy concerns, because highly privacy-sensitive user data may leak when it is collected to train neural network models for predictive tasks. To address these concerns, several privacy-preserving approaches have been proposed in the literature that use either secure multi-party computation (SMC) or homomorphic encryption (HE) as the underlying mechanism. However, neither of these cryptographic approaches offers an efficient way to construct a privacy-preserving machine learning model that supports both the training and inference phases. To tackle this issue, we propose CryptoNN, a framework that trains a neural network model over encrypted data by using the emerging functional encryption paradigm instead of SMC or HE. We also construct a functional encryption scheme for basic arithmetic computation to meet the requirements of the proposed CryptoNN framework. We present a performance evaluation and security analysis of the underlying crypto scheme, and we show through our experiments that CryptoNN achieves accuracy similar to that of the baseline neural network models on the MNIST dataset.
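The core idea is to let a server evaluate only specific functions (such as the matrix and vector products needed for feed-forward and back-propagation) over encrypted inputs via functional encryption. The sketch below is a toy, non-cryptographic mock of a functional-encryption-for-inner-products interface, intended only to illustrate that interface under stated assumptions; the names (MockInnerProductFE, encrypt, keygen, decrypt) are illustrative and are not the CryptoNN construction or API from the paper.

```python
# Minimal sketch (NOT the paper's construction): a mock functional-encryption
# interface for inner products. It illustrates how a server could learn <w, x>
# for its own weight vector w from an encrypted input x, and nothing more.
# A real scheme would provide cryptographic hiding; this mock only models the API.

import numpy as np


class MockInnerProductFE:
    """Toy stand-in for a functional encryption scheme for inner products."""

    def __init__(self, seed=0):
        self._rng = np.random.default_rng(seed)

    def encrypt(self, x):
        # Mask the plaintext so the "server" never handles x directly in this demo.
        # The mask is retained by the data owner / key authority, not the server.
        mask = self._rng.normal(size=x.shape)
        return {"masked": x + mask}, mask

    @staticmethod
    def keygen(w, mask):
        # Function key for w: encodes the correction term needed to recover <w, x>.
        return {"w": w, "correction": float(w @ mask)}

    @staticmethod
    def decrypt(fkey, ciphertext):
        # Server-side evaluation: recovers exactly the inner product <w, x>.
        return float(fkey["w"] @ ciphertext["masked"]) - fkey["correction"]


# Usage: the server evaluates a first-layer pre-activation over an encrypted input,
# then continues training on the resulting plaintext intermediate values.
fe = MockInnerProductFE()
x = np.array([0.5, -1.0, 2.0])            # privacy-sensitive input feature vector
w = np.array([0.1, 0.2, 0.3])             # one row of the server's first-layer weights

ct, mask = fe.encrypt(x)                  # data owner / authority side
fkey = MockInnerProductFE.keygen(w, mask)  # authority issues a key for the server's w
z = MockInnerProductFE.decrypt(fkey, ct)  # server learns only <w, x>
assert abs(z - float(w @ x)) < 1e-9
```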

