HE-MAN – Homomorphically Encrypted MAchine learning with oNnx models

02/16/2023
by Martin Nocker, et al.

Machine learning (ML) algorithms are increasingly important for the success of products and services, especially given the growing amount and availability of data. This also holds for areas handling sensitive data, e.g., applications processing medical records or facial images. However, people are reluctant to hand their personal sensitive data to an ML service provider. At the same time, service providers have a strong interest in protecting their intellectual property and therefore refrain from publicly sharing their ML models. Fully homomorphic encryption (FHE) is a promising technique that lets individuals use ML services without giving up privacy while simultaneously protecting the service provider's ML model. Despite steady improvements, FHE is still hardly integrated into today's ML applications. We introduce HE-MAN, an open-source two-party machine learning toolset for privacy-preserving inference with ONNX models and homomorphically encrypted data: neither the model nor the input data has to be disclosed. HE-MAN abstracts cryptographic details away from the users, so expertise in FHE is not required for either party. HE-MAN's security relies on its underlying FHE schemes; for now, we integrate two of them, namely Concrete and TenSEAL. Compared to prior work, HE-MAN supports a broad range of ML models in ONNX format out of the box without sacrificing accuracy. We evaluate the performance of our implementation on different network architectures for handwritten-digit classification and face recognition, and report the accuracy and latency of homomorphically encrypted inference. Cryptographic parameters are derived automatically by the tools. We show that the accuracy of HE-MAN is on par with that of models operating on plaintext input, while inference latency is several orders of magnitude higher than in the plaintext case.
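
To make the setting concrete: with a TenSEAL-style backend, the client encrypts its input under a CKKS context and the server evaluates the model's operations directly on the ciphertexts. The sketch below shows this kind of computation using TenSEAL directly; the context parameters and the toy linear layer are illustrative assumptions and do not reflect HE-MAN's actual code, API, or the cryptographic parameters it derives.

```python
import tenseal as ts

# CKKS context; poly_modulus_degree and moduli are illustrative assumptions,
# not the parameters HE-MAN derives automatically.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()  # rotations needed for vector-matrix products

# Client side: encrypt the input feature vector; the secret key stays local.
x = [0.1, 0.2, 0.3, 0.4]
enc_x = ts.ckks_vector(context, x)

# Server side: evaluate a tiny (hypothetical) linear layer on the ciphertext.
W = [[0.5, -0.1],
     [0.3,  0.2],
     [-0.4, 0.7],
     [0.1,  0.1]]         # 4x2 plaintext weight matrix
b = [0.05, -0.02]         # plaintext bias

enc_y = enc_x.mm(W) + b   # encrypted vector-matrix product plus bias
enc_y = enc_y * enc_y     # square activation, a common FHE-friendly nonlinearity

# Client side: only the secret-key holder can decrypt the result.
print(enc_y.decrypt())
```

In HE-MAN, this kind of evaluation is driven by the ONNX graph and the cryptographic parameters are derived by the toolset, so neither party has to write such code.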

Related research:
- Privacy-Preserving XGBoost Inference (11/09/2020): Although machine learning (ML) is widely used for predictive tasks, ther...
- Efficient CNN Building Blocks for Encrypted Data (01/30/2021): Machine learning on encrypted data can address the concerns related to p...
- Scaling up Trustless DNN Inference with Zero-Knowledge Proofs (10/17/2022): As ML models have increased in capabilities and accuracy, so has the com...
- Partially Oblivious Neural Network Inference (10/27/2022): Oblivious inference is the task of outsourcing a ML model, like neural-n...
- ezDPS: An Efficient and Zero-Knowledge Machine Learning Inference Pipeline (12/11/2022): Machine Learning as a service (MLaaS) permits resource-limited clients t...
- Machine Learning in Precision Medicine to Preserve Privacy via Encryption (02/05/2021): Precision medicine is an emerging approach for disease treatment and pre...
- Privacy-Preserving Tree-Based Inference with Fully Homomorphic Encryption (02/13/2023): Privacy enhancing technologies (PETs) have been proposed as a way to pro...
