CryptoNite: Revealing the Pitfalls of End-to-End Private Inference at Scale

11/04/2021
by Karthik Garimella, et al.

The privacy concerns of providing deep learning inference as a service have underscored the need for private inference (PI) protocols that protect users' data and the service provider's model using cryptographic methods. Recently proposed PI protocols have achieved significant reductions in PI latency by moving the computationally heavy homomorphic encryption (HE) parts to an offline/pre-compute phase. Paired with recent optimizations that tailor networks for PI, these protocols have achieved performance levels that are tantalizingly close to being practical. In this paper, we conduct a rigorous end-to-end characterization of PI protocols and optimization techniques and find that the current understanding of PI performance is overly optimistic. Specifically, we find that offline storage costs of garbled circuits (GC), a key cryptographic protocol used in PI, on user/client devices are prohibitively high and force much of the expensive offline HE computation into the online phase, resulting in a 10-1000× increase in PI latency. We propose a modified PI protocol that significantly reduces client-side storage costs for a small increase in online latency. Evaluated end-to-end, the modified protocol outperforms current protocols by reducing the mean PI latency by 4× for ResNet18 on TinyImageNet. We conclude with a discussion of several recently proposed PI optimizations in light of these findings and note that many actually increase PI latency when evaluated from an end-to-end perspective.
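The storage pressure described in the abstract can be illustrated with a back-of-the-envelope calculation. The sketch below is not from the paper: the per-ReLU garbled-circuit size and the ReLU count are illustrative assumptions chosen only to show why precomputing GCs for many inferences quickly exhausts client storage.

```python
# Back-of-the-envelope estimate of client-side garbled-circuit (GC) storage
# for the offline phase of a hybrid PI protocol.
# All constants below are illustrative assumptions, not measurements
# reported in the paper.

BYTES_PER_GARBLED_RELU = 18 * 1024  # assumed: ~18 KiB of GC material per ReLU


def offline_gc_storage_bytes(num_relus: int, precomputed_inferences: int) -> int:
    """Storage a client must hold to precompute GCs for a batch of inferences.

    Each inference consumes fresh GC material for every ReLU, so storage
    grows linearly in both the network's ReLU count and the number of
    inferences precomputed offline.
    """
    return num_relus * BYTES_PER_GARBLED_RELU * precomputed_inferences


# Assumed ReLU count on the order of millions, as in ResNet-scale networks.
relus = 2_000_000

for n in (1, 10, 100):
    gib = offline_gc_storage_bytes(relus, n) / 2**30
    print(f"{n:>3} precomputed inference(s): {gib:8.1f} GiB of client storage")
```

Even under these rough assumptions, precomputing GCs for a handful of inferences requires tens to hundreds of GiB on the client, which is why the offline HE/GC work ends up spilling into the online phase in practice.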

Related research:

- Characterizing and Optimizing End-to-End Systems for Private Inference (07/14/2022)
- Faster Secure Comparisons with Offline Phase for Efficient Private Set Intersection (09/28/2022)
- BAYHENN: Combining Bayesian Deep Learning and Homomorphic Encryption for Secure DNN Inference (06/03/2019)
- Impala: Low-Latency, Communication-Efficient Private Deep Learning Inference (05/13/2022)
- Express: Lowering the Cost of Metadata-hiding Communication with Cryptographic Privacy (11/20/2019)
- Robust, privacy-preserving, transparent, and auditable on-device blocklisting (04/06/2023)
- CryptoNAS: Private Inference on a ReLU Budget (06/15/2020)
