Reducing ReLU Count for Privacy-Preserving CNN Speedup

01/28/2021
by Inbar Helbitz, et al.

Privacy-preserving machine learning algorithms must balance classification accuracy with data privacy. This can be done by combining cryptographic tools with machine learning models such as Convolutional Neural Networks (CNNs). CNNs typically interleave two types of operations: a convolutional or linear layer, followed by a non-linear function such as ReLU. Each type can be implemented efficiently with a different cryptographic tool, but these tools require different data representations, and switching between them is time-consuming and expensive. Recent research suggests that ReLU is responsible for most of the communication bandwidth, since ReLU is usually applied at every pixel (activation) location, which is quite expensive. We propose to share ReLU operations. Specifically, the ReLU decision of one activation can be reused by others, and we explore different ways to group activations and different ways to determine the ReLU decision for such a group. Experiments on several datasets reveal that we can cut the number of ReLU operations by up to three orders of magnitude and, as a result, cut the communication bandwidth by more than 50%.
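The idea of a shared ReLU decision can be sketched in plain NumPy. This is a hypothetical illustration, not the authors' implementation: activations are partitioned into fixed-size groups, one ReLU decision (here, the sign of the group mean — one of several possible grouping/decision rules) is computed per group, and that single decision zeroes or passes the entire group. In a secure-computation setting, one sign comparison per group replaces one per activation, which is where the bandwidth saving would come from.

```python
import numpy as np

def shared_relu(x, group_size=4):
    """Apply one ReLU decision per group of activations (illustrative sketch).

    The sign of each group's mean decides whether the whole group is
    zeroed or passed through, so only one comparison is needed per
    `group_size` activations instead of one per activation.
    """
    flat = x.reshape(-1)
    pad = (-len(flat)) % group_size            # pad so length divides evenly
    padded = np.concatenate([flat, np.zeros(pad)])
    groups = padded.reshape(-1, group_size)
    # One decision per group: keep the group iff its mean is positive.
    mask = groups.mean(axis=1, keepdims=True) > 0
    out = (groups * mask).reshape(-1)[:len(flat)]
    return out.reshape(x.shape)
```

Note that, unlike per-activation ReLU, negative values inside a "kept" group survive, so this trades exactness of the activation function for a large reduction in the number of comparisons.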


