GuardNN: Secure DNN Accelerator for Privacy-Preserving Deep Learning

08/26/2020
by Weizhe Hua, et al.

This paper proposes GuardNN, a secure deep neural network (DNN) accelerator that provides strong hardware-based protection for user data and model parameters even in an untrusted environment. GuardNN shows that the architecture and protection can be customized for a specific application to provide strong confidentiality and integrity guarantees with negligible overhead. The design of the GuardNN instruction set reduces the TCB to just the accelerator and enables confidentiality protection without the overhead of integrity protection. GuardNN also introduces a new application-specific memory protection scheme that minimizes the overhead of memory encryption and integrity verification. The scheme shows that most of the off-chip metadata in today's state-of-the-art memory protection can be removed by exploiting the known memory access patterns of a DNN accelerator. GuardNN is implemented as an FPGA prototype, which demonstrates effective protection with less than 2% performance overhead for inference across a variety of modern DNN models.
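The core idea behind the memory protection scheme can be illustrated with a small sketch. In conventional counter-mode memory encryption, a per-block version counter must be stored (and integrity-protected) off-chip so that the same address is never encrypted twice with the same counter. Because a DNN accelerator's access pattern is known in advance, the counter can instead be derived on the fly from the execution state, so no off-chip counter storage is needed. The sketch below is a hypothetical toy model, not the GuardNN implementation: it uses a SHA-256-based keystream in place of the hardware AES counter-mode engine, and the `address`/`version` derivation is illustrative only.

```python
import hashlib

def keystream(key: bytes, address: int, version: int, length: int) -> bytes:
    """Toy keystream generator standing in for hardware AES counter mode.

    The counter input is (address, version). In a GuardNN-style design the
    version is derived from the known DNN access pattern (e.g. the layer or
    inference index) rather than being read from an off-chip counter table.
    """
    out = b""
    block = 0
    while len(out) < length:
        out += hashlib.sha256(
            key
            + address.to_bytes(8, "big")
            + version.to_bytes(8, "big")
            + block.to_bytes(4, "big")
        ).digest()
        block += 1
    return out[:length]

def encrypt(key: bytes, address: int, version: int, data: bytes) -> bytes:
    ks = keystream(key, address, version, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# XOR with the keystream is its own inverse.
decrypt = encrypt

if __name__ == "__main__":
    key = b"0123456789abcdef"
    weights = b"layer-0 weight block"

    # Weights are written once and read-only during inference, so their
    # version can be a constant (nothing to store or update off-chip).
    ct = encrypt(key, address=0x1000, version=0, data=weights)
    assert decrypt(key, 0x1000, 0, ct) == weights

    # Activations are rewritten every inference; using the inference count
    # as the version keeps (address, counter) pairs unique without storing
    # a counter per memory block.
    act = b"layer-0 activations"
    ct1 = encrypt(key, address=0x2000, version=1, data=act)
    ct2 = encrypt(key, address=0x2000, version=2, data=act)
    assert ct1 != ct2  # same data, same address, distinct ciphertexts
```

The key property is that the version is a pure function of execution state the accelerator already tracks, which is what lets the scheme drop the off-chip counter and integrity-tree metadata that general-purpose memory protection must maintain.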
