RNAS-CL: Robust Neural Architecture Search by Cross-Layer Knowledge Distillation

01/19/2023
by Utkarsh Nath, et al.

Deep Neural Networks are vulnerable to adversarial attacks. Neural Architecture Search (NAS), one of the driving tools behind deep neural networks, has demonstrated superior prediction accuracy in a variety of machine learning applications. However, it is unclear how NAS-derived architectures perform against adversarial attacks. Given a robust teacher, it is natural to ask whether NAS can produce a robust architecture by inheriting robustness from that teacher. In this paper, we propose Robust Neural Architecture Search by Cross-Layer Knowledge Distillation (RNAS-CL), a novel NAS algorithm that improves robustness by learning from a robust teacher through cross-layer knowledge distillation. Unlike previous knowledge distillation methods, which encourage the student and teacher outputs to match only at the last layer, RNAS-CL automatically searches for the best teacher layer to supervise each student layer. Experimental results demonstrate the effectiveness of RNAS-CL and show that it produces small yet robust neural architectures.
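The abstract only outlines the cross-layer idea, so the sketch below illustrates one plausible reading of it: each student layer carries learnable logits over all teacher layers, a softmax over those logits selects which teacher layer supervises it, and a projection head aligns feature dimensions before a matching loss. All class and parameter names, and the use of an MSE feature-matching loss, are illustrative assumptions rather than the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossLayerKD(nn.Module):
    """Minimal sketch of a cross-layer distillation loss.

    For every student layer, learnable logits over all teacher layers
    decide (via softmax) which teacher layer supervises it. 1x1
    projections map student features to each teacher layer's channel
    width before comparison. Hypothetical names; not the paper's code.
    """

    def __init__(self, student_dims, teacher_dims):
        super().__init__()
        # One 1x1 projection per (student layer, teacher layer) pair.
        self.proj = nn.ModuleList(
            nn.ModuleList(nn.Conv2d(s, t, kernel_size=1) for t in teacher_dims)
            for s in student_dims
        )
        # Learnable logits selecting a teacher layer for each student layer.
        self.layer_logits = nn.Parameter(
            torch.zeros(len(student_dims), len(teacher_dims))
        )

    def forward(self, student_feats, teacher_feats):
        total = 0.0
        for i, s_feat in enumerate(student_feats):
            weights = F.softmax(self.layer_logits[i], dim=0)
            for j, t_feat in enumerate(teacher_feats):
                s_proj = self.proj[i][j](s_feat)
                # Match spatial resolution before the feature-matching loss.
                s_proj = F.adaptive_avg_pool2d(s_proj, t_feat.shape[-2:])
                total = total + weights[j] * F.mse_loss(s_proj, t_feat.detach())
        return total
```

In practice such a term would be added to the usual task loss (e.g. cross-entropy) during the architecture search, so that the student is optimized jointly for accuracy and for agreement with the most suitable teacher layer at each depth.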


Related research

11/29/2019 · Towards Oracle Knowledge Distillation with Neural Architecture Search
We present a novel framework of knowledge distillation that is capable o...

06/12/2021 · LE-NAS: Learning-based Ensemble with NAS for Dose Prediction
Radiation therapy treatment planning is a complex process, as the target...

02/11/2023 · Improving Differentiable Architecture Search via Self-Distillation
Differentiable Architecture Search (DARTS) is a simple yet efficient Neu...

03/14/2019 · Improving Neural Architecture Search Image Classifiers via Ensemble Learning
Finding the best neural network architecture requires significant time, ...

02/03/2023 · Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits
The use of Neural Architecture Search (NAS) techniques to automate the d...

06/15/2020 · Multi-fidelity Neural Architecture Search with Knowledge Distillation
Neural architecture search (NAS) targets at finding the optimal architec...

05/14/2023 · Improving Defensive Distillation using Teacher Assistant
Adversarial attacks pose a significant threat to the security and safety...
