Toward Extremely Lightweight Distracted Driver Recognition With Distillation-Based Neural Architecture Search and Knowledge Transfer

02/09/2023
by Dichao Liu, et al.

The number of traffic accidents has been continuously increasing worldwide in recent years. Many accidents are caused by distracted drivers, who take their attention away from driving. Motivated by the success of Convolutional Neural Networks (CNNs) in computer vision, many researchers have developed CNN-based algorithms that recognize distracted driving from a dashcam and warn the driver against unsafe behaviors. However, current models have too many parameters to be feasible for vehicle-mounted computing. This work proposes a novel knowledge-distillation-based framework to solve this problem. The proposed framework first constructs a high-performance teacher network by progressively strengthening the robustness to illumination changes from the shallow to the deep layers of a CNN. The teacher network is then used to guide the architecture search process of a student network through knowledge distillation. After that, the teacher network transfers its knowledge to the searched student network, again by knowledge distillation. Experimental results on the State Farm Distracted Driver Detection Dataset and the AUC Distracted Driver Dataset show that the proposed approach is highly effective for recognizing distracted driving behaviors from images: (1) the teacher network's accuracy surpasses the previous best accuracy; (2) the student network achieves very high accuracy with only 0.42M parameters (around 55% of the previous most lightweight model). Furthermore, the student network architecture can be extended to a spatial-temporal 3D CNN for recognizing distracted driving from video clips. The 3D student network largely surpasses the previous best accuracy with only 2.03M parameters on the Drive&Act Dataset. The source code is available at https://github.com/Dichao-Liu/Lightweight_Distracted_Driver_Recognition_with_Distillation-Based_NAS_and_Knowledge_Transfer.
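Both distillation stages described above (guiding the architecture search and then training the searched student) build on a soft-target distillation loss. Below is a minimal PyTorch sketch of the standard form of that loss (Hinton et al.) as it might be applied to driver-behavior logits; the temperature `T` and mixing weight `alpha` are illustrative assumptions, not values reported in the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target knowledge distillation loss (illustrative sketch).

    T and alpha are hypothetical hyperparameters, not taken from the paper.
    """
    # Softened teacher and student distributions; the KL term is scaled by
    # T^2 to keep gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth behavior labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In a pipeline like the one described, a loss of this form could serve as the ranking signal for candidate student architectures during the search stage, and then as the training objective when the teacher transfers knowledge to the final student.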


