
Synthetic Distracted Driving (SynDD1) dataset for analyzing distracted behaviors and various gaze zones of a driver

by Mohammed Shaiqur Rahman, et al.

This article presents a synthetic distracted driving (SynDD1) dataset for machine learning models that detect and analyze drivers' distracted behaviors and gaze zones. We collected the data in a stationary vehicle using three in-vehicle cameras: one on the dashboard, one near the rearview mirror, and one at the top corner of the right-side window. The dataset contains two activity types, distracted activities and gaze zones, for each participant; each activity type has two sets, one without appearance blocks and one with appearance blocks such as wearing a hat or sunglasses. The order and duration of each activity are randomized for each participant. In addition, the dataset contains manual annotations for each activity, including its start and end time. Researchers can use this dataset to evaluate the performance of machine learning algorithms for classifying drivers' distracting activities and gaze zones.
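Since each activity is annotated with a start and end time, a typical first step is to map a video timestamp to its annotated label. The sketch below illustrates that lookup; the annotation file format, column names, and sample values are assumptions for illustration, not the dataset's actual schema.

```python
import csv
import io

# Hypothetical annotation layout (assumed, not from the paper):
# one row per activity with participant id, activity label,
# and annotated start/end times in seconds.
SAMPLE = """participant,activity,start_time,end_time
p01,texting,12.0,25.5
p01,drinking,40.0,52.0
"""

def load_annotations(fp):
    """Parse annotation rows into (activity, start, end) tuples."""
    return [
        (row["activity"], float(row["start_time"]), float(row["end_time"]))
        for row in csv.DictReader(fp)
    ]

def activity_at(annotations, t):
    """Return the annotated activity at time t, or None if unannotated."""
    for activity, start, end in annotations:
        if start <= t <= end:
            return activity
    return None

anns = load_annotations(io.StringIO(SAMPLE))
print(activity_at(anns, 15.0))  # falls inside the texting interval
print(activity_at(anns, 30.0))  # between activities: no label
```

A lookup like this yields per-frame labels for training or scoring a classifier against the annotated intervals.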

Related articles:

Heatmap-Based Method for Estimating Drivers' Cognitive Distraction

Dynamics of Driver's Gaze: Explorations in Behavior Modeling & Maneuver Prediction

(Safe) SMART Hands: Hand Activity Analysis and Distraction Alerts Using a Multi-Camera Framework

Utilizing Eye Gaze to Enhance the Generalization of Imitation Networks to Unseen Environments

Safe Control Transitions: Machine Vision Based Observable Readiness Index and Data-Driven Takeover Time Prediction