Using Variable Natural Environment Brain-Computer Interface Stimuli for Real-time Humanoid Robot Navigation

11/26/2018
by Nik Khadijah Nik Aznan, et al.

This paper addresses the challenge of humanoid robot teleoperation in a natural indoor environment via a Brain-Computer Interface (BCI). We leverage deep Convolutional Neural Network (CNN) based image and signal understanding to facilitate both real-time object detection and dry-Electroencephalography (EEG) based decoding of human cortical brain bio-signals. We employ recent advances in dry-EEG technology to stream and collect cortical waveforms from subjects while they fixate on variable Steady-State Visual Evoked Potential (SSVEP) stimuli generated directly from the environment the robot is navigating. To these ends, we propose novel variable BCI stimuli that use the real-time video streamed from the on-board robot camera as the visual input for SSVEP: the CNN-detected natural scene objects are overlaid and flickered at differing frequencies (10Hz, 12Hz and 15Hz). These stimuli differ from traditional SSVEP stimuli in that both the dimensions of the flicker regions and their on-screen positions change with the objects detected in the scene. On-screen object selection via dry-EEG-enabled SSVEP in this way facilitates the online decoding of human cortical brain signals, via a secondary CNN, into robot teleoperation commands (approach object; move in a specific direction: right, left or back). This SSVEP decoding model is trained on a priori offline experimental data in which very similar visual input is presented to all subjects. The resulting classification demonstrates extremely high performance, with a mean accuracy of 96% for the real-time robot navigation experiment across multiple test subjects.
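The paper decodes which flickering on-screen object a subject is attending to by classifying the SSVEP response in the EEG with a CNN. As a simplified, hypothetical baseline for the same task, the dominant stimulus frequency (10, 12 or 15 Hz) can be estimated from the power spectrum of an occipital EEG epoch; the sampling rate and epoch length below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def classify_ssvep(eeg, fs=250.0, freqs=(10.0, 12.0, 15.0)):
    """Pick the candidate stimulus frequency with the most spectral power.

    eeg   : 1-D array of samples from a single (e.g. occipital) channel.
    fs    : sampling rate in Hz (assumed; the actual headset rate may differ).
    freqs : candidate SSVEP flicker frequencies from the paper.
    """
    n = len(eeg)
    # Windowed FFT power spectrum of the epoch.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    bins = np.fft.rfftfreq(n, d=1.0 / fs)
    # Sum power in a narrow +/- 0.5 Hz band around each candidate frequency.
    powers = [spectrum[(bins > f - 0.5) & (bins < f + 0.5)].sum() for f in freqs]
    return freqs[int(np.argmax(powers))]

# Synthetic check: a noisy 12 Hz oscillation should be labelled 12 Hz.
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.standard_normal(len(t))
print(classify_ssvep(epoch, fs))
```

This frequency-domain approach is a common SSVEP baseline; the authors instead train a CNN on the raw signals, which can additionally exploit harmonics and cross-channel structure that a single-bin power comparison ignores.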


