Towards naturalistic human neuroscience and neuroengineering: behavior mining in long-term video and neural recordings
Recent advances in brain recording technology and artificial intelligence are propelling a new paradigm in neuroscience beyond the traditional controlled experiment. Naturalistic neuroscience studies neural computations associated with spontaneous behaviors performed in unconstrained settings. Analyzing such unstructured data lacking a priori experimental design remains a significant challenge, especially when the data is multi-modal and long-term. Here we describe an automated approach for analyzing large (≈250 GB/subject) datasets of simultaneously recorded human electrocorticography (ECoG) and naturalistic behavior video data for 12 subjects. Our pipeline discovers and annotates thousands of instances of human upper-limb movement events in long-term (7–9 day) naturalistic behavior data using a combination of computer vision, discrete latent-variable modeling, and string pattern-matching. Analysis of the simultaneously recorded brain data uncovers neural signatures of movement that corroborate prior findings from traditional controlled experiments. We also prototype a decoder for a movement initiation detection task to demonstrate the efficacy of our pipeline as a source of training data for brain-computer interfacing applications. We plan to publish our curated dataset, which captures naturalistic neural and behavioral variability at a scale not previously available. We believe this data will enable further research on models of neural function and decoding that incorporate such naturalistic variability and perform more robustly in real-world settings.
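Below is a minimal sketch of how one step of such a pipeline, discovering movement-initiation events from video-derived wrist trajectories, could look. It assumes wrist positions have already been extracted by a pose estimator; frame-wise wrist speed is discretized into rest/movement symbols and candidate onsets are found by string pattern-matching on the symbol sequence. The function name, speed threshold, window lengths, and regex are illustrative assumptions, and a simple speed threshold stands in for the paper's discrete latent-variable model; this is not the authors' implementation.

```python
# Sketch of a movement-onset discovery step, under the assumptions stated above.
import re
import numpy as np


def discover_movement_onsets(wrist_xy, fps=30.0, speed_thresh=60.0,
                             min_rest_frames=15, min_move_frames=10):
    """Return frame indices where sustained rest transitions into sustained
    movement, found by regex matching on a discretized state string."""
    # Frame-to-frame wrist speed (pixels per second).
    speed = np.linalg.norm(np.diff(wrist_xy, axis=0), axis=1) * fps

    # Discretize each frame into a symbol: 'r' (rest) or 'm' (moving).
    # A fixed threshold is an illustrative stand-in for a latent-state model.
    state_str = "".join("m" if s > speed_thresh else "r" for s in speed)

    # A movement onset is a long run of rest followed by a long run of movement;
    # the lookahead keeps the match end at the first moving frame.
    pattern = re.compile(rf"r{{{min_rest_frames},}}(?=m{{{min_move_frames},}})")
    return [m.end() for m in pattern.finditer(state_str)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic wrist trajectory: 60 frames of rest, a 30-frame reach, then rest.
    rest1 = rng.normal(0.0, 0.5, (60, 2))
    reach = rest1[-1] + np.cumsum(rng.normal(3.0, 1.0, (30, 2)), axis=0)
    rest2 = reach[-1] + rng.normal(0.0, 0.5, (60, 2))
    traj = np.vstack([rest1, reach, rest2])
    print(discover_movement_onsets(traj))  # expect an onset near frame 60
```

In a decoding experiment of the kind described above, the returned onset frames could then be aligned with the ECoG time base to cut peri-movement neural windows for training a movement-initiation detector.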