Deep Neural Object Analysis by Interactive Auditory Exploration with a Humanoid Robot

07/03/2018
by Manfred Eppe, et al.

We present a novel approach for interactive auditory object analysis with a humanoid robot. The robot elicits sensory information by physically shaking visually indistinguishable plastic capsules, and it records the resulting audio signals with microphones embedded in its ears. A neural network architecture learns from these signals to infer properties of the capsules' contents. Specifically, we evaluate material classification and weight prediction accuracy, and we demonstrate that the framework is fairly robust to real-world acoustic noise.
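
No code accompanies this abstract, so the following is only a minimal sketch, in PyTorch, of the kind of two-headed audio network the abstract describes: a shared encoder over a spectrogram of the shaking sound, with one head for material classification and one for weight regression. All layer sizes, the spectrogram input shape, and the number of material classes are illustrative assumptions, not the authors' actual architecture.

    import torch
    import torch.nn as nn

    class AuditoryObjectNet(nn.Module):
        """Shared audio encoder with material-classification and
        weight-regression heads (illustrative sketch, not the paper's model)."""

        def __init__(self, n_materials: int = 4):  # class count is an assumption
            super().__init__()
            # Encoder over a 1-channel log-mel spectrogram of the shaking sound.
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size feature
                nn.Flatten(),
            )
            self.material_head = nn.Linear(32, n_materials)  # class logits
            self.weight_head = nn.Linear(32, 1)              # scalar weight estimate

        def forward(self, spec: torch.Tensor):
            feat = self.encoder(spec)
            return self.material_head(feat), self.weight_head(feat)

    # Example: a batch of 8 spectrograms, 64 mel bins x 128 time frames.
    model = AuditoryObjectNet()
    logits, weight = model(torch.randn(8, 1, 64, 128))
    print(logits.shape, weight.shape)  # torch.Size([8, 4]) torch.Size([8, 1])

Sharing one encoder between the two heads reflects the intuition that the same acoustic cues from shaking (rattling frequency, resonance, impact timing) are informative for both the material and the weight of the contents.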

Related research:

11/14/2019 · Scene-Aware Audio Rendering via Deep Acoustic Analysis
We present a new method to capture the acoustic characteristics of real-...

11/13/2020 · Enabling the Sense of Self in a Dual-Arm Robot
While humans are aware of their body and capabilities, robots are not. T...

12/22/2017 · An Incremental Self-Organizing Architecture for Sensorimotor Learning and Prediction
During visuomotor tasks, robots have to compensate for the temporal dela...

05/24/2023 · Interactive Neural Resonators
In this work, we propose a method for the controllable synthesis of real...

07/17/2020 · Information Requirements of Collision-Based Micromanipulation
We present a task-centered formal analysis of the relative power of seve...

03/09/2016 · Robot Dream
In this position paper we present a novel approach to neurobiologically ...