Shimon the Rapper: A Real-Time System for Human-Robot Interactive Rap Battles

09/19/2020
by Richard Savery, et al.

We present a system for real-time lyrical improvisation between a human and a robot in the style of hip hop. Our system takes vocal input from a human rapper, analyzes its semantic meaning, and generates a response that is rapped back by a robot over a musical groove. Previous work on real-time interactive music systems has largely focused on instrumental output, and while vocal interaction with robots has been explored, it has not been studied in a musical context. Our generative system includes custom methods for censorship, voice, rhythm, and rhyming, as well as a novel deep learning pipeline based on phoneme embeddings. The rap performances are accompanied by synchronized robotic gestures and mouth movements. Key technical challenges overcome in the system include rhyme generation, low-latency performance, and dataset censorship. We evaluated several aspects of the system through a survey of videos and sample text output. Analysis of comments showed that the overall perception of the system was positive. The model trained on our hip hop dataset was rated significantly higher than the one trained on our metal dataset for coherence, rhyme quality, and enjoyment. Participants preferred outputs generated from a given input phrase over outputs generated from unknown keywords, indicating that the system successfully relates its output to its input.
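The abstract mentions a deep learning pipeline built on phoneme embeddings for rhyming. The paper's trained model and data are not shown here, so the snippet below is only a minimal sketch of how phoneme-embedding-based rhyme scoring could work: the toy phoneme inventory, the hand-picked 2-D embedding vectors, and the cosine-similarity scoring rule are all illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: scoring rhyme candidates with phoneme embeddings.
# Phoneme inventory, embedding values, and the scoring rule are assumptions
# made for illustration; the paper's actual model is not reproduced here.
import math

# Simplified ARPAbet-style phoneme sequences for a few example words.
PHONEMES = {
    "flow":   ["F", "L", "OW"],
    "go":     ["G", "OW"],
    "glow":   ["G", "L", "OW"],
    "beat":   ["B", "IY", "T"],
    "street": ["S", "T", "R", "IY", "T"],
}

# Stand-in phoneme embeddings (2-D for readability); a trained model
# would learn higher-dimensional vectors instead.
EMBED = {
    "F": (0.9, 0.1), "L": (0.2, 0.8), "OW": (0.7, 0.7),
    "G": (0.8, 0.2), "B": (0.85, 0.15), "IY": (0.1, 0.9),
    "T": (0.3, 0.3), "S": (0.4, 0.2), "R": (0.25, 0.75),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def rhyme_score(word_a, word_b, tail=2):
    """Average similarity of the last `tail` phoneme embeddings, aligned from the end."""
    pa, pb = PHONEMES[word_a][-tail:], PHONEMES[word_b][-tail:]
    pairs = list(zip(reversed(pa), reversed(pb)))
    return sum(cosine(EMBED[x], EMBED[y]) for x, y in pairs) / len(pairs)

if __name__ == "__main__":
    # Rank candidate line endings against an input word by rhyme score.
    for cand in ("go", "glow", "beat", "street"):
        print(f"flow vs {cand}: {rhyme_score('flow', cand):.3f}")
```

In this toy setup, candidates whose final phonemes embed close to those of the input word ("glow", "go") score higher than unrelated endings ("beat", "street"), which is the general idea behind using learned phoneme embeddings to pick rhyming responses.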

