Emergence of synchrony in an Adaptive Interaction Model

06/18/2015
by Kevin Sanlaville, et al.

In a Human-Computer Interaction context, we aim to develop an adaptive and generic interaction model for two different use cases: Embodied Conversational Agents and Creative Musical Agents for musical improvisation. To reach this goal, we use the concepts of adaptation and synchronization to enhance the interactive abilities of our agents and to guide the development of our interaction model, with the aim of making synchrony emerge from the non-verbal dimensions of interaction.
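
The abstract does not describe the mechanism behind this emergence. As a purely illustrative sketch, and not the authors' model, the Kuramoto-style pair of coupled oscillators below shows how two agents that continuously adapt their phase toward each other can end up synchronized once the coupling is strong enough. The simulate function, the coupling values, and the natural frequencies are all hypothetical choices made for the example.

import math

def simulate(coupling, dt=0.01, steps=20000):
    # Two agents with different natural tempos (rad/s); values are hypothetical.
    omega = (2.0, 2.6)
    # Phases start out of step with each other.
    phase = [0.0, math.pi / 2]
    locking = 0.0
    half = steps // 2
    for step in range(steps):
        # Each agent nudges its own pace toward the other's phase (mutual adaptation).
        d0 = omega[0] + coupling * math.sin(phase[1] - phase[0])
        d1 = omega[1] + coupling * math.sin(phase[0] - phase[1])
        phase[0] += d0 * dt
        phase[1] += d1 * dt
        if step >= half:
            # Average phase alignment over the second half of the run:
            # near 1 means the agents have synchronized, near 0 means they keep drifting.
            locking += math.cos(phase[0] - phase[1])
    return locking / (steps - half)

if __name__ == "__main__":
    print("weak coupling  :", round(simulate(0.1), 3))
    print("strong coupling:", round(simulate(1.0), 3))

With these assumed values, the weak-coupling run stays unsynchronized (alignment near 0), while the strong-coupling run phase-locks (alignment near 0.95); this is the kind of emergent synchrony the abstract alludes to, not a reproduction of the paper's method.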
