A modular architecture for creating multimodal agents

06/01/2022
by Thomas Baier, et al.

This paper describes a flexible and modular platform for creating multimodal interactive agents. The platform operates through an event-bus on which signals and interpretations are posted as a sequence in time. Different sensors and interpretation components can be integrated by defining their input and output as topics, which results in a logical workflow for further interpretations. We describe the broad range of components that have been developed so far and integrated into various interactive agents. We also explain how the actual interaction is recorded both as multimodal data and in a so-called episodic Knowledge Graph. By analysing the recorded interactions, we can compare different agents and agent components.
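As a rough illustration of the topic-based event-bus idea described in the abstract, the sketch below shows how components might be wired together by declaring input and output topics, so that one component's interpretations become the next component's input. All names and interfaces here are hypothetical and are not the platform's actual API.

```python
# Minimal sketch of a topic-based event bus (hypothetical names, not the
# platform's actual interface). Components subscribe to an input topic and
# post their interpretations, timestamped, on an output topic.
import time
from collections import defaultdict
from typing import Any, Callable, Dict, List


class EventBus:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        # Events carry a timestamp so the interaction can be replayed in order.
        event = {"topic": topic, "timestamp": time.time(), "payload": payload}
        for handler in self._subscribers[topic]:
            handler(event)


class Component:
    """A sensor or interpretation component wired to the bus via topics."""

    def __init__(self, bus: EventBus, input_topic: str, output_topic: str,
                 process: Callable[[Any], Any]) -> None:
        self.bus = bus
        self.output_topic = output_topic
        self.process = process
        bus.subscribe(input_topic, self._on_event)

    def _on_event(self, event: dict) -> None:
        # Interpret the incoming signal and post the result as a new event.
        self.bus.publish(self.output_topic, self.process(event["payload"]))


# Example wiring: a speech recognizer feeds an intent interpreter.
bus = EventBus()
Component(bus, "audio.signal", "text.transcript", lambda audio: f"transcript({audio})")
Component(bus, "text.transcript", "text.intent", lambda text: f"intent({text})")
bus.subscribe("text.intent", lambda e: print(e["payload"]))
bus.publish("audio.signal", "raw-audio-chunk")
```

Because each component only names its input and output topics, new sensors or interpreters can be added without modifying existing ones, and the stream of timestamped events can be logged as the record of the interaction.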

