Rendering Spatial Sound for Interoperable Experiences in the Audio Metaverse

09/26/2021
by Jean-Marc Jot et al.

Interactive audio spatialization technology previously developed for video game authoring and rendering has evolved into an essential component of platforms enabling shared immersive virtual experiences for future co-presence, remote collaboration and entertainment applications. New wearable virtual and augmented reality displays employ real-time binaural audio computing engines rendering multiple digital objects and supporting the free navigation of networked participants or their avatars through a juxtaposition of environments, real and virtual, often referred to as the Metaverse. These applications require a parametric audio scene programming interface to facilitate the creation and deployment of shared, dynamic and realistic virtual 3D worlds on mobile computing platforms and remote servers. We propose a practical approach for designing parametric 6-degree-of-freedom object-based interactive audio engines to deliver the perceptually relevant binaural cues necessary for audio/visual and virtual/real congruence in Metaverse experiences. We address the effects of room reverberation, acoustic reflectors, and obstacles in both the virtual and real environments, and discuss how such effects may be driven by combinations of pre-computed and real-time acoustic propagation solvers. We envision an open scene description model distilled to facilitate the development of interoperable applications distributed across multiple platforms, where each audio object represents, to the user, a natural sound source having controllable distance, size, orientation, and acoustic radiation properties.
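The abstract describes an object-based scene description in which each audio object carries controllable position, orientation, size, and acoustic radiation properties, alongside reverberation parameters for the virtual or real environment. As a rough illustration only, the Python sketch below shows one possible parameterization of such a scene; the class names, fields, and defaults are assumptions for this sketch, not the interface proposed in the paper.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class RadiationPattern:
    """Simplified source directivity (hypothetical parameterization)."""
    front_back_ratio_db: float = 0.0   # broadband front/back level difference
    directivity_order: float = 0.0     # 0 = omnidirectional, higher = more directive

@dataclass
class AudioObject:
    """One sound source in the shared scene, with 6-DoF pose and extent."""
    source_id: str
    position: Vec3                          # metres, world coordinates
    orientation: Vec3 = (0.0, 0.0, 0.0)     # yaw, pitch, roll in degrees
    size: float = 0.0                       # apparent source extent in metres
    radiation: RadiationPattern = field(default_factory=RadiationPattern)

@dataclass
class ReverbEnvironment:
    """Per-environment reverberation parameters (virtual or measured real room)."""
    reverb_time_s: float        # e.g. broadband RT60
    reverb_level_db: float      # diffuse reverb level relative to the direct sound

@dataclass
class AudioScene:
    """Scene description exchanged between authoring tools and renderers."""
    objects: List[AudioObject] = field(default_factory=list)
    environments: List[ReverbEnvironment] = field(default_factory=list)

# Example: a single talker placed 2 m in front of the listener,
# inside a moderately reverberant room.
scene = AudioScene(
    objects=[AudioObject("talker_1", position=(0.0, 0.0, -2.0), size=0.2)],
    environments=[ReverbEnvironment(reverb_time_s=0.6, reverb_level_db=-12.0)],
)
print(scene)
```

A renderer consuming such a description would update object poses each frame and map the reverberation parameters onto its own binaural reverberation engine; how that mapping is performed, and how pre-computed and real-time acoustic propagation solvers contribute to it, is left to the individual platform.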


