An 'Unreal' Framework for Creating and Controlling Audio-Visual Scenes for the rtSOFE
Abstract:
Highly realistic audio-visual environments find application in various areas, including architecture, gaming, art, perceptual research, and beyond. Reproducing complex scenes requires advanced tools, for instance game engines, but these often lack sophisticated acoustic simulation. The real-time Simulated Open Field Environment (rtSOFE) is an interactive acoustic simulator for rooms of arbitrary shape and has been designed to meet criteria of precision, interactivity, and real-time performance, which are important for virtual reality. However, the rtSOFE has not yet been fully integrated with other virtual reality tools. Here we present a new framework that integrates an established virtual reality tool (a popular game engine) with the rtSOFE to ease the production of complex environments combined with highly precise audio for hearing research. We created a game engine plugin to automate the management of the distributed rtSOFE system. The plugin exploits existing game engine features, such as support for visual CAVE systems and for high-level programming languages. The performance of the framework is demonstrated with a simulation of a highly realistic classroom in terms of its visual and acoustic components and its interaction with a movable sound source.