User-Defined Mappings for Spatial Sound Synthesis

NIME 2020 Online Edition

This site is a contribution to the NIME 2020 conference, which is held as an online event due to the COVID-19 pandemic. Contents will be activated at the beginning of the conference.


Abstract

The presented sound synthesis system allows individual spectral components to be spatialized in real time, using a sinusoidal modeling approach within a 3-dimensional sound reproduction system. A co-developed, dedicated haptic interface is used to jointly control spectral and spatial attributes of the sound. In a user study, participants were asked to create an individual mapping between control parameters of the interface and rendering parameters of sound synthesis and spatialization, using a visual programming environment. The resulting mappings of all participants are evaluated, indicating which individual control parameters are preferred for specific tasks. Compared with the mappings intended by the development team, the results validate certain design decisions and indicate where changes are needed.
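To illustrate the core idea of spatializing individual spectral components, the following is a minimal sketch, not the system described above: it synthesizes a few partials of a sinusoidal model and gives each one its own spatial parameter. The partial values, the azimuth range, and the reduction of the 3-dimensional reproduction system to simple equal-power stereo panning are all assumptions made for the example.

```python
import numpy as np

SR = 48000                       # sample rate in Hz (assumed)
DUR = 1.0                        # duration in seconds
t = np.arange(int(SR * DUR)) / SR

# Hypothetical partial set: (frequency in Hz, amplitude, azimuth in radians)
partials = [
    (220.0, 0.5, -0.6),
    (440.0, 0.3,  0.0),
    (660.0, 0.2,  0.6),
]

out = np.zeros((len(t), 2))      # stereo output buffer

for freq, amp, azimuth in partials:
    sine = amp * np.sin(2 * np.pi * freq * t)
    # Equal-power panning as a stand-in for the 3-D spatialization stage:
    # map azimuth from [-pi/2, pi/2] to a pan position in [0, 1].
    pan = (azimuth + np.pi / 2) / np.pi
    out[:, 0] += np.cos(pan * np.pi / 2) * sine   # left channel
    out[:, 1] += np.sin(pan * np.pi / 2) * sine   # right channel
```

In the actual system, the per-partial spatial parameters would instead be driven in real time by the haptic interface through the user-defined mapping, and rendered by the 3-dimensional reproduction setup rather than a stereo panner.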

Sound Synthesis System

Interface

Setup

User Study


