NIME 2020 Online Edition
This site is a contribution to the NIME 2020 conference, which is held as an online event due to the Covid-19 situation. The relevant session on 23 July, 11:30-12:30 (UTC+1) can be reached following this link: https://nime2020.bcu.ac.uk/posters-and-demos-23rd/
The related NIME paper:
von Coler, Henrik, Steffen Lepa, and Stefan Weinzierl (2020). User-Defined Mappings for Spatial Sound Synthesis. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME). Birmingham, UK (Online).
Controlling Spatial Sound Synthesis
The system presented here allows the individual spatialization of spectral components in real time, using a sinusoidal modeling approach within a 3-dimensional sound reproduction system. It comes with a co-developed, dedicated haptic interface for jointly controlling spectral and spatial attributes of the sound.
With this system at hand, a key question in the design of digital musical instruments arises:
How do we connect the interface to the sound production in an enjoyable, expressive way?
Sound Synthesis System
Spectro-Spatial Sound Synthesis allows the dynamic spatial distribution of spectral components in a real-time synthesis system. The GLOOO synthesizer, based on Statistical Spectral Modeling, is used for multichannel spectral synthesis. For this context it is sufficient to say that it performs additive synthesis of violin sounds, based on prior analyses, and that it can send each harmonic to an individual input of a spatial rendering system.
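The routing idea can be illustrated with a minimal additive-synthesis sketch. This is not the GLOOO implementation (which derives time-varying partial trajectories from statistical models of violin recordings); it is a deterministic toy with hypothetical function names, showing how each harmonic ends up on its own output channel for the spatial renderer.

```python
import numpy as np

def synthesize_partials(f0, amplitudes, sr=48000, duration=1.0):
    """Additive synthesis where each harmonic is rendered to its own
    channel, so a spatial renderer can position it individually.
    amplitudes[k] is the (static) amplitude of harmonic k + 1."""
    t = np.arange(int(sr * duration)) / sr
    # One row per partial: the multichannel output feeding the renderer.
    return np.stack([a * np.sin(2 * np.pi * (k + 1) * f0 * t)
                     for k, a in enumerate(amplitudes)])

# Three harmonics of a 220 Hz fundamental, each separately routable.
partials = synthesize_partials(220.0, [1.0, 0.5, 0.25])
```

In the actual system, each of these channels would be patched to one input of the spatial renderer rather than mixed down to a single signal.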
The software is in detail explained in related publications and the source code documentation: Details on the GLOOO synthesis software
For spatial rendering the system relies on interchangeable third-party solutions, such as IRCAM's Panoramix. In a point-source-based approach, the spectral components can be placed in the virtual acoustic space, allowing the generation of a sound with spatial extent: Details on the spatialization.
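To make the point-source idea concrete, here is a very rough distance-based gain sketch: each virtual source receives one gain per loudspeaker that falls off with distance. Real renderers such as Panoramix use proper panning laws (e.g. VBAP or Higher-Order Ambisonics); this only illustrates how a single partial, treated as a point source, maps onto loudspeaker gains. All names here are illustrative assumptions.

```python
import numpy as np

def point_source_gains(source_pos, speaker_positions, rolloff=1.0):
    """Assign a per-loudspeaker gain to one virtual point source,
    falling off with distance; the gain vector is energy-normalized."""
    d = np.linalg.norm(speaker_positions - source_pos, axis=1)
    g = 1.0 / (d + 1e-6) ** rolloff
    return g / np.linalg.norm(g)

# Three loudspeakers on a unit circle (toy layout, not the study setup).
speakers = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [-1.0, 0.0, 0.0]])
# A source near the first speaker gets the largest gain there.
gains = point_source_gains(np.array([0.9, 0.1, 0.0]), speakers)
```

Spreading the individual partials of one note across slightly different positions is what gives the synthesized sound its spatial extent.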
The hand-held haptic interface has been developed for use with the spatial sound synthesis. It is based on multiple force-sensing resistors and an IMU for capturing its orientation in space: Details on the interface.
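A control snapshot from such an interface might look like the following sketch. The field names and ADC range are assumptions for illustration, not the actual firmware protocol: normalized force-sensing-resistor pressures plus an IMU orientation.

```python
from dataclasses import dataclass

@dataclass
class ControlFrame:
    """One snapshot of the interface state (illustrative layout):
    normalized FSR pressures and IMU orientation in radians."""
    fsr: tuple    # e.g. one value per finger, each in 0..1
    yaw: float
    pitch: float
    roll: float

def normalize_fsr(raw, adc_max=1023):
    """Map raw ADC readings (assumed 10-bit here) to 0..1 control values."""
    return tuple(min(max(v, 0), adc_max) / adc_max for v in raw)

frame = ControlFrame(fsr=normalize_fsr((512, 1023, 0, 256, 768)),
                     yaw=0.1, pitch=-0.2, roll=0.0)
```

Such frames would then be streamed to the mapping stage at the sensor rate.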
Many interesting and unique methods for sound synthesis exist, as do interfaces and input devices for expressive control. However, their interconnection remains an active field of study. Mapping is thus a crucial part of the design of digital musical instruments. Especially in the case of spectro-spatial sound synthesis, a multitude of parameters needs to be controlled, demanding a thorough investigation. The mapping process is therefore part of the development and evaluation of this project, explained here in detail: Details on the mapping stage.
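One common way to express such a many-to-many mapping is a linear mapping layer: each synthesis or spatial parameter is a weighted sum of control parameters, and a user-defined mapping amounts to editing the weight matrix. This is a generic sketch of that technique, not the project's actual mapping implementation; the parameter names are hypothetical.

```python
import numpy as np

def apply_mapping(controls, mapping_matrix, offsets):
    """Linear many-to-many mapping: each output (synthesis/spatial)
    parameter is a weighted sum of input (control) parameters."""
    return mapping_matrix @ controls + offsets

controls = np.array([0.5, 0.2, 0.8])   # e.g. pressure, tilt, rotation
M = np.array([[1.0, 0.0, 0.0],         # pressure -> intensity
              [0.0, 0.5, 0.5],         # tilt + rotation -> spatial spread
              [0.0, 0.0, 1.0]])        # rotation -> azimuth
params = apply_mapping(controls, M, np.zeros(3))
```

A one-to-one mapping is the special case of a permutation matrix; letting users edit the matrix entries is one simple way to realize user-defined mappings.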
The setup used for this instrument and the related user study includes the synthesis engine, the spatial renderer, the interface with the mapping stage, and the sound reproduction system, a 21-channel hemispherical Ambisonics setup: Details on the setup.
With the setup presented above, we tried to find at least partial answers to the initial question. Therefore we conducted a user study, allowing participants to create an individual mapping between the control parameters of the interface and the rendering parameters of sound synthesis and spatialization: Details on the user study.