The Stiftung Planetarium Berlin organized this event, with Georgios Mavrikos handling production and curation. We share a common aesthetic vision and want to open up ways for artists to work in planetariums and with spatial media in general.
To implement the setup, Timo Bittner re-measured and equalized the existing speaker array and adapted it to our ambisonics-based approach using our own audio processor, a Q-Sys Core 510i provided by Ambion GmbH.
Using the IEM Plugin Suite from the Institut für Elektronische Musik und Akustik at the Kunstuniversität Graz, we used the AllRADecoder plugin to create the decoder configuration for the speaker array and the SimpleDecoder plugin to decode the 16-channel third-order ambisonic sphere to the array of 49 speakers inside the dome and 8 subwoofers around it. Rather than upmixing the artists' work, we chose a complete 3D ambisonics workflow. Decoding, mixing, and rendering of each artist's audio was done on an X2 machine by XI Machines. All audio signal distribution from the artists to the speakers, via the X2 and the Q-Sys Core, was handled over a Dante network.
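A third-order full-sphere ambisonic signal has (order + 1)² = 16 channels, and decoding then amounts to multiplying each B-format frame by the decoder matrix the AllRADecoder computes for the speaker layout. A minimal sketch of the arithmetic, with a random placeholder standing in for the real 49×16 decoder matrix:

```python
import numpy as np

def ambisonic_channels(order: int) -> int:
    """Number of channels in a full-sphere ambisonic signal of a given order."""
    return (order + 1) ** 2

# Third-order ambisonics -> 16 channels, as used for the dome mix.
assert ambisonic_channels(3) == 16

# Decoding is a matrix multiply: a (speakers x channels) decoder matrix D
# (random here, as a stand-in for the AllRADecoder output) maps one
# 16-channel B-format sample frame b to 49 speaker feeds.
rng = np.random.default_rng(0)
D = rng.standard_normal((49, ambisonic_channels(3)))  # placeholder decoder
b = rng.standard_normal(ambisonic_channels(3))        # one B-format frame
speaker_feeds = D @ b
print(speaker_feeds.shape)  # (49,)
```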
We worked with each artist individually, teaching them the process of spatializing audio in 3D and how to use the necessary tools. We chose Reaper as our digital audio workstation because it handles up to 7th-order ambisonics. For spatialization we used IEM's StereoEncoder and MultiEncoder plugins. Andrew Rahman handled the spatialization of James Ginzburg's music for the audiovisual piece Nimbes, as well as Jay Glass Dubs' original composition, and supported PYUR in spatializing her set.
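Conceptually, these encoders place a mono source at an azimuth and elevation by weighting it with spherical harmonic gains. As an illustration only (not IEM's implementation), a first-order encoder in the ambiX convention (ACN channel order, SN3D normalization) looks like this:

```python
import numpy as np

def encode_first_order(sample: float, azimuth: float, elevation: float) -> np.ndarray:
    """Encode a mono sample into first-order ambiX (ACN order W, Y, Z, X; SN3D).

    Angles are in radians; azimuth 0 points straight ahead.
    """
    w = sample                                          # omnidirectional
    y = sample * np.sin(azimuth) * np.cos(elevation)    # left/right
    z = sample * np.sin(elevation)                      # up/down
    x = sample * np.cos(azimuth) * np.cos(elevation)    # front/back
    return np.array([w, y, z, x])

# A source straight ahead lands only in the W and X channels.
print(encode_first_order(1.0, 0.0, 0.0))  # [1. 0. 0. 1.]
```

Moving a source is then just automating the azimuth and elevation parameters over time, which is exactly what the plugin automation in Reaper does.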
For live performance, the artists spatialized some of their tracks in real time using Lemur running on iPads, which sent OSC messages to the X2 machine. Alexander Phleps created custom Lemur templates for Peter Kirn, Jay Glass Dubs, PYUR, and Music For Your Plants.
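On the wire, an OSC message is just a null-padded address pattern, a type-tag string, and big-endian arguments, sent as a single UDP datagram. A stdlib-only sketch of building such a message; the address pattern /track/1/azimuth and the host/port are hypothetical examples, not our actual template layout:

```python
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Build a minimal OSC message with float32 arguments (no bundles)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    typetags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", a) for a in args)
    return pad(address.encode()) + pad(typetags.encode()) + payload

msg = osc_message("/track/1/azimuth", 45.0)  # hypothetical address pattern
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(msg, ("192.168.1.10", 9000))   # host and port depend on the setup
```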
Additionally, we provided the artists with thorough documentation of the audio workflow, gave them aesthetic direction for working in spatial audio, and walked them through the complications of working in a dome.
Working with the artists involved a diverse set of tasks, as each had very different technical requirements. To receive Peter Kirn's live microphone input, we connected the analog signal to a Yamaha QL1 mixer and routed it to the X2 via Dante, where it was spatialized in real time.
James Ginzburg's audio for Nimbes had to be synchronized at the frame level with the specially produced video. For this we fed LTC (linear timecode) from the planetarium's video playback system into Reaper to trigger the audio.
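Once the LTC is decoded, frame-level sync reduces to arithmetic: an HH:MM:SS:FF position maps to a sample offset given the frame rate and sample rate. A sketch assuming a 25 fps (EBU) non-drop-frame timecode and 48 kHz audio, which may differ from the rates actually used:

```python
def timecode_to_samples(h: int, m: int, s: int, f: int,
                        fps: int = 25, sample_rate: int = 48000) -> int:
    """Map an HH:MM:SS:FF timecode position to an audio sample offset.

    Non-drop-frame; 25 fps and 48 kHz are assumed here, not confirmed.
    """
    seconds = h * 3600 + m * 60 + s + f / fps
    return round(seconds * sample_rate)

# One second plus 12 frames at 25 fps and 48 kHz:
print(timecode_to_samples(0, 0, 1, 12))  # 71040
```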
Music For Your Plants performed both live spatialization and pre-rendered ambisonic audio from his own computer, which sent the 16-channel ambisonic sphere via Dante to the X2 machine for decoding to the speaker array.
Although we had only two weeks to work in the planetarium, at night after it had closed, the show sold out and we received outstanding feedback on all levels. This is the first project in our collaboration with the planetarium, and workshops and a hackathon will follow in the near future. At present all videos must be pre-rendered, but we hope to open up access to live visuals on the dome, and from late February to March we plan to use the hackathon to make that possible.
We want to thank Tim Florian Horn, director of the Stiftung Planetarium Berlin, for allowing and supporting our system approach. Thanks also to XI-Machines and Ambion GmbH for their support: XI-Machines provided us with an X2 machine for realtime audio rendering, and Ambion GmbH provided a Q-Sys Core 510i and other infrastructure. Without their support this event would not have been possible at this level of quality.
The Spatial Media Lab's vision is to create easier access to, and better experiences with, spatial media at live events and at home. For us it is not just about creating 3D media for its own sake, but about sharing thought-provoking and interesting works of art that push the limits of technology.