Odio is a 3D-audio app by Volst that uses head-tracking and binaural decoding.

 

 

The students were introduced to the concept of the soundscape following R. Murray Schafer and Barry Truax, with references to the World Soundscape Project from the late 1960s.

This concept was adapted and interpreted by the Odio app in 2021.


The app, developed by a small start-up of programmers and musicians, received an Apple Design Award in 2022.

The app utilises the head-tracking built into Apple AirPods, AirPods Max and Beats Studio Pro, combining it with interactive virtual sound sources and a visual interface for mobile phones and tablets.

 

With the advent of on-board head-tracking in consumer headphones since 2021, it is now possible to adjust the positions of virtual sound sources to the rotation of the head. As a result, we can compose and produce even more realistic or plausible soundscapes and composed scenes: the water stream, the reading voice, the fireplace, the coffee machine, or the piano stays in its position even as the head turns.

While traditional stereo playback causes sounds to rotate along with head movements, a newer category of audio experiences, binaural head-tracked 3D-audio scenes, maintains the position of sound sources in virtual coordinates, creating a vivid sense of being in a lived space.
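The underlying principle can be sketched in a few lines. The following Swift snippet is only an illustrative sketch (not Odio's or Apple's actual implementation), assuming the headphones report the head orientation as a quaternion: the renderer counter-rotates each source's world position by the head orientation, so the source stays put while the head turns.

```swift
import simd

/// Hypothetical illustration: a virtual source fixed in the room,
/// expressed in head-relative coordinates for each rendered frame.
struct VirtualSource {
    var worldPosition: SIMD3<Float>   // position in room coordinates (metres)
}

/// `headOrientation` is the rotation of the head relative to the room,
/// e.g. derived from the headphones' motion sensors.
func headRelativePosition(of source: VirtualSource,
                          headOrientation: simd_quatf) -> SIMD3<Float> {
    // Counter-rotate the world position by the head orientation so the
    // binaural renderer hears the source staying put while the head turns.
    return headOrientation.inverse.act(source.worldPosition)
}

// Example: a fireplace 2 m to the listener's left stays in place
// even when the head yaws 90 degrees.
let fireplace = VirtualSource(worldPosition: SIMD3<Float>(-2, 0, 0))
let headTurn = simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 1, 0))
let renderedPosition = headRelativePosition(of: fireplace, headOrientation: headTurn)
```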

With the Odio app, binaural soundscapes are treated and distributed much like traditional tracks or songs, offering individually adjustable sound sources in a headphone-centric world. Through a simple touch and a smooth finger-swipe on the circular representation of each sound source on the screen, listeners can seamlessly remix the aural architecture[1]. This interactive process engages their auditory perception, granting them the power to reshape the soundscape according to their preferences. The technology is thus treated as an artistic challenge and an instrument for world-building.



[1] For the concept of aural architecture I refer to Blesser, B. and Salter, L.-R., Spaces Speak, Are You Listening? Experiencing Aural Architecture. Cambridge, Massachusetts: MIT Press, 2006.

 


Students from HfG received exclusive access to the "artist's backend" to explore, research and try out the spatial concepts and constellations they had learned so far using their own sounds, and to engage critically with what appears to be state-of-the-art consumer 3D-audio technology. The app was installed on iPhones and iPads, employing commercial head-tracking headphones that are increasingly used in everyday life.

 

How can sonic arts, composition, and design contribute differently to what is currently promoted by major tech companies, and in situ?

 

Students had the opportunity to publish their soundscapes via the Odio app and thus to consider and test alternative modes of disseminating their work, beyond Spotify or established brands and labels.

 

Discussion topics also included the risks of mass-product design and the effects of everyday world-making.

Critical listening and experimental sound composition are key aspects of artistic exploration in these environments and can serve as alternative productive strategies.

 

 

 

Soundscape

“is the auditory equivalent of landscape, a sound or combination of sounds in an environment."


"It is a voice of a society and an environment. Our everyday activities animate the soundscape, but how and what we build is what amplifies or controls the sound."

"Sound can help add to an understanding of a place, which may not otherwise be visually identifiable.”

(Truax....)




 

“The distinguishable features of a soundscape are keynotes, sound signals and sound marks.


Soundmarks are those sounds considered culturally significant or deemed by an acoustic community to warrant preservation (such as church or temple bells, town square clocks, and foghorns)

while keynote sounds are those which are continuously operable within a site and form a background (traffic, for example, or air conditioner sounds or muzak).

Sound signals represent foreground sounds within a soundscape and thus may dynamically change and include local soundmarks.”

The positions of the virtual sound sources are visually represented on the screen by objects, each depicted as a circle bearing an abbreviation of its respective indication. This visual mapping provides an intuitive means for the listener to engage with and navigate the multi-layered auditory environments.

When a listener opens a soundscape, the arrangement of these objects appears as a pre-composed structure, forming an aural architecture that radiates concentrically from the idealised head position, represented at the centre of the screen. These objects are positioned at distinct directions and distances from this central point, creating a unique spatial texture.

Despite this initial arrangement, the listener enjoys complete freedom to interact with the soundscape. They can swipe and reposition the sound sources, multiplying or erasing them based on their personal situations, preferences, and needs. The volume of each object diminishes as it is placed farther away from the head, contributing to a sense of spatial depth and distance. As objects overlap, they generate new and intricate spatial textures, blending together the original sources of the respective soundscape.

This dynamic interaction allows the listener to actively shape and mould the auditory environment, creating a highly personalised and enveloping experience, tailored precisely to their desires and sensibilities.
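The distance behaviour described above can be illustrated with a simple gain law. The following is only a hedged sketch, assuming an inverse-distance attenuation clamped at unity gain; the app's actual attenuation curve is not documented here.

```swift
import simd

/// Illustrative sketch (assumed, not Odio's documented behaviour):
/// the gain of a source falls off with its distance from the listening
/// position at the centre of the screen.
/// `referenceDistance` is the distance at which a source plays at full gain.
func gain(for sourcePosition: SIMD3<Float>,
          listenerPosition: SIMD3<Float> = .zero,
          referenceDistance: Float = 1.0) -> Float {
    let distance = simd_length(sourcePosition - listenerPosition)
    // Inverse-distance law, clamped so sources inside the reference
    // radius are not boosted above unity gain.
    return min(1.0, referenceDistance / max(distance, .ulpOfOne))
}

// Example: a source placed 3 m from the head plays at roughly one third
// of full gain, contributing to the sense of spatial depth and distance.
let farGain = gain(for: SIMD3<Float>(0, 0, -3))   // ≈ 0.33
```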