“Sensitive Space” is a prototype for a space that goes beyond the imagination, built entirely out of information: light, visual imagery and audio. Nothing haptic. A meta-machine tracks the positions of visitors within the space and triggers composition fragments accordingly.

“Sensitive Space” questions the performative space and the space for performance, dismantling the traditionally rather static relationship between performer and audience. It describes a new way in which collective perception can influence the performance itself and introduces the paradigm “form follows feedback”. To process information, the building block of “Sensitive Space”, and to optimize the performance of “on-demand” spaces, an advanced immersive interactive environment is needed.

“Sensitive Space” is an audio-visual environment built upon the interaction among its users. It must be emphasized that what matters here is human-human interaction, not, as we are usually accustomed to in technological environments, human-computer interaction. The space is sensitive to the behavior of the users (agents), and events are triggered accordingly.
An infrared camera tracks the users’ movement and position in space. This data is then sent to a Voronoi-based algorithm, which calculates the distances between agents, spatializes the ambisonic sound and sends OSC data to the Max/MSP patch driving the video projections on the walls and the floor. A “meta-machine” triggers fragments of the generative composition whenever agents come sufficiently close to each other; otherwise, only a ground noise (background sound) can be heard. The current sound is FFT-analyzed and visualized in combination with computer animation. Through their position in space and their “nearness”, the agents create the performative space and the space for performance at the same time. The invisible technology in the background provides the infrastructure for the immersive experience of the composition.
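The proximity-triggering logic can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical reconstruction, not the installation’s actual code: the OSC addresses, the port of the Max/MSP patch and the distance threshold are all assumptions introduced for illustration.

```python
# Minimal sketch of the proximity-based triggering described above.
# Assumptions: a Max/MSP patch listens for OSC on localhost:7400, and the
# addresses /sensitive/trigger and /sensitive/ground are hypothetical.
import numpy as np
from scipy.spatial.distance import pdist
from pythonosc.udp_client import SimpleUDPClient

PROXIMITY_THRESHOLD = 1.5  # metres; illustrative value, not from the source
client = SimpleUDPClient("127.0.0.1", 7400)  # assumed Max/MSP OSC port


def process_frame(agent_positions):
    """Trigger a composition fragment when any two agents come close;
    otherwise fall back to the ground noise (background sound)."""
    positions = np.asarray(agent_positions, dtype=float)
    if len(positions) >= 2:
        # Pairwise Euclidean distances between all tracked agents
        distances = pdist(positions)
        if distances.min() < PROXIMITY_THRESHOLD:
            client.send_message("/sensitive/trigger", float(distances.min()))
            return
    # No pair is close enough: only the background sound is heard
    client.send_message("/sensitive/ground", 1)


# Example frame: three agents tracked on the floor plane (x, y in metres)
process_frame([(0.2, 1.0), (1.1, 1.4), (4.0, 3.5)])
```

In the installation itself this decision would run continuously on each tracking frame, with the actual sound spatialization and FFT visualization handled downstream in the ambisonics system and the Max/MSP patch.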


Sensitive Space Credits:

Concept & Space Design: ORTLOS – Ivan Redi, Andrea Redi, Gudrun Jöller, Marco Russo, Dragan Danicic

Curator: Charlotte Pöchhacker

Composition Sketch: Hubert Machnik

Technical Realisation: KMKG Studio

Motion Tracking: CVL TU Wien & COGVIS

Ambisonics: Winfried Ritsch with IEM KUGraz students

Production Partner: Kunsthaus Mürz

Financial Support: bm:ukk & Universalmuseum Joanneum

Computer Support: PC Planet