Visitors entering the space are tracked by infrared depth cameras. Their bodies are transformed into contours and projected as white lines alongside the architecture.
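A minimal sketch of how a silhouette might be extracted from a depth frame, assuming the frame arrives as a 2D grid of millimetre readings (a real setup would pull frames from the Kinect driver and run a proper contour tracer on the mask):

```python
def silhouette_mask(depth_frame, near_mm=500, far_mm=3000):
    """Return a binary mask marking pixels whose depth falls inside
    the tracking volume: 1 = part of a body, 0 = background.

    depth_frame: 2D list of depth readings in millimetres.
    near_mm / far_mm: assumed bounds of the interaction zone.
    """
    return [
        [1 if near_mm <= d <= far_mm else 0 for d in row]
        for row in depth_frame
    ]

# Tiny 3x4 frame: one person standing about 1.5 m from the camera.
frame = [
    [4000, 4000, 4000, 4000],
    [4000, 1500, 1520, 4000],
    [4000, 1490, 1510, 4000],
]
print(silhouette_mask(frame))
# → [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0]]
```

The mask's edge pixels are what get drawn as the white contour lines.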
Tracking data from the depth camera is used to generate MIDI values. Users can trigger sounds by merging their contours with the virtual architecture: a user's movement and their position relative to certain architectural features are translated into sound.
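One plausible mapping from tracking data to MIDI, sketched here with illustrative assumptions (the note range and the choice of centroid position and overlap ratio as inputs are not specified by the installation):

```python
def contour_to_midi(x_norm, overlap_ratio, low_note=36, high_note=84):
    """Map tracking data to a MIDI (note, velocity) pair.

    x_norm: horizontal position of the contour centroid, 0.0-1.0.
    overlap_ratio: fraction of the contour merged with an
        architectural feature, 0.0-1.0.
    Both the note range and this mapping are assumptions for
    illustration; any tracking feature could drive any parameter.
    """
    note = low_note + round(x_norm * (high_note - low_note))
    velocity = round(overlap_ratio * 127)  # MIDI velocity is 0-127
    return note, velocity

# Centroid at mid-screen, half the contour merged with a wall line:
print(contour_to_midi(0.5, 0.5))  # → (60, 64)
```

The resulting note/velocity pairs would then be sent to a synthesizer over a standard MIDI connection.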
The augmented environment itself reacts to audio input: sounds captured by a microphone are added as noise to the architectural line structure.
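A sketch of this audio-reactive noise, under the assumption that loudness is measured as the RMS of the microphone buffer and used to jitter the line vertices (the scaling constant and jitter model are illustrative, not the installation's actual code):

```python
import math
import random

def rms(samples):
    """Root-mean-square amplitude of an audio buffer (-1.0..1.0)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def add_noise(points, samples, max_jitter=10.0, rng=random):
    """Displace each (x, y) line vertex by a random offset whose
    magnitude scales with the captured loudness."""
    j = rms(samples) * max_jitter  # 0.0 (silence) .. max_jitter
    return [
        (x + rng.uniform(-j, j), y + rng.uniform(-j, j))
        for x, y in points
    ]

# Silence leaves the architectural lines untouched:
line = [(0.0, 0.0), (100.0, 0.0)]
print(add_noise(line, [0.0] * 512))  # → [(0.0, 0.0), (100.0, 0.0)]
```

Louder input shakes the lines harder, which is what closes the feedback loop described below: visitors make sound, the sound deforms the projection, and the deformed projection invites more movement.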
Human interaction creates a feedback loop between visuals and audio, bringing the virtual representation of the space to life.
The installation can be extended into an augmented audiovisual dance performance, in which dancers explore the possibilities of being part of the projection and interacting with the architecture.
What is needed
- power source
- video projector, min. resolution 1024x768
- VGA/DVI cable to projector
- tripod for projector/Kinect mounting
- microphone, XLR cable
- tripod for microphone
- stereo audio speakers