Sensors & Experimental Realtime Control Mechanisms

  • Lecture
This hour-long talk explores methods for controlling generative sound and visuals in live, real-time performances and interactive experiences. I will show in-depth examples of full-dome performances I've created using Mi.Mu gloves to control the audio and visual elements live, as well as interactions and live control using the Leap Motion, Kinect, Muse EEG headset, Vive, and other devices and sensors suited to real-time control. We will also compare these sensors, examine the situations where each works best, and explore creative ways they can be combined. I will share personal insight from my experience working with a range of control mechanisms for audio and visual creation and manipulation, in live performance as well as interactive audiovisual installations.

https://www.synthestruct.com/

Duration (minutes)

60

What is needed

Nothing

What the artist brings

Nothing

Authors

Synthestruct

Winter Park, United States
