Today was the most advanced work with Isadora to date. We have begun exploring external sensors as a way of influencing media elements and actors in the program. The two sensors we worked with were sound and video, using the internal microphone on our computer and a video camera attached to the system by FireWire.
Through the Sound Level Watcher we were able to create both Continuous Controllers and Triggers actuated by the volume of our voices. We also explored the Sound Frequency Watcher, which let us specify a certain pitch of sound to actuate actions.
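For anyone without Isadora handy, here is a rough analogy of that sound patch in Python (not Isadora itself); it assumes the sounddevice and numpy packages are installed, and the threshold value is a made-up number you would tune by ear. The running level plays the role of the Continuous Controller, and the threshold check stands in for a Trigger.

```python
import numpy as np
import sounddevice as sd

THRESHOLD = 0.1  # hypothetical trigger level, tuned to the room and microphone

def audio_callback(indata, frames, time, status):
    # RMS amplitude of this block of samples ~ the continuous "sound level"
    level = float(np.sqrt(np.mean(indata ** 2)))
    print(f"level: {level:.3f}")
    if level > THRESHOLD:
        print("trigger!")  # stands in for a triggered action on a media element

# Listen to the default microphone for ten seconds
with sd.InputStream(channels=1, callback=audio_callback):
    sd.sleep(10_000)
```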
With our video camera live-capturing the space, we created a patch of four actors that turned the camera input into a motion sensor.
The Video In Watcher captured the live feed. That feed was converted into a contrast image using the Difference actor. The variance in that contrast (dark vs. bright) was measured by the Calculate Brightness actor, which was calibrated to the space. The values generated by this actor were constrained to a fixed range by a Limit Scale Value actor and a Smoother to create the variables that influence the media assets attached to the patch. With this in place, the intensity of a person's movement in front of the camera is read and converted into a variable that can influence a piece of media.
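The same chain can be sketched outside Isadora. Below is a minimal Python/OpenCV version of the idea, assuming the opencv-python and numpy packages are installed; the camera index, the 0-50 clamp range, and the smoothing factor are placeholders standing in for the calibration we did in the space, not values from the actual patch.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)        # live feed, like the Video In Watcher
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

smoothed = 0.0
ALPHA = 0.1                      # smoothing factor, like the Smoother actor

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)                  # "Difference": what changed between frames
    prev_gray = gray
    brightness = float(np.mean(diff))                    # "Calculate Brightness" of the change
    scaled = float(np.clip(brightness, 0.0, 50.0)) / 50  # "Limit Scale Value": clamp to 0-1
    smoothed = ALPHA * scaled + (1 - ALPHA) * smoothed   # smooth the value over time
    print(f"movement intensity: {smoothed:.2f}")         # this number would drive a media parameter

    cv2.imshow("feed", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

The single smoothed number is the whole point: the more someone moves in front of the camera, the higher it climbs, and that value can be routed to any property of a piece of media.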
After learning the technology, we spent the afternoon developing a project based on a free write followed by an exploration of those words through movement, using our left brains to create the concept and our right brains to create the tools.
I was energized by this approach and dove back into my work with interesting ideas that kept me going until 12:30 AM.