Myo sensors tracked accelerometer, gyroscope, and orientation data, which was streamed via Bluetooth to an iPad at each musician's station and used to augment each musician's artistic expression.
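To give a sense of the data involved, here is a minimal sketch of one sensor frame and the kind of smoothing typically applied before motion data can drive anything expressive; the field names, units, and smoothing constant are assumptions for illustration, not the actual sensor protocol:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorFrame:
    """One frame of wearable-sensor data (hypothetical layout)."""
    accel: Tuple[float, float, float]               # acceleration in g
    gyro: Tuple[float, float, float]                # angular velocity in deg/s
    orientation: Tuple[float, float, float, float]  # quaternion (w, x, y, z)

def smooth(prev, new, alpha=0.2):
    """Exponential moving average to tame sensor jitter before mapping."""
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

frame = SensorFrame(accel=(0.0, 0.1, 0.98), gyro=(3.2, -1.1, 0.4),
                    orientation=(1.0, 0.0, 0.0, 0.0))
steady_accel = smooth((0.0, 0.0, 1.0), frame.accel)
```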
I developed a standalone iPad application that displayed and triggered music for each musician and relayed their sensor data to the network.
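A relay of this kind often amounts to serializing each sensor frame and forwarding it to a central machine over the local network. The sketch below uses JSON over UDP purely as an illustration; the app's actual transport, address, and message shape are assumptions, not documented details:

```python
import json
import socket

# Hypothetical relay target: the machine running the mapping system.
RELAY_ADDR = ("192.168.1.10", 9000)

def relay_frame(sock: socket.socket, musician_id: str, frame: dict) -> None:
    """Serialize one sensor frame and forward it over UDP."""
    payload = json.dumps({"id": musician_id, **frame}).encode("utf-8")
    sock.sendto(payload, RELAY_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
relay_frame(sock, "violin-1", {"accel": [0.0, 0.1, 0.98]})
```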
My team and I designed and built an entirely new, versatile networked mapping and control system, Hyperproduction: a node-based graphical programming language that can control, map, and route hundreds of inputs to hundreds of outputs across any number of devices and systems.
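A node-based mapping language of this kind boils down to a dataflow graph: each node pulls values from its upstream connections, transforms them, and exposes the result downstream. The toy sketch below shows the underlying idea only; it is not Hyperproduction's actual API:

```python
class Node:
    """Minimal dataflow node: pull inputs, compute, expose an output."""
    def __init__(self, fn):
        self.fn = fn
        self.inputs = []  # upstream nodes

    def connect(self, upstream):
        self.inputs.append(upstream)
        return self

    def evaluate(self):
        args = [n.evaluate() for n in self.inputs]
        return self.fn(*args)

# Example mapping: scale a gyro magnitude (deg/s) into a 0-255 light level.
source = Node(lambda: 140.0)
scaled = Node(lambda x: min(255, x * 1.5)).connect(source)
print(scaled.evaluate())  # 210.0
```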
Working under Peter Torpey's direction, I helped design and program the pixel mapping for the LED strips situated behind each musician. This mapping fed into Render Designer, a custom piece of software that maps any visual medium across a pixel array.
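Pixel mapping in this sense means assigning each physical LED a coordinate in a shared visual canvas, then sampling the canvas at those coordinates every frame. The sketch below illustrates the idea for one vertical strip; the coordinate scheme is an assumption for illustration, not Render Designer's actual interface:

```python
def strip_coords(n_pixels, x, y0, y1):
    """Assign each pixel of a vertical strip a (u, v) position in canvas space."""
    return [(x, y0 + (y1 - y0) * i / (n_pixels - 1)) for i in range(n_pixels)]

def sample(canvas, coords):
    """Sample a 2D image (rows of RGB tuples) at each pixel's mapped position."""
    h, w = len(canvas), len(canvas[0])
    return [canvas[min(h - 1, int(v * h))][min(w - 1, int(u * w))]
            for u, v in coords]

# A 10x10 gradient canvas, mapped onto a 60-pixel strip at x = 0.25.
canvas = [[(row * 25, 0, col * 25) for col in range(10)] for row in range(10)]
coords = strip_coords(60, x=0.25, y0=0.0, y1=1.0)
frame = sample(canvas, coords)  # 60 RGB values for one strip
```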
Each musician performed on their own platform, fitted with two speakers, an iPad, a network connection, and two LED strips that changed color, intensity, and motion in response to their playing.
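That response can be pictured as a mapping from a playing-intensity level and a motion phase to per-pixel colors along the strip. The hue and brightness curves below are illustrative choices, not the production mapping:

```python
import colorsys

def playing_to_rgb(level, motion_phase):
    """Map a 0-1 playing intensity and a 0-1 motion phase to one RGB pixel.
    Louder playing shifts hue toward warm tones and raises brightness; the
    phase offset lets the color travel along the strip (assumed behavior)."""
    hue = (0.6 - 0.5 * level + motion_phase) % 1.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 0.2 + 0.8 * level)
    return int(r * 255), int(g * 255), int(b * 255)

# One frame for a 60-pixel strip at a moderately loud moment.
strip = [playing_to_rgb(0.7, i / 60) for i in range(60)]
```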