For the production run with the Dallas Opera, the performance was simulcast to several other venues. To augment the remote viewer's experience, a companion mobile application was developed that let remote audiences interact with the live show.
Producing the opera involved many complex technical systems, including sensors, speakers, consoles, robots, and lights.
The companion mobile application featured many interactive elements throughout the performance, including video, randomized animations, pluckable harp strings, interactive robot models, and the ability to upload a Facebook profile into 'the system'.
The MIT Media Lab team designed custom mapping, animation, and control software for the walls, chandelier, and audio systems.
In addition to the mobile applications (for iOS and Android), software was developed to monitor all the mobile connections, trigger specific and randomized events on all devices, and manage globally distributed content.
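The monitoring-and-triggering role described above can be sketched as a small control server that tracks connected devices and pushes cue messages to all of them. This is an illustrative sketch only: the class and method names (`ShowController`, `trigger_all`, `trigger_randomized`) are hypothetical and not the production system's actual API.

```python
import json
import random

class ShowController:
    """Illustrative sketch of a show-control server: it keeps a
    registry of connected devices and pushes cue events to them.
    All names here are assumptions, not the real system's API."""

    def __init__(self):
        # device_id -> callable that delivers a message to that device
        self.devices = {}

    def register(self, device_id, send):
        """Track a newly connected device."""
        self.devices[device_id] = send

    def unregister(self, device_id):
        """Drop a device that has disconnected."""
        self.devices.pop(device_id, None)

    def trigger_all(self, cue_name):
        """Send the same predetermined cue to every connected device."""
        msg = json.dumps({"type": "cue", "name": cue_name})
        for send in self.devices.values():
            send(msg)

    def trigger_randomized(self, variants):
        """Send each device a randomly chosen variant of a cue, so
        different audience members see different content."""
        for send in self.devices.values():
            msg = json.dumps({"type": "cue", "name": random.choice(variants)})
            send(msg)
```

In a real deployment the `send` callbacks would wrap live network connections (e.g. sockets or push notifications); here they are left abstract so the triggering logic stands alone.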
All of the interactive mobile moments were predetermined, cached on each device, and triggered dynamically during the performance. Many of the triggers were randomized, resulting in a unique experience for every audience member.