Co-Opera presented a series of short, experimental operas for one night only on April 12th, 2015:
CO-OPERA is a collaborative opera production involving Carnegie Mellon University students from various parts of the university as well as community partner Pittsburgh Opera. Students, alumni, and faculty from the School of Music, School of Art, School of Drama, and Master of Arts Management Program are working together to present a production on the cutting edge of opera, at the intersection of arts and technology.
I built a system to perform live visuals using only simple shapes and my hands for a new short opera, The Elephants:
The Elephants (A Modern Fable)
(Xiao Liang; librettists Savannah Reich and Jonah Eisenstock)
This opera is a short tale of a group of elephants' interaction with the humans who live nearby, and how modern advances have changed that relationship.
I approached this project with the conceit that, although I'm not an opera singer, I wanted to find a way to perform along with the musicians and singers. My initial idea was to build a visual system controlled by my hands, since a hand-driven system could deliver far more nuanced interaction, in far less development time, than a purely procedural backdrop. The director preferred abstraction over representational imagery of "elephants" and "humans", so a visual leitmotif was chosen: soft, round shapes represented the elephants, while the humans were sharp and angular. This approach was influenced by the effective simplicity of Norton Juster's The Dot and the Line: A Romance in Lower Mathematics as animated by Chuck Jones and MGM:
The forms of the shapes in The Elephants come from the chosen materials, including foam chunks and stuffed spheres, which are sliced in three-dimensional space into virtual cross sections drawn to the screen by an infrared depth-based vision system. This approach was useful for a number of reasons:
- cheap materials yield a large number of shapes
- dynamic movement and interaction come directly from physically manipulating the materials
- the materials leave room for experimentation with performance styles during rehearsals
- the infrared depth-based vision system works completely in the dark, perfect for live use from an upper balcony during the show
Since the piece was performed live, I was able to react to the performance and modify the visuals as it unfolded. The movement and interaction of the shapes complemented the movement on stage, reinforcing the action as well as providing a mood backdrop through contrasting color.
The system consists of custom software built using openFrameworks and a Microsoft Kinect depth-sensing camera. A custom live-editing system allowed quick per-scene customization during and after rehearsals.