Singing Electric Amoebae from Outer Space
For the final show I worked with three other people to create "Singing Electric Amoebae from Outer Space," an interactive music installation. The description:
Step into the cosmic petri dish and become the last of a musical microbial race from the ancient future. Capture particles for percussive pings and stretch your flagella for melodic tones.
The piece was sort of a study for all of us - we wanted to build something that let each of us explore a topic we were interested in:
- I was interested in interactive music
- Vinni was interested in computer vision
- Milos was interested in blob tracking and openFrameworks
- James was interested in creating a full-body, gestural interaction
Because of this, it was less of a conceptual installation and more of a camel ("a horse designed by committee"). But that was fine.
I didn't do a great job of documentation as we built or during the show (something I'm learning should be a priority for every project), but here's a little snapshot:
We used a Kinect and a blob tracking algorithm to identify where people stood in the space and the width of their "wing span."
Because the Kinect captures 3D data, we were able to create a virtual camera that, even though the physical camera sat in a corner of the space, rendered images as if it were mounted directly above the play space.
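The top-down trick amounts to taking the Kinect's 3D points, dropping the height axis, and binning the remaining floor coordinates into an overhead occupancy image. Here's a minimal sketch of that idea; the function name and parameters are my own for illustration, not from our actual openFrameworks code:

```python
def project_top_down(points, floor_size, grid_size):
    """Collapse 3D points (x, y, z) into a 2D occupancy grid as seen
    from directly above: ignore the height (y) axis and bin x/z.

    points: iterable of (x, y, z) tuples in meters
    floor_size: side length of the square play space, in meters
    grid_size: resolution of the output grid (grid_size x grid_size)
    """
    grid = [[0] * grid_size for _ in range(grid_size)]
    scale = grid_size / floor_size
    for x, y, z in points:
        col = int(x * scale)  # left-right position on the floor
        row = int(z * scale)  # depth becomes the vertical image axis
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row][col] += 1
    return grid
```

A blob tracker can then run on this synthetic overhead image instead of the raw corner-mounted camera view, so two people standing one behind the other don't merge into a single blob.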
As people danced or stretched their arms, their amoeba grew larger and their musical octave range increased; once their arms passed a certain width threshold, they could play melodies by moving their arms up and down or in and out.
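The mapping worked roughly like this: wing span sets how many octaves you have to play with, and arm height picks a note within that range, quantized to the scale. A hedged Python sketch of that mapping (the span thresholds, scale, and function name are illustrative assumptions, not values from our Max patch):

```python
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def arm_to_note(span_m, height_norm, scale=C_MAJOR, base_midi=48,
                min_span=0.5, max_span=2.0, max_octaves=3):
    """Map a player's wing span (meters) to an available octave range,
    then map normalized arm height (0..1) to a scale degree in it."""
    # Wider span -> more octaves available (1..max_octaves)
    t = (span_m - min_span) / (max_span - min_span)
    t = max(0.0, min(1.0, t))
    octaves = 1 + int(t * (max_octaves - 1))
    # Quantize arm height to one of the scale degrees in range
    degrees = len(scale) * octaves
    idx = min(int(height_norm * degrees), degrees - 1)
    octave, degree = divmod(idx, len(scale))
    return base_midi + 12 * octave + scale[degree]
```

Quantizing to the scale is what keeps flailing arms sounding musical: any gesture lands on a "correct" note, so players get satisfying feedback without needing any skill.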
I didn't shoot any video during the actual show of people playing with the piece (I'm kicking myself). But here's a little video from before the show, while we were still setting up. You can sort of get a picture of how it worked:
Update 8/5: Here's a video (still not great quality) from the show. You can also see our Singing Graffiti piece in the far corner:
I wrote the interactive music software in Max 6. The music was all based on arpeggios within a certain scale. As each person joined the floor, they received a musical piece of their own. Their movements affected the structure and range of the arpeggios.
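The "arpeggios within a scale" idea is simple to show outside of Max: each player gets a triad built on a different scale degree, broken into an ascending pattern across however many octaves their movement has unlocked. A small sketch, with names and the degree-per-player assignment as illustrative assumptions rather than a transcription of the patch:

```python
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def make_arpeggio(root_midi, degree, octaves, scale=C_MAJOR):
    """Build an ascending arpeggio on the triad (1-3-5) rooted at the
    given scale degree, repeated across the given number of octaves."""
    triad = [degree, degree + 2, degree + 4]  # stacked thirds in the scale
    notes = []
    for octv in range(octaves):
        for d in triad:
            o, step = divmod(d, len(scale))  # wrap degrees past the octave
            notes.append(root_midi + 12 * (octv + o) + scale[step])
    return notes

# One way to give each new player their own part: root their triad on
# a different degree of the shared scale, so the parts always harmonize.
def part_for_player(player_index, root_midi=60, octaves=1):
    return make_arpeggio(root_midi, player_index * 2 % 7, octaves)
```

Keeping every player's arpeggio inside one shared scale is what lets strangers "jam" together: whatever combination of people is on the floor, the parts can't clash harmonically.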
Here's a taste of how the music sounded. In the video I'll simulate people coming onto the screen by clicking the check boxes (like "blob1On") that turn on when a new person walks into the playspace.
We also had little floating particle bits that players could grab by running into them. You'll hear two sound effects that played when you gobbled up a particle bit. I'll also go over to a little slider to demonstrate the music people made as they stretched their arms and moved them up and down:
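Grabbing a particle is just an overlap test between the player's blob and each floating particle, checked every frame. A minimal sketch of that check, assuming circular blobs and particles (the names and the pickup margin are hypothetical):

```python
def captured(blob_center, blob_radius, particles, hit_radius=0.1):
    """Return the particles a player's blob has run into, using a
    simple circle-overlap test. hit_radius is an illustrative pickup
    margin that makes particles a little easier to grab."""
    hits = []
    for px, py in particles:
        dx = px - blob_center[0]
        dy = py - blob_center[1]
        # Compare squared distances to avoid an unnecessary sqrt
        if dx * dx + dy * dy <= (blob_radius + hit_radius) ** 2:
            hits.append((px, py))
    return hits
```

Each frame, any particle returned here would be removed from the field and a capture sound effect triggered.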
Striking a balance between melodic subtlety and obvious, satisfying feedback for the players was a challenge. I'm learning that any interaction in a space like this has to be exaggerated. There are reams of studies on how to execute this sort of thing well. That's one thing we didn't spend a lot of time on in the School of Machines - we learned how to conceptualize and how to build, but less about how to test and refine.
There was a photographer and videographer at the show, so I'm hoping someone captured video of people playing with the piece.
We also set up the Singing Graffiti on another wall. I think I should sell this to a karaoke bar.