How should dancers interact with music?

Exploring how to use ML for non-surveillance purposes

Reimagining music + dance 

In this project, the dancer generates the music through her movement. Traditionally, live music is mixed on a MIDI keyboard; I wanted to experiment with a new form of interaction for music creation and with the role of the dancer. Through body tracking with OpenCV, her different poses let her mix different beats and sounds, allowing her movements to control the piece.
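As a rough illustration of the idea (not the project's actual code), the tracked position of the dancer can be mapped to a set of active sound layers. The zone boundaries and layer names below are hypothetical; the sketch assumes the tracker reports a single point normalized to [0, 1] in frame coordinates.

```python
def pose_to_layers(x, y):
    """Map a normalized tracked position to the set of active sound layers.

    Zones are illustrative: the left half of the frame toggles the drum
    loop, the right half the melody, and the upper third layers an
    ambient pad on top. Note y grows downward in OpenCV image coordinates,
    so small y means the top of the frame.
    """
    layers = set()
    layers.add("drums" if x < 0.5 else "melody")
    if y < 1 / 3:  # dancer's tracked point is in the upper third
        layers.add("pad")
    return layers
```

In a live setup, a loop would read frames from the camera, run the body tracker, feed each detected position through a mapping like this, and fade the corresponding audio stems in and out.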

Behind the scenes

When starting this project, I sampled music and edited sound bites to create the main musical elements the dancer would be mixing. I then created the visuals, which I VJed during her performance. To push myself, I built all the visual effects with JavaScript animation libraries; the visual images themselves were made in Illustrator.

To bring this project to life, I had to work very closely with my user: the dancer. It took a great deal of iteration on the code's functionality to adjust to her needs and her process.

I also had to build up my vocabulary for the technology so I could properly communicate what was happening "under the hood" and collaborate with her on the appropriate adjustments.