Interactive Experimental Design


Generating music through movement



See how I leverage AI to create a new interactive product that allows dancers to generate music through their movement.

Watch at the 1:00 min mark.















How could dancers generate music through their movement?





While working on a project with dancers, I spent a lot of time learning about dance from a theoretical perspective. Many of the dancers I worked with were learning how to create their own music so they could have more creative control, especially around rhythm and speed.

Rather than having dancers simply react to the music, I wondered: what if the music itself were another element of expression dancers could control for their art?







Design Process



How can I track movement easily?
What emerging technologies can I leverage quickly?

How does live music creation work?
What are the live music mechanisms that would need to be integrated with movement?

How will I combine these?
How can I integrate my research findings into a prototype that can be used during a performance?







How can we track movement?



PoseNet uses computer vision to detect human figures in images and videos, so we can estimate where key body points (wrists, elbows, knees, and so on) are located.

PoseNet runs on TensorFlow.js, which means anyone (like me) can run it directly in a web browser.
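As a rough sketch of how little code this takes in the browser (the element and function names here are illustrative, not the exact code from the project):

```js
// Minimal sketch: load PoseNet and estimate where body points are in a video frame.
// Assumes the TensorFlow.js and PoseNet scripts are already loaded on the page.
async function trackPose(video) {
  const net = await posenet.load();                  // load the pretrained model
  const pose = await net.estimateSinglePose(video, {
    flipHorizontal: true                             // mirror so the output matches the dancer's view
  });
  // Each keypoint has a part name ("rightWrist", "leftAnkle", ...),
  // an (x, y) position in pixels, and a confidence score.
  pose.keypoints.forEach(({ part, position, score }) => {
    if (score > 0.5) console.log(part, position.x, position.y);
  });
  return pose;
}
```

Because everything runs client-side, there is nothing to install beyond loading the scripts on the page.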










How is live music created?



I found that MIDI boards are a simple tool for live beat creation.

On this board, each button triggers an uploaded sound. Musicians hit the buttons in different patterns to create live music.

Design callout: this MIDI board uses a grid layout, which makes button positions easy to recall.
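For a sense of how the same idea translates to the browser, here is a rough sketch of a pad grid where each cell triggers one sound through the Web Audio API (the grid size and file names are placeholders):

```js
// Sketch of a MIDI-board-style grid: each cell maps to one uploaded sound.
const audioCtx = new AudioContext();
const GRID = 4;              // 4x4 grid of pads, like a typical pad controller
let padBuffers = [];

// Load one placeholder sample per pad (file names are hypothetical).
async function loadPads() {
  padBuffers = await Promise.all(
    Array.from({ length: GRID * GRID }, (_, i) =>
      fetch(`sounds/pad-${i}.wav`)
        .then(res => res.arrayBuffer())
        .then(data => audioCtx.decodeAudioData(data))
    )
  );
}

// "Hitting" pad (row, col) plays its sound, like pressing a button on the board.
function hitPad(row, col) {
  const source = audioCtx.createBufferSource();
  source.buffer = padBuffers[row * GRID + col];
  source.connect(audioCtx.destination);
  source.start();
}
```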















Combining PoseNet x MIDI board



I used a Kinect to capture a video feed of the dancer’s body and drew it onto an HTML canvas. I then used PoseNet to track the dancer’s body parts.
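Roughly, the per-frame loop looked like this (a simplified sketch; it assumes the Kinect feed is available as an ordinary video element, and the element ids are illustrative):

```js
// Sketch of the per-frame loop: draw the video frame to the canvas,
// then ask PoseNet where the dancer's wrist is in that frame.
const canvas = document.getElementById('stage');    // illustrative element id
const ctx = canvas.getContext('2d');

async function loop(net, video) {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
  const wrist = pose.keypoints.find(k => k.part === 'rightWrist');
  if (wrist && wrist.score > 0.5) {
    checkButtons(wrist.position);   // hit test against the on-screen buttons (sketched below)
  }
  requestAnimationFrame(() => loop(net, video));
}
```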



On the HTML canvas, I created a set of buttons arranged in a grid, similar to a MIDI board.

I set up simple code to detect whether a tracked body part “hit” one of the buttons on the screen; when it did, the corresponding music would play.

I used Ableton Live and sample beats, tying each beat to a different button.
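The hit test itself is just a point-in-rectangle check over the grid. A simplified sketch (button positions are illustrative, and this version triggers the Web Audio pads from the earlier sketch, whereas the actual setup triggered Ableton Live sounds):

```js
// Sketch of hit detection: each on-screen button is a rectangle on the grid,
// and a "hit" happens when the tracked wrist point falls inside one of them.
const buttons = [
  // Illustrative positions; in practice these were placed together with the dancer.
  { x: 50,  y: 100, w: 120, h: 120, row: 0, col: 0 },
  { x: 200, y: 100, w: 120, h: 120, row: 0, col: 1 },
  // ...more buttons to fill out the grid
];

function checkButtons(point) {
  for (const b of buttons) {
    const inside =
      point.x >= b.x && point.x <= b.x + b.w &&
      point.y >= b.y && point.y <= b.y + b.h;
    if (inside) hitPad(b.row, b.col);   // trigger the sound mapped to that pad
  }
}
```

In practice the buttons also need a small debounce so a sound does not retrigger on every frame while the wrist stays inside the same button.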







Setting it up



After setup came iteration. Working with the dancer in real time, I figured out where the buttons should be placed.

The grid layout was easy for the dancer to recall.

We experimented with different body parts and landed on her wrist as the best point to track.