Generating music through movement
Interactive Experimental Design
Role: Designer & Engineer
Skills: User Testing, Prototyping, HTML/CSS, JavaScript
Collaborators: Dancers
TL;DR: Discovering new applications for body-tracking AI
How can dancers create music with their movements?
One of the hardest parts of a dancer’s creative process is finding the right song.
As a result, many dancers learn how to produce their own music to gain more creative control and freedom.
But what if music were one more expressive element dancers could control? What if dancers could create music with their movements?
Design-ish Process
Research:
How can I track movement easily?
How does live music creation work?
Define & Design:
How can I integrate my research findings to design a prototype?
Test:
Have a dancer use the prototype in a live performance in front of an audience.
How can we determine movement?
PoseNet uses computer vision to detect human figures in images and video, estimating where each body point is.
PoseNet runs on TensorFlow.js, which means anyone (like me) can use it right in a web browser.
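As a rough sketch (not my exact code), loading PoseNet with the @tensorflow-models/posenet package and reading one keypoint looks something like this; videoElement stands in for a <video> showing the camera feed:

```js
import * as posenet from '@tensorflow-models/posenet';

async function findWrist(videoElement) {
  // Load the model once, then estimate a single pose from any image or <video> element.
  const net = await posenet.load();
  const pose = await net.estimateSinglePose(videoElement, { flipHorizontal: true });

  // pose.keypoints holds 17 named body points with pixel positions and confidence scores,
  // e.g. { part: 'rightWrist', position: { x: 310, y: 420 }, score: 0.93 }
  return pose.keypoints.find(k => k.part === 'rightWrist');
}
```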

I saw that MIDI boards were a simple tool for live beat creation.
On a MIDI board, each button triggers an uploaded sound. Musicians hit the buttons in different patterns to create live music.
Design call-out:
The MIDI board's grid layout makes button positions easy to recall.
Combining PoseNet x MIDI Board
I used a Kinect to stream a video feed of the dancer onto an HTML canvas.
I then used PoseNet to track a single body part on the dancer.
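Schematically, each animation frame redrew the feed and looked up the tracked point. In this sketch, video, canvas, ctx, net, and the checkButtons helper are all assumed to be set up as shown above and below:

```js
async function onFrame() {
  // Mirror the camera feed onto the canvas, then estimate the dancer's pose from it.
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const pose = await net.estimateSinglePose(canvas, { flipHorizontal: false });

  // Only react when PoseNet is reasonably confident about the wrist's position.
  const wrist = pose.keypoints.find(k => k.part === 'rightWrist');
  if (wrist && wrist.score > 0.5) checkButtons(wrist.position);

  requestAnimationFrame(onFrame);
}
```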

On the HTML canvas, I overlaid a grid of buttons, similar to a MIDI board.
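A minimal sketch of that grid (the 4x2 layout and MIDI note numbers are illustrative assumptions):

```js
// Build a cols x rows grid of button rectangles covering the canvas.
// Each button carries the MIDI note it should trigger (36 is a common first drum pad).
function makeButtonGrid(cols, rows, width, height) {
  const buttons = [];
  const w = width / cols;
  const h = height / rows;
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      buttons.push({ x: col * w, y: row * h, w, h, note: 36 + buttons.length });
    }
  }
  return buttons;
}

const buttons = makeButtonGrid(4, 2, canvas.width, canvas.height);
```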
I set up simple code to detect when that body part "hit" one of the buttons on the screen; on a hit, the corresponding music would play.
I used Ableton Live with sample beats, mapping each one to a different button.
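My exact wiring isn't shown here, but a hit test plus a note-on sent to Ableton Live over the browser's Web MIDI API (through a virtual MIDI port, such as macOS's IAC bus) could look like this:

```js
// Returns the button (if any) whose rectangle contains the tracked point.
function hitButton(buttons, point) {
  return buttons.find(b =>
    point.x >= b.x && point.x <= b.x + b.w &&
    point.y >= b.y && point.y <= b.y + b.h);
}

const midi = await navigator.requestMIDIAccess();
const output = [...midi.outputs.values()][0]; // assumed: a virtual port routed into Ableton

function checkButtons(point) {
  const hit = hitButton(buttons, point);
  if (hit) output.send([0x90, hit.note, 100]); // note-on: Ableton plays the clip mapped to this pad
}
```

In practice you would also debounce this so a button doesn't retrigger on every frame the wrist stays inside it.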

Setting it up
After setup came iteration. Working with the dancer in real time, I figured out where the buttons should be placed.
The grid layout was easy for the dancer to recall.
We experimented with tracking different body parts, but her wrist worked best.

Live performance
Results
The performance was clearly compelling for the audience. But the best part was that each performance was unique, revealing how much more control the dancer had over her process.
In the future...
It would be nice to use a camera that could follow the dancer, removing the limits on how much of the stage she could use.