Interactive Experimental Design
Generating music through movement
Role: Designer & Engineer
Skills: User Testing, Prototyping, HTML/CSS, JavaScript
Collaborators: Dancers
TL;DR: Discovering new applications for body-tracking AI
How can dancers create music with their movements?
While working on a project with dancers, I spent a lot of time learning about dance from a theoretical perspective. Many of the dancers I worked with were learning how to create their own music so they could have more creative control, especially around rhythm and speed.
Rather than reacting to the music, I wondered: what if the music were another element of expression the dancers could control as part of their art?
Design-ish Process
01
How can I track movement easily?
What emerging technologies can I leverage quickly?
02
How does live music creation work?
What are the live music processes/tools that would need to be integrated with movement?
03
How will I combine these?
How can I integrate my research findings to design a prototype to be used during a performance?
Research
How can we determine movement?
PoseNet uses computer vision to detect human figures in images/videos so that we can determine where a body point may be.
PoseNet runs on TensorFlow.js, which means anyone (like me) can run this technology from a web browser.
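As a rough sketch of what that looks like in code (assuming the browser globals from the script-tag builds of TensorFlow.js and PoseNet, and a video element carrying the camera feed):

```js
// A minimal sketch: load the pre-trained PoseNet model once, then estimate a
// single pose from a <video> element. Each keypoint ('rightWrist', 'nose', ...)
// comes back with an (x, y) position and a confidence score.
async function trackBody(video) {
  const net = await posenet.load();
  const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
  return pose.keypoints;
}
```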
I found that MIDI boards are a simple tool for live beat creation.
On this kind of board, each button triggers an uploaded sound; musicians hit the buttons in different patterns to create live music.
Design callout: this MIDI board uses a grid layout, which makes the buttons easy to recall.
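To mirror that grid in the browser, each pad can be represented as a cell tied to a sound. A minimal sketch, where the coordinates and sample file names are placeholders rather than the beats used in the performance:

```js
// A 2x2 grid of "pads" on the canvas, each tied to an audio sample.
// Positions and file names are illustrative only.
const pads = [
  { x: 40,  y: 40,  w: 150, h: 150, sound: new Audio('kick.wav')  },
  { x: 220, y: 40,  w: 150, h: 150, sound: new Audio('snare.wav') },
  { x: 40,  y: 220, w: 150, h: 150, sound: new Audio('hat.wav')   },
  { x: 220, y: 220, w: 150, h: 150, sound: new Audio('clap.wav')  },
];
```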
Combining PoseNet x MIDI Board
I used a Kinect to stream a video feed of the dancer’s body onto an HTML canvas.
I then used PoseNet to track a body part on the dancer.
On the HTML canvas, I created a set of buttons laid out on a grid, similar to a MIDI board.
I set up simple code to detect when that body part “hit” one of the buttons on the screen; when it did, the music played.
I used Ableton Live and sample beats, tying them to the different buttons (a simplified sketch of the full loop is below).
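Putting it together, the per-frame loop looks roughly like this. It’s a minimal sketch that assumes the `net` and `pads` from the earlier snippets, tracks the right wrist, and plays a browser audio sample as a stand-in for triggering the Ableton Live clip:

```js
// Once per animation frame: estimate the pose, then check whether the tracked
// wrist falls inside any pad and trigger that pad's sound if it does.
async function frame(video, net) {
  const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
  const wrist = pose.keypoints.find(k => k.part === 'rightWrist');

  if (wrist && wrist.score > 0.5) {              // ignore low-confidence points
    for (const pad of pads) {
      const hit =
        wrist.position.x > pad.x && wrist.position.x < pad.x + pad.w &&
        wrist.position.y > pad.y && wrist.position.y < pad.y + pad.h;
      if (hit) pad.sound.play();                 // stand-in for the Ableton clip
    }
  }

  requestAnimationFrame(() => frame(video, net));
}
```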
Setting it up
After setup came iteration time. Working with the dancer in real time, I figured out where the buttons should be placed.
The grid layout was easy for the dancer to recall.
We experimented with tracking different body parts, but landed on her wrist as the best option.
Test
Live Performance
Results
It was clear the performance was compelling for the audience. But the best part was that each performance was unique, revealing how much more control the dancer had over her process.
In the future...
It would be nice to use a camera that could follow the dancer, removing the limits on how much of the stage she could use.