Company Post

The Making of Go Robot's Interactive Motion-Tracked Dance Installation

An interactive motion-tracking-based dance installation made with TouchDesigner during a 3-day intensive VR jam organized by Cinedans and Beamlab in Amsterdam, The Netherlands.

We were immediately intrigued by the above video, so we caught up with the highly skilled and productive brothers Tim and Roy Gerritsen of Go Robot to get all the juicy details. They very kindly provided us with the following detailed words and images. Read on and enjoy, there's a ton to learn here!

Within our newly formed studio (Go Robot.) we grabbed this opportunity with both hands. We love these kinds of events to play, experiment and develop new proofs of concept: a fun and really helpful way to lay a foundation for future client work. We wanted to create a framework to easily connect (GLSL shader) effects to tracked points in a virtual 3D space. In this case we used a Kinect sensor, but it is set up in such a way that we can easily support other tracking hardware like BlackTrax or OptiTrack motion capture.

We spent two days prior to the event setting up the framework and three days locked up in a theater in Amsterdam to build this installation. Frederik-Jan de Jongh joined us with all his audiovisual experience and we had a blast dancing like monkeys and trying things out. On the third day we asked the professional dancer Giacomo Della Marina to give it a try. We recorded two clips in which both Giacomo and Fred did two rounds of a two-minute improvisation; the second run is the one we used and put online.


We set up the architecture of the tool in such a way that it would be simple to hook up new effects and use the whole tool in a more design-driven way.

The biggest advantage of working with TouchDesigner is that the possibilities are endless. You can connect everything to everything and create custom parameters to take full control of every detail in your network. This, however, can also be a dangerous path to step onto: by not setting any limitations, we have found ourselves with too many options, getting lost in the possibilities far too many times. Our broader research at Go Robot is how to control this ocean of options in TouchDesigner: building complex systems that are easy to understand at their core, while still retaining the endless options for controlling your actual patch in a non-destructive, real-time way.

User Interface / Controller

We’re big fans of creating our own UI. The UI is a big part of the functionality of the tool and we always strive to make it as intuitive as possible. We cut back on options as much as possible to keep things to a minimum, and spent a lot of time researching external controller options to stay away from dragging sliders with the mouse.

The Kinect tracking data is pushed through a small UI where tracking points can be selected. The selection then toggles a custom parameter in the effect component. This simple element consists of a background TOP and several momentary buttons.

A nice technique we use a lot is to avoid toggle buttons and use only momentary ones. When a button is pressed, a script runs that toggles the targeted custom parameter. The background of the button is linked to the state of that parameter. That way you have two-way communication between the buttons and the target component: you won't get stuck in a situation where the custom parameter is set to on but the button says off. It is also a great way to use one UI element for multiple custom components. You just have to link to the right component and the button states are updated automatically.
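The pattern can be sketched outside TouchDesigner as well. Here is a minimal Python model of the idea; the `Effect` and `Button` classes are our illustrative stand-ins, not actual TouchDesigner classes (in a real network the press script would set a custom parameter on the target COMP, and the button background would be an expression reading that parameter):

```python
# Hedged sketch of the momentary-button pattern: the custom parameter on the
# target component is the single source of truth, the button only toggles it
# and *reads* its state back for display.

class Effect:
    """Stand-in for a target component holding a custom parameter."""
    def __init__(self):
        self.joint_selected = False   # the custom parameter being toggled

class Button:
    """Momentary button: no local toggle state of its own."""
    def __init__(self, target):
        self.target = target

    def press(self):
        # script run on press: flip the custom parameter on the target
        self.target.joint_selected = not self.target.joint_selected

    @property
    def background_on(self):
        # two-way link: background always reflects the target's state,
        # so button and parameter can never disagree
        return self.target.joint_selected

effect = Effect()
button = Button(effect)
button.press()
print(button.background_on)   # True

# The same button can be re-pointed at a different component;
# its displayed state updates automatically.
button.target = Effect()
print(button.background_on)   # False
```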

We’ve set up a template Base to serve as our plugin setup. All the incoming Kinect channels are already hooked up, so we could start building effects right away. The joint-selection UI mentioned above toggles the custom parameters.

Another technique we use a lot is a dynamically built "layer-selection" system. When an effect is added, the UI updates automatically. This is done using an OP Find DAT pointed at the location of the effect components. A Replicator then reads this DAT to generate toggle buttons (again momentary ones), each linked to the Render parameter of its effect component.
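Stripped of the TouchDesigner specifics, the mechanism looks roughly like this. The names (`EffectComp`, `find_effects`, `build_toggles`) are hypothetical; `find_effects` plays the role of the OP Find DAT and `build_toggles` that of the Replicator:

```python
# Illustrative sketch of the "layer-selection" idea: scan for effect
# components, then replicate one momentary toggle per effect, each
# driving that effect's Render flag.

class EffectComp:
    def __init__(self, name):
        self.name = name
        self.render = True           # the Render parameter the toggle drives

def find_effects(comps):
    # stand-in for the OP Find DAT: the current list of effect components
    return [c for c in comps if c.name.startswith('effect')]

def build_toggles(comps):
    # stand-in for the Replicator: one press handler per found effect
    toggles = {}
    for fx in find_effects(comps):
        def make_toggle(target):
            def press():             # momentary press handler
                target.render = not target.render
            return press
        toggles[fx.name] = make_toggle(fx)
    return toggles

scene = [EffectComp('effect_smoke'), EffectComp('effect_boxes')]
toggles = build_toggles(scene)       # re-run whenever an effect is added
toggles['effect_smoke']()            # press: the smoke layer stops rendering
```

Because the button list is regenerated from whatever components are found, dropping a new effect into the network is all it takes for it to show up in the UI.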


We started out by adding little volumetric light spheres at the positions of most body joints. We then added a bit of noise and gave it an orange/yellow color to get some nice warm light.

The amount of light rendered is calculated using some funky math that involves a quadratic equation to determine the intersection of the camera ray and the light sphere. This all happens after rendering the beauty and depth layers of the normal geometry. (figure 2)
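The quadratic in question is the standard ray/sphere intersection. Here is a minimal CPU sketch of it, assuming (as is common for volumetric spheres) that the light contribution along a ray is proportional to the chord length inside the sphere; the real version runs per pixel in GLSL and also clips against the rendered depth layer, which this sketch omits:

```python
import math

def ray_sphere_chord(origin, direction, center, radius):
    """Length of the ray segment inside the sphere (0 if the ray misses).
    Solves |origin + t*direction - center|^2 = radius^2 for t."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c           # discriminant of the quadratic
    if disc <= 0.0:
        return 0.0                       # ray misses (or just grazes) the sphere
    sqrt_disc = math.sqrt(disc)
    t1 = (-b - sqrt_disc) / (2.0 * a)    # entry point
    t2 = (-b + sqrt_disc) / (2.0 * a)    # exit point
    t1, t2 = max(t1, 0.0), max(t2, 0.0)  # clip the part behind the camera
    return t2 - t1

# A ray straight through the center of a unit sphere: chord length 2.
print(ray_sphere_chord((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0))  # 2.0
```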

Since we gave the volumetric light spheres a bit of an orange flavor, it only felt right to add some smoke to it. We built a particle system that emits new particles at the hand positions of the dancer. The particles moved slowly upwards over time, grew in size and decreased in alpha. When we again added some noise and gave them a greyish color, it started to look like the hands were on fire. (figure 3)
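The particle behavior described above can be sketched in a few lines. This is a simplified CPU model with made-up constants (rise speed, growth and fade rates), not the installation's actual network:

```python
import random

class Particle:
    def __init__(self, x, y, z):
        self.pos = [x, y, z]
        self.size = 0.05
        self.alpha = 1.0

def emit(hand_pos, count=4):
    # new particles appear at the dancer's hand, with a little jitter
    return [Particle(hand_pos[0] + random.uniform(-0.02, 0.02),
                     hand_pos[1],
                     hand_pos[2] + random.uniform(-0.02, 0.02))
            for _ in range(count)]

def update(particles, dt):
    alive = []
    for p in particles:
        p.pos[1] += 0.3 * dt       # drift slowly upward
        p.size += 0.1 * dt         # grow over time
        p.alpha -= 0.5 * dt        # fade out
        if p.alpha > 0.0:
            alive.append(p)        # cull fully faded particles
    return alive

smoke = emit((0.2, 1.1, 0.0))
for _ in range(60):                # one second at 60 fps
    smoke = update(smoke, 1 / 60)
```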

Yet we were looking for even more fireworks :). So we decided to start using 'Quark', our more complex particle system for TouchDesigner (which is still in development). This allowed us to emit even smaller particles. After playing around with it, we decided to set the emitters to the hands, feet and head position of the dancer. (figure 4)

We liked the idea of emitting particles from the different joints, though we needed something bigger than just little points flying around. We decided to build a similar emitter, but instead of emitting little particles, we were instancing boxes of different sizes. Since we're asking quite a lot of our computer, we implemented the instance placement on the GPU in a vertex shader. This allowed us to do some quick little hacks, such as setting the y position to 0 when it becomes negative, resulting in the feeling that the boxes collide with the ground and slide away. (figure 5)
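The ground hack is easy to show in a CPU sketch. In the real version this logic runs per instance in the GLSL vertex shader; the gravity and slide constants below are illustrative:

```python
def step_instance(pos, vel, dt, gravity=-9.8):
    """Advance one box instance by dt seconds (semi-implicit Euler)."""
    vx, vy, vz = vel
    x, y, z = pos
    vy += gravity * dt
    x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
    if y < 0.0:
        y = 0.0            # the hack: clamp to the ground plane...
        vy = 0.0
        vx *= 1.02         # ...and let sideways motion carry on, so the
        vz *= 1.02         # box appears to hit the floor and slide away
    return (x, y, z), (vx, vy, vz)

pos, vel = (0.0, 1.0, 0.0), (0.5, 0.0, 0.0)
for _ in range(120):       # two seconds at 60 fps
    pos, vel = step_instance(pos, vel, 1 / 60)
print(pos[1])              # 0.0: the box ended up sliding along the floor
```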

Finally, to create a bigger trail of the dancer, we built another instancing effect. This time we used a trail CHOP on the Kinect input as instancing positions and directions. We connected some other parameters, such as size and rotation, to the audio input. (figure 6)
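The Trail CHOP keeps a rolling window of recent channel samples; a deque plays that role in this hedged sketch, where recent joint positions become instance translates and an audio level drives size and rotation. The window length and the audio mapping are our assumptions:

```python
from collections import deque

TRAIL_LEN = 90                       # ~1.5 s of history at 60 fps
trail = deque(maxlen=TRAIL_LEN)      # stand-in for the Trail CHOP

def on_frame(joint_pos, audio_level):
    trail.append(joint_pos)          # newest sample in, oldest falls out
    instances = []
    for i, pos in enumerate(trail):
        age = 1.0 - i / TRAIL_LEN    # older samples sit earlier in the deque
        instances.append({
            'translate': pos,
            'scale': 0.1 + audio_level * (1.0 - age),  # audio drives size
            'rotate': audio_level * 360.0 * age,       # and rotation
        })
    return instances

for f in range(200):                 # simulate a hand moving along x
    inst = on_frame((f * 0.01, 1.0, 0.0), audio_level=0.5)
print(len(inst))                     # 90: one instance per trail sample
```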


For the music, audiovisual artist Fred 'Hototo' de Jongh created a TouchDesigner-based system in conjunction with Ableton Live, to have direct data communication with Tim and Roy's Kinect instance through Open Sound Control (OSC).

In TouchDesigner, all Kinect sensor data is reinterpreted in the audio realm and multiplied with an octaved chord from a couple of Audio Oscillator CHOPs, whose frequency and filtering are modulated by the positional data of the head, hands, and feet. This creates a spectral envelope of the overall sound that behaves symbiotically with the movement of the user. Audiovisual symbiosis is the main game here.
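A toy version of that mapping might look like this: joint height modulates the base frequency of a small stack of octaved oscillators. The base pitch, octave count, and mapping range are our assumptions, not the actual patch:

```python
import math

def octave_chord(base_freq, octaves=3):
    # the same pitch stacked in octaves: f, 2f, 4f, ...
    return [base_freq * (2 ** i) for i in range(octaves)]

def render_block(hand_height, n_samples=64, sample_rate=44100):
    """Render a short block of samples; hand_height in 0..1 sweeps the
    chord over one octave (110 Hz up to 220 Hz at the base)."""
    base = 110.0 * (2 ** hand_height)
    freqs = octave_chord(base)
    samples = []
    for n in range(n_samples):
        t = n / sample_rate
        s = sum(math.sin(2 * math.pi * f * t) for f in freqs) / len(freqs)
        samples.append(s)
    return samples

low = render_block(0.0)    # hand down: darker chord
high = render_block(1.0)   # hand up: the whole chord shifts up an octave
```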

This sound is further processed in Ableton Live, using Dante for audio-over-IP transfer. Together with a wavetable synthesizer and a sampled 909 drum kit, the audio is fed back through OSC to the interface, where it modulates several different shader sources while staying in complete sync with the Kinect data itself. This, together with the other TouchDesigner Kinect instance, creates a fully interactive audiovisual system.

Interested in working with us or just want to say hello?

Drop a note at: /