Vincent Houzé and Dave & Gabe Bring Wondrous Créatures to Life in the Dome

Créatures is the first collaboration between NYC-based artist Vincent Houzé and Brooklyn-based studio Dave & Gabe (Dave Rife and Gabe Liberti), who have respectively been driving forces in the development of fantastical GLSL particle shader systems and of complex interactive audio-visual projects built around spatial sound.

Born from Houzé's fascination with generating strange life forms, and incubated in Montreal through the SAT's creative programs for immersive works, Créatures is the extraordinary and very challenging first venture of the three artists into the wide (wide) world of the dome environment.

The result is wondrous and playful, with an ethereal, even extraterrestrial quality, and inspiring in so many ways! We spoke with the artists to learn more about their process in creating the work and their experience with TouchDesigner in the dome.

Video credits: Sébastien Roy and Josh Knapp

An ode to the diversity of life forms on earth, Créatures explores various semi-abstract ecosystems, from the microscopic to the macroscopic, made of algorithmic landscapes, plants and creatures whose combined simple behaviours give rise to an emergent complexity.

Créatures is an immersive audio-visual experience with real-time graphics and spatial sound in which the visuals and the sound interact seamlessly, mutually driving each other. It was created with the support of SAT Montréal and premiered in the Satosphere for a week in May 2018, with an extra performance during Symposium IX.

Derivative: Can you tell us how the project came to be?

Vincent Houzé: I have been fascinated by the strange life forms found in nature for a while and wanted to create a performance around that theme. I was not trying to recreate existing life forms, but to draw inspiration from nature, using the freedom of computer graphics to also deconstruct these invented creatures with interesting visual effects.

One of the goals was to have the creatures move organically with simple controls, to paint them with lively colors much as in nature, and to use more geometric black-and-white graphics only sparingly.

I had started exploring these ideas, and developing tools and systems for them, in the 2017 clip for Max Cooper's Seed, which was also made with TouchDesigner, though the final output was a rendered video. The Satosphere at SAT, a 360° x 210° dome in Montreal, seemed a logical next step with its super-immersive environment.

It was my first dome project, which is challenging for a real-time performance given the huge surface to render and the five cameras that "see" all around the viewpoint.

I reached out to Dave & Gabe for the spatial sound. I thought they would be intrigued by the great spatial sound capabilities of SAT and could share their experience with TouchDesigner and spatial sound interactions, and luckily for me, they were!

Dave & Gabe: To work with Vincent on a project of this scale at SAT was totally a dream. We've been working with spatial audio and multichannel music for a few years and love when we can link the movements of sounds directly to a corresponding visual. We've done this in simpler ways with LED light in the past, but being fully immersed in the organic world of Créatures was a rich and inspiring landscape to explore.

We had also never worked inside a spherical sound system like that one. The Satosphere comprises 157 loudspeakers that are "grouped" into 31 addressable channels of resolution. It was a real treat to work with a system that sounded so good! In our studio in Bushwick, we used our 40-channel system to prototype some of the spatial movements, but once we got to the dome and heard things in the space, the real work began.

D: Can you tell us a little more about how the look and motion of the various visual elements are handled?

VH: I used various custom systems for this project, some created specifically for it and some re-used and improved upon from past experiments:

In the first scene, the tiny cell-like creatures that grow, shrink and push each other are driven by a custom GLSL particle system controlled by a texture, in which each particle acts on its neighbours, creating a very cohesive movement.
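
The production system runs entirely in GLSL, but the core neighbour-interaction idea can be sketched in a few lines of Python/NumPy (everything here is illustrative, not the project's code): each particle pushes away any neighbour that intrudes within the sum of their radii, which is what makes the cluster move as one cohesive mass.

```python
import numpy as np

# Illustrative sketch; the real system runs as a GLSL particle shader.
N = 256
pos = np.random.rand(N, 2)                 # particle positions in the unit square
radius = 0.02 + 0.02 * np.random.rand(N)   # per-particle radius (grows and shrinks over time)

def step(pos, radius, stiffness=0.5):
    # Pairwise offsets and distances between all particles.
    delta = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(delta, axis=-1) + 1e-9
    # Two particles overlap when closer than the sum of their radii.
    overlap = np.maximum(radius[:, None] + radius[None, :] - dist, 0.0)
    np.fill_diagonal(overlap, 0.0)          # a particle does not push itself
    # Push each particle away from every neighbour it overlaps.
    push = (delta / dist[..., None]) * overlap[..., None]
    return pos + stiffness * push.sum(axis=1)

for _ in range(100):
    pos = step(pos, radius)
```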

For the underwater creatures, I developed a ragdoll system simulated with the Bullet Physics library in a C++ CHOP, building upon the Bullet CHOP plugin I shared a while ago. Using a couple of extra Python scripts, the system takes an FBX-rigged model and creates colliders with constraints to be physically simulated.
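
Houzé's version lives in a custom C++ CHOP, but the same recipe, one rigid body per bone joined by constraints, can be sketched with the pybullet bindings; the bone list below is a stand-in for data that would really be extracted from the FBX rig.

```python
import pybullet as p

# Sketch of a bone chain turned into a ragdoll; the project does this in a C++ CHOP.
p.connect(p.DIRECT)        # headless physics, no GUI
p.setGravity(0, -9.8, 0)

# Stand-in for bone data extracted from an FBX rig: (position, parent index).
bones = [((0, 2.0, 0), None), ((0, 1.5, 0), 0), ((0, 1.0, 0), 1)]

bodies = []
for pos, parent in bones:
    # One capsule collider per bone.
    shape = p.createCollisionShape(p.GEOM_CAPSULE, radius=0.1, height=0.4)
    body = p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=shape, basePosition=pos)
    bodies.append(body)
    if parent is not None:
        # Point-to-point constraint pinning each bone to its parent, ragdoll-style.
        p.createConstraint(bodies[parent], -1, body, -1, p.JOINT_POINT2POINT,
                           [0, 0, 0], [0, -0.25, 0], [0, 0.25, 0])

for _ in range(240):
    p.stepSimulation()
    # In TouchDesigner the resulting transforms would be read back as CHOP channels.
    states = [p.getBasePositionAndOrientation(b) for b in bodies]
```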

A flocking simulation made in GLSL drives the motion of the different creatures, allowing tighter control by moving only a few target attractors for the creatures to follow. The physical ragdoll animation is applied to each instance on top of that.
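
As a rough Python/NumPy sketch of that control scheme (the production version runs in GLSL), each agent blends a separation force away from crowded neighbours with a pull toward its nearest hand-animated attractor:

```python
import numpy as np

def flock_step(pos, vel, attractors, dt=0.016,
               sep_radius=0.5, sep_weight=1.0, attract_weight=0.6, max_speed=2.0):
    # Separation: steer away from neighbours closer than sep_radius.
    delta = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(delta, axis=-1) + 1e-9
    near = dist < sep_radius
    sep = (delta / dist[..., None] * near[..., None]).sum(axis=1)

    # Attraction: steer toward the nearest hand-animated target attractor.
    to_att = attractors[None, :, :] - pos[:, None, :]
    nearest = np.argmin(np.linalg.norm(to_att, axis=-1), axis=1)
    attract = to_att[np.arange(len(pos)), nearest]

    # Integrate and clamp speed so the creatures keep a plausible pace.
    vel = vel + dt * (sep_weight * sep + attract_weight * attract)
    speed = np.linalg.norm(vel, axis=-1, keepdims=True)
    vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
    return pos + dt * vel, vel
```

Dragging a handful of attractor points around then steers the whole flock at once, which is what makes this kind of setup manageable in live performance.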

For the long dynamic strands, used both as extra details on the underwater creatures and for the walking humanoids in the last scene, I'm using another custom C++ CHOP, this time to leverage the Nvidia Flex library. This could be done differently, for example purely in GLSL, but I was building on my experience with the Flex library, which I have also been using for liquid simulations, one outcome being the FlexCHOP plugin I shared last year.
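
Flex is built on position-based dynamics, so a toy strand can be sketched by repeatedly projecting distance constraints along a chain of points; this Python version only illustrates the kind of solve the custom C++ CHOP delegates to the library:

```python
import numpy as np

def strand_step(pts, rest_len, iterations=8, gravity=(0.0, -9.8, 0.0), dt=0.016):
    """One position-based-dynamics step for a strand pinned at its first point."""
    root = pts[0].copy()
    pts = pts + np.array(gravity) * (dt * dt)   # external force (velocity omitted for brevity)
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            # Project each segment back toward its rest length, moving both endpoints.
            d = pts[i + 1] - pts[i]
            dist = np.linalg.norm(d) + 1e-9
            corr = 0.5 * (dist - rest_len) * d / dist
            pts[i] += corr
            pts[i + 1] -= corr
        pts[0] = root   # keep the root attached to the creature's body
    return pts
```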

For these two cases I created GLSL shaders that take the CHOP channels from the simulation and apply the physical deformation to meshes in TouchDesigner. A lot of the time I'm using hardware instancing, where each instance gets different information from the CHOPs.
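
Conceptually, each instance reads its own slice of the simulation channels and uses it as a transform, much as an instanced vertex shader indexes per-instance data by gl_InstanceID; a NumPy stand-in for that lookup-and-transform step:

```python
import numpy as np

def place_instances(base_verts, chop_rows):
    """base_verts: (V, 3) mesh vertices; chop_rows: (N, 4) rows of [tx, ty, tz, scale]."""
    out = []
    for tx, ty, tz, s in chop_rows:
        # Every instance reads its own row of simulation data and deforms the base mesh,
        # just as the instanced vertex shader does on the GPU.
        out.append(base_verts * s + np.array([tx, ty, tz]))
    return np.stack(out)   # (N, V, 3): one transformed copy per instance
```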

Last but not least, I used the DistThreshold C++ CHOP I shared a few years ago for extra visual effects.

More custom GLSL shaders were used for the complex audio-driven hierarchical animations of the plants. Some testing of that on a simple scene is shown below.
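
One way to picture the hierarchical part, as a hypothetical Python sketch: an audio envelope bends the root joint, and each child joint accumulates its parent's rotation with a falloff, so the motion ripples from the base of the plant to its tip.

```python
def plant_joint_angles(num_joints, audio_level, sway=0.3, falloff=0.8):
    """Cumulative joint rotations for a plant stalk bent by an audio envelope."""
    angles, total = [], 0.0
    for j in range(num_joints):
        # Each joint adds a bend proportional to the audio level,
        # attenuated the further it sits from the root.
        total += sway * audio_level * (falloff ** j)
        angles.append(total)
    return angles   # root-to-tip rotations; children inherit their parents' bend
```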

I then developed a simple CHOP- and replicator-based culling system for the largest environment, so that only terrain and vegetation within a certain distance of the camera get rendered.
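
The gist of such a culling pass, sketched in Python with invented names: compare each terrain tile's position to the camera and only keep the ones within range.

```python
def cull_by_distance(tiles, cam_pos, max_dist=50.0):
    """tiles: iterable of (name, (x, y, z)); returns the names that should render."""
    visible = []
    for name, (x, y, z) in tiles:
        d2 = (x - cam_pos[0])**2 + (y - cam_pos[1])**2 + (z - cam_pos[2])**2
        if d2 < max_dist * max_dist:   # squared distance avoids a sqrt per tile
            visible.append(name)
    return visible
```

In a TouchDesigner network the result of such a test would typically toggle the render flag of each replicated component.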

Besides all the animation, which takes place in CHOPs and in the vertex part of the GLSL shaders, procedural Voronoi textures generated in GLSL were combined with simple color ramps in the fragment shaders to get an organic but cohesive look for all the creatures and plants. An environment map and PBR materials are used to get more detail in the lighting.
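
Voronoi (cellular) noise is essentially distance-to-nearest-feature-point remapped through a ramp; a minimal Python rendition of the GLSL idea, with a simple two-color ramp standing in for the project's palettes:

```python
import numpy as np

def voronoi_texture(size=256, num_points=24, seed=0):
    rng = np.random.default_rng(seed)
    feature_pts = rng.random((num_points, 2)) * size
    ys, xs = np.mgrid[0:size, 0:size]
    pix = np.stack([xs, ys], axis=-1).astype(float)
    # Cellular noise: distance from every pixel to its nearest feature point.
    d = np.linalg.norm(pix[:, :, None, :] - feature_pts[None, None, :, :], axis=-1)
    f1 = d.min(axis=-1)
    return f1 / f1.max()   # normalised to 0..1, ready to index a color ramp

def apply_ramp(tex, color_a, color_b):
    # Simple linear two-color ramp, standing in for the fragment-shader lookup.
    return tex[..., None] * np.array(color_b) + (1 - tex[..., None]) * np.array(color_a)
```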

For the live performance a lot of controls were mapped to a MIDI controller so I could make the creatures perform the way I wanted!
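
In TouchDesigner this kind of mapping is often a short callback in a CHOP Execute DAT watching a Midi In CHOP; a hypothetical example (the operator and channel names are invented):

```python
# Hypothetical CHOP Execute DAT callback watching a Midi In CHOP.
def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'ch1ctrl1':
        # Knob 1 drives the strength of the flocking attractors.
        op('attract_strength').par.value0 = val
    elif channel.name == 'ch1ctrl2':
        # Knob 2 drives the creatures' swim speed.
        op('swim_speed').par.value0 = val
    return
```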

D: Can you expand on the interaction between the sounds and the visuals and how you achieved the interplay?

VH: Dave, Gabe and I wanted this tight interplay between the two, so it's a two-way relationship. Some visual elements are animated based on audio inputs, while at the same time the creatures send their 3D positions relative to the camera over OSC to the spatialization system.

D&G: Like Vincent said, we wanted to tie the visuals and the sound together as much as possible. To accomplish this, we set up connections that went in both directions. For each creature, we sent its XYZ position in camera space to the SPAT system. This allowed the positions of the sounds in our spatialization software to match the positions of the characters as they moved around the dome. (see video below)
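
A minimal sketch of that bridge, assuming the python-osc package on the sending side and invented OSC addresses (the actual routing into SPAT was the artists' own setup):

```python
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

# Assumed host/port for the spatializer.
client = SimpleUDPClient('127.0.0.1', 9000)

def send_creature_positions(world_positions, cam_world_matrix):
    # Transform world positions into camera space so the sound placement
    # matches what the audience sees from the dome's viewpoint.
    cam_inv = np.linalg.inv(cam_world_matrix)
    for i, pos in enumerate(world_positions):
        x, y, z, _ = cam_inv @ np.append(pos, 1.0)
        client.send_message(f'/creature/{i}/xyz', [float(x), float(y), float(z)])
```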

In other cases, parts of the music could control or activate different animations. In the first scene, the pulsing of the cell-like creatures was driven by the amplitude of the sound at that position in the dome, which made for a very satisfying and immersive combination. Later, tubular creatures attached to the seafloor would open their mouths in reaction to horn swells and kick drums in the music. This data was sent via Max For Live objects using OSC.
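
On the receiving side, this kind of audio-reactive control typically reduces to an envelope follower that smooths the amplitude so the visuals don't jitter; a generic Python sketch, not the project's code:

```python
class EnvelopeFollower:
    """Smooths raw amplitude into a slowly varying control signal (generic sketch)."""
    def __init__(self, attack=0.3, release=0.05):
        self.attack, self.release, self.level = attack, release, 0.0

    def step(self, amplitude):
        # Rise quickly on hits (attack), decay slowly afterwards (release).
        coeff = self.attack if amplitude > self.level else self.release
        self.level += coeff * (amplitude - self.level)
        return self.level

# e.g. mouth_open = follower.step(kick_amplitude), scaled into a blend weight
```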

For the final scene with the walking humanoid giants, we pulled off something really special. Vincent was able to send us an OSC message each time one of the giants' feet hit the ground: left foot, right foot, etc. We then used this to trigger unique footstep sounds based on how tall the giant was: smaller giants would move quicker and have higher-pitched steps, bigger ones would move slower and have deeper step sounds.
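
Sketching the receiving side with python-osc (the addresses, pattern and pitch mapping are all illustrative):

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def trigger_sample(foot, rate):
    print(f'step: {foot} at rate {rate:.2f}')   # placeholder for real sample playback

def on_footstep(address, giant_height):
    # Taller giant -> slower playback rate -> deeper, heavier footstep.
    rate = 1.0 / max(giant_height, 0.1)
    foot = 'left' if address.endswith('/left') else 'right'
    trigger_sample(foot, rate)

dispatcher = Dispatcher()
dispatcher.map('/giant/*/step/*', on_footstep)   # e.g. /giant/3/step/left <height>
BlockingOSCUDPServer(('0.0.0.0', 9001), dispatcher).serve_forever()
```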

Then, in the spatialization software, we could make sounds quieter depending on their distance from the camera. It made for an incredible scene where you could hear these quiet steps in the distance coming from behind you, and when you turned around to look there would be a massive giant coming towards you! Then the giant would walk by and fade into the distance. The audience really enjoyed that.
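
The distance cue itself is typically a simple inverse-distance gain law; a hedged sketch of the kind of curve involved (parameters invented):

```python
def distance_gain(dist, ref=1.0, rolloff=1.0):
    # Inverse-distance attenuation: full level at the reference distance,
    # falling off smoothly as the source (the giant) recedes.
    return ref / (ref + rolloff * max(dist - ref, 0.0))
```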

A BIG thank-you to Vincent, Dave and Gabe for taking the time to talk to us!

vincenthouze.com | Facebook | Instagram

daveandgabe.care | Facebook | Instagram
