MOTHER was even put to uses that were not originally intended. This is what happens when TouchDesigner is used on-set, on-stage or (virtually) in-orbit. Boris Morris Bagattini (aka Chris Wilson) lays it out here.
"MOTHER" (interface below) was the remarkable (and very beautiful) TouchDesigner application designed for the task by Wilson. The application is based on replicated TouchDesigner Components, each having its own interface with all relevant settings for that screen. Further to the director's brief, the system feeding each screen needed to be able to:
- Composite and control up to 8 layers of screen graphics videos, HUD (heads-up display) overlays and live camera feeds.
- Apply and control several effects to different layers on each screen and adjust to the director's requests.
- Drive precisely-cued playback from producer calls through to the end of the shoot.
- Quickly wrap around screens as per the director's instruction.
- Quickly load and play newly requested content as it was provided.
- Enable the team to rapidly develop director-requested features, and distribute back to the screens.
- Save out entire scenes so they could be recalled and used on request.
- Color and level-adjust all or individual screens independently.
Derivative: Chris, the design brief was daunting and the functionality required was extensive! How did you go about building this system?
Chris Wilson: I received a call from Martin Crouch of PXL, a motion design studio based in Sydney. He had been engaged to lead the Screen Graphics Department and was looking at options to fulfil the software and hardware brief for Alien Covenant. Ridley Scott was unhappy with the level of control he was getting from existing systems he had used on previous productions, and had a long wish list of functionality that wasn't currently offered. Martin had experimented with TouchDesigner on a few smaller productions and had seen what was possible. Knowing the number of feeds required and the level of control needed, it was Martin's idea that TouchDesigner was going to be the best platform to handle all of the on-set playback requirements.
The first obstacle was how to structure the entire system. I began by thinking of each screen as a stack of layers, each layer being a piece of footage, a live video feed or a custom .tox component, with the UI simply a means to make choices and provide feedback; ultimately all processing would be handled by the slave nodes out on the set. I built an initial sketch in TouchDesigner and then started to focus on the daunting hardware that would be involved.
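To make that layer-stack idea concrete, here is a minimal, purely illustrative Python sketch. None of these class or field names come from MOTHER; they simply model a screen as a bottom-to-top stack of layers, each layer being footage, a live feed or a .tox component, which the UI would edit while the slaves do the actual compositing.

```python
# Illustrative sketch only -- MOTHER's real data model is not published.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    kind: str          # 'footage' | 'live' | 'tox'  (hypothetical labels)
    source: str        # file path, capture-card input, or .tox path
    opacity: float = 1.0
    effects: List[str] = field(default_factory=list)   # effect .tox names

@dataclass
class Screen:
    name: str
    layers: List[Layer] = field(default_factory=list)  # bottom-to-top stack

# The UI would only edit structures like these; the slave instances
# out on set would do the compositing and processing.
screen = Screen('lifter_console_01', [
    Layer('footage', 'content/hud_base.mov'),
    Layer('live', 'blackmagic_input_1', opacity=0.8, effects=['glitch.tox']),
])
```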
Martin Crouch had already amassed a collection of bare LCD screens of various sizes and types, and was working with a Japanese partner on sourcing driver boards. These would ultimately be assembled by Martin and also Steven Paul, our on-set hardware technician.
We initially had a lot of problems with EM (electro-magnetic) interference due to the sea of radio and electrical noise that flows through a set during shooting, and it was a battle that continued and was only finally won later in the filming schedule. I am sure a film set is the most hostile environment on earth for integrating sensitive electronics!
While I was battling with hardware and signal paths, I handed my initial sketch over to the excellent Peter Walker, who began to write a class-based Python framework to unify all control information into a single base component that could be called from anywhere in the network. This was an elegant and efficient structure that worked well, though not without some stressful debugging on our first set. Ultimately, from what we learned, I decided to rewrite MOTHER from the ground up to maximise its stability and flexibility, and this is the version that went on to finish Alien and then moved on to the other films, Pacific Rim Uprising and Aquaman.
The current version of MOTHER is based on replicated Components, each having its own interface with all relevant settings for that screen. The number of screens is set in the master configuration, which automatically addresses the slaves based on the remote computers' physical video outputs and the corresponding TouchDesigner instance running under GPU affinity. Monitors are accessible in groups of eight in the interface, and settings can be linked across monitors and cues for arbitrary flexibility.
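As a rough illustration of how replicated per-screen Components can be addressed in TouchDesigner, here is a minimal Replicator COMP callback. It is a sketch under stated assumptions, not MOTHER's actual code: the custom parameter names Screenindex and Udpport, and the base port, are hypothetical.

```python
# Replicator COMP callbacks DAT -- minimal sketch, not MOTHER's actual code.
# Assumes each replicated per-screen component exposes hypothetical custom
# parameters named Screenindex and Udpport.

BASE_PORT = 10000  # hypothetical base UDP port

def onReplicate(comp, allOps, newOps, template, master):
    # Give every new replicant a screen index and a unique UDP port,
    # derived from the trailing digits of its name (screen1, screen2, ...).
    for c in newOps:
        idx = c.digits
        c.par.Screenindex = idx
        c.par.Udpport = BASE_PORT + idx
    return
```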
Ultimately all interface settings are condensed into a single Touch Out DAT that is received by the slaves on an assigned UDP port. The Touch Out DAT dynamically grows and shrinks to carry additional information such as layer types and specific .tox settings that may be loaded on the fly for that screen. (A "tox" is a TouchDesigner component file that gets loaded on demand into a TouchDesigner session.)
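A minimal sketch of what condensing the UI state into one table might look like, reusing the illustrative Screen/Layer structures above. The operator name 'control_table' and the column labels are hypothetical; the point is simply that the table wired into the Touch Out DAT gains or loses rows as layers and settings change.

```python
# Minimal sketch -- not MOTHER's actual naming or schema.
def pushSettings(screens):
    table = op('control_table')   # Table DAT wired into a Touch Out DAT
    table.clear()
    table.appendRow(['screen', 'layer', 'type', 'source', 'opacity'])
    for s in screens:
        for i, layer in enumerate(s.layers):
            # The table simply grows or shrinks with the layer stack
            # and any tox-specific settings for that screen.
            table.appendRow([s.name, i, layer.kind, layer.source, layer.opacity])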
All footage, layer types, effects and tox extensions are automatically copied to the slave computers in the background throughout the network of servers. This ensures any imagery or generative elements are local on the slave computers for minimum latency and direct processing, regardless of the load on the master. All elements across the system are synced through a master clock.
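The article doesn't describe how that background copy is implemented; as one plausible sketch, a slave could mirror the master's asset folder to local disk with nothing more than the Python standard library, so playback always reads local files. The paths and function name below are hypothetical.

```python
# Hypothetical background asset mirror -- one possible approach, not MOTHER's.
import os
import shutil

def mirror_assets(src_dir: str, dest_dir: str) -> None:
    """Copy new or updated footage / .tox files from the master's asset
    folder to a slave's local folder so playback reads from local disk."""
    for root, _dirs, files in os.walk(src_dir):
        rel = os.path.relpath(root, src_dir)
        target = os.path.join(dest_dir, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target, name)
            # Copy only if the local file is missing or older than the master's.
            if not os.path.exists(d) or os.path.getmtime(d) < os.path.getmtime(s):
                shutil.copy2(s, d)
```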
D: Can you give us a little background on your work history and how you came to adopt TouchDesigner and to have the kind of trust in the software that led you to use it in such a critical scenario on Alien Covenant?
CW: I come from a visual effects and motion graphics background and started to move into live events. I originally began using Resolume as my main platform but then discovered TouchDesigner 077, and it was a revelation. The unique interface and visual feedback were addictive and played into the established node-based creative processes I was familiar with in VFX.
I have been using TouchDesigner since 077 for large-scale theatre, projection mapping and interactive artworks. I'm a bona fide Touch nut. I have toured TouchDesigner-based dance projects in Australia, the UK and Canada.
Some of these projects have been very complex and I have always worked in theatre productions with real-time generative graphics and lighting control. I like to keep everything live, being able to react and perform with the on-stage performers so each show is slightly different. To me it makes the whole experience both as an audience member and as an artist more engaging.
Theatre is even more critical than film: if something goes wrong you can't go for another take. After doing many shows using TouchDesigner with some crazy set-ups, and never having a problem with the stability of the platform or hitting a creative challenge TouchDesigner couldn't help solve, I had a lot of confidence going in. I was totally terrified as well, though.
D: What are some of the functions you designed that makes MOTHER distinctive and original?
CW: MOTHER works as a distributed media server and real-time graphics generator tailored to on-set screen graphics.
A unique aspect is that it runs robustly on a variety of standard hardware, from multi-Quadro workstations to laptops and Microsoft Surface Pros. The latest version integrates with cheap and small single-board computers, providing flexible hardware installation options. It can mix multiple screen sizes from multiple hardware vendors. Access through a unified EDID process results in no tearing or camera strobing artifacts at any frame rate the director wants to shoot at.
It is totally customizable through the tox system and scene/script specific animations.
Procedural or pre-rendered scenes can be built on the fly even during shooting and then loaded transparently without affecting the on-set imagery, even during a take. Master and slave are protected as all updates and control are on-demand, and procedural animations are locally processed.
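As a small TouchDesigner-flavoured sketch of loading a scene component on demand, a layer's External .tox parameter can be repointed and its network re-initialised. This is only an assumed illustration, not MOTHER's code: the operator name 'layer3' and the file path are hypothetical, and in practice the new component would be faded in only once it has cooked so the on-set image never drops.

```python
# Minimal sketch of swapping a layer's .tox on demand -- not MOTHER's code.
def load_layer_tox(layer_name: str, tox_path: str):
    comp = op(layer_name)
    comp.par.externaltox = tox_path   # point the component at the new .tox file
    comp.par.reinitnet.pulse()        # re-initialise its network from that file

load_layer_tox('layer3', 'toxes/radar_sweep.tox')   # hypothetical names
```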
D: Can you explain why TouchDesigner was an ideal choice for creating the platform?
CW: TouchDesigner was ideal in that it enabled:
- the development of a unified media server platform that could be controlled over a network, with each media server running 3 instances of the software, each instance outputting up to 4 screens, each with 8 layers and multiple effects.
- deployment of the same centrally controlled system on floating laptops and Surface Pros that could be moved to any location where floating screens might be requested.
- tailoring of effects, playback control and generative content, deploying them instantly across the network.
- creation of unique layer types that might within themselves composite, apply effects and generate content.
- customization of a UI and system to reflect the unique needs of the project.
- creation of our own routines to quickly identify the production team's name for a screen and locate it in the server network.
- on-the-fly customization of the resolution and orientation of screens.
- all configurations being centralized back to a master cue player so that a complex scene could be controlled with simple UI interactions.
- multiple instances being deployed in a single machine to keep hardware costs minimized while maintaining performance.
- efficient routing of multiple live video feeds to all instances on the network.
D: What were some of the incentives for creating MOTHER from scratch when there are existing software solutions you could have deployed?
CW: We decided to deploy a TouchDesigner system that used readily available hardware such as gaming motherboards and Quadro cards rather than a turnkey solution with overpriced licensing that was not tailored to our needs and could not be arbitrarily customized for both hardware and features.
The need to prototype and deploy fast turnaround requests was the ultimate reason. And TouchDesigner's procedural nature and high efficiency meant we could balance time constraints and loads on the graphics department and enact procedurally generated layers that mixed seamlessly with pre-generated content.
There was also a proportion of directly interactive elements that ran on Surface Pros, plus the need to pipe GoPro and Alexa Mini footage directly into the system through Blackmagic capture cards on the fly, along with multi-layer overlays and color correction. No other software solution provided such flexibility.
Whatever crazy requests came through from the art director or director we were confident we would be able to find a solution in TouchDesigner in a short amount of time!
D: Because we are curious, can you give us an example of a "crazy request" from the director?!
CW: I remember one incident on the Lifter set. This was a 30-ton spaceship set constructed in an old storm-water reservoir. The whole set was on a massive hydraulic gimbal that allowed it to move freely and violently in any direction. We had rigged the TouchDesigner servers underneath the ship, with umbilical cords and a bungee system that ran up through the set's centre of mass to reduce strain on the cabling.
We explored the idea of placing the servers on the ship but the size and violence of the hydraulics made it impossible. We had rigged up a gyroscope to feed into TouchDesigner to give us relative positional and rotational data to drive some of the instrumentation on the ship. It was all working well with me set up at control behind a massive wall of shipping containers that served as a giant blue screen.
We had fulfilled the brief for the set, but Ridley was not happy that the actors had no way of knowing what they were supposed to be reacting to, so suddenly we had the camera department, visual effects and the GoPro team asking us to coordinate a mocked-up pre-visualisation of the scene in real-time across the monitors in the ship.
We had Alexa footage coming in from video assist of a big stunt that had just been shot, in which the ship crashes into a statue and Katherine Waterston is thrown over the side. We also had live video feeds of the ship exterior and all this pre-vis footage from VFX of the alien and the ship in this big showdown.
The monitors in the ship went from screen graphics to a kind of live switching studio, where we would cue segments and sources across various screens, with Ridley calling edits during the take so the actors understood what was going on in the scene and had something to react to. It was great that the system could make it all happen in between takes, and for a completely different purpose than it was intended for. I think it improved the timing and engagement of the actors, which is why Ridley tries to build as much as possible practically rather than in post during his films; he's not a fan of people acting blindly in front of a green screen.
D: Can you discuss some of the critical factors that make deploying a system like MOTHER on a major motion picture valuable and cost-effective?
CW: I think the scale and the broad mix of hardware used on a film set make MOTHER unique. We had screens the size of iPhones through to 70-inch 4K monitors, all running in sync off the same unified system. These were spread through complex sets with 70+ monitors that integrated thousands of practical LED lights and SFX such as squibs, sparks and fire, and, in the case of the Covenant and many other sets, were suspended on a multi-ton hydraulic system that made the whole set move.
The pace and cost of shooting also makes robustness critical. With each take costing around 12,000 USD you don't want the system failing mid-take, eliminating a significant portion of the visual impact of the set, disrupting the actors and director and requiring costly delays or screen replacement in VFX.
Flexibility and rapidity of deployment is also the hallmark of the system. On the film Pacific Rim Uprising we had a 72 monitor set to install from scratch in 2 days. Most of this time is purely the installation of the screen hardware, with only a few hours to deploy the software and get everything up and running on all screens with the correct imagery, effects, layering and level controls.
Sometimes a director may have a complex new visual idea once he has seen the set, and this may need to go from brainstorming a solution to a final cue-able scene within an hour. While shooting scenes with a single operator, rapid changes to imagery and content over large numbers of screens is where the system shines.
D: What advantages do you see in using real-time graphics on film sets vs. pre-rendered VFX or effects added in post-production?
CW: For the purposes of screen graphics, real-time, in-camera is superior to VFX. It simplifies post-production enormously, reducing the need to track and rotoscope large portions of a film set and burn in imagery, especially with high depth of field and lots of actors occluding portions of the screens.
It also provides realistic ambient light from the screens which is always a concern for the director of photography who needs to balance very subtle light levels, and with many scenes in modern movies driven by screen-based plot points where actors are looking at a screen and reacting to its content, it helps enormously for that to be a real thing happening in front of them rather than a green square.
I think other areas of VFX such as set extension, realistic creatures and destructive elements are still a while off, but there are great advantages in utilizing real-time techniques on set to pre-visualise post-produced environments or characters, allowing the director and the actors to better understand the virtual aspects of the scene.
D: What is your prediction in terms of real-time effects becoming more common in the film industry? What do you see as 'the path'?
CW: I think technically we are not there yet, despite the enormous advances being made in game engine development. We will come ever closer technically; however, I still see films using non-real-time techniques as primary for a long time to come.
I do think that more and more cameras will integrate hardware and software solutions that move many aspects of the filming process that were traditionally post-based to being in-camera and real-time, eliminating the need for green screen, rotoscoping, lightfield capture and tracking. All the information needed will be captured by the camera itself, which will speed up both shooting and post-production enormously.
Artist Profile
Boris Morris Bagattini
(aka Chris Wilson) is a founder and director of SomaCG, a film, motion graphics and visual effects company. He has studied Design at UNSW, Digital Cinematography at AFTRS and Advanced Character Animation with Disney Feature Animator Murray Debus. He has directed and led visual effects teams on a multitude of film and broadcast projects. Since 2011 he has been working primarily in large and small scale theatre, projection mapping, event video, live television and interactive artworks.
His films have been shown at Sundance, the Toronto Film Festival and the Sydney Film Festival. He has collaborated with Stalker, Legs on the Wall, Strings Attached, DequincyCo, Synergy Percussion, Victoria Hunt, The Chaser and SCO, and has had major work commissioned by Sydney Festival, Vivid Festival, Nike and Apple. In 2016-2017 he was engaged as Screen Graphics and In-Camera Interactives Programmer for Ridley Scott's Alien Covenant, Pacific Rim Uprising and the DC Comics feature Aquaman.
Massive thanks to Chris Wilson for taking the time to talk to us, and congratulations on this great achievement. We look forward to your next endeavors!
All images unless otherwise noted copyright 2017 Twentieth Century Fox Film Corporation.
For another case study of how TouchDesigner has been used on the set of a major motion picture, please see Gravity: TouchDesigner Works in Zero-G.