
Building a Remote Production Desk with TouchDesigner

In Isaac Gómez's La Ruta, we are told the heartbreaking story of a border town, a bus route, and the women of Ciudad Juárez. Inspired by real testimonies from women affected by the ongoing femicides along the border, La Ruta weaves together beautiful storytelling and music in a celebration of the resilience of Mexican women in the wake of tremendous loss. Read more about the stage play and its production here.

I was brought into the production of La Ruta after the pivot to a remote production was decided. The media designer’s role in this play, previously traditional media creation and stage projection, quickly grew into a massive, tangled job of systems engineering, media design, direction, and audio engineering. John Erickson, the media designer, brought me in as his assistant. In this role, I took over solving the technical aspects of the performance, and early on we decided that our remote cueing/compositing system would primarily live in TouchDesigner.

Our baseline goal for La Ruta was a live-streamed table read. You can imagine this as a Zoom stream of the actors rehearsing their parts from their individual homes, with less emphasis on a polished performance. As the production began to develop, the idea evolved into a Zoom/Skype call between all the actors with customizable window placements, dynamic media backgrounds, and programmable cues. From this point, we drew up a list of technical challenges: working with remote performers in multiple locations, compositing and arranging their streams in real time, having a familiar cueing system for John and our operator, and individual audio routing for all the actors. Through the planning process, John’s design choices influenced our technical needs and my technical research influenced the end design.

Our technical objective, as it became more focused, was to be able to ingest and manipulate nine different live feeds from actors in 2D space and to recall each unique layout based on cues sent from a separate computer running QLab. We also wanted the actor feeds to have soft, blurred edges, rather than sharp rectangular windows. Finally, the media designer (John) wanted to be able to composite the actor feeds on top of content sent from QLab, and have that final video and audio be streamed to Vimeo.
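To give a sense of how that cue recall can work on the TouchDesigner side, here is a minimal sketch of an OSC In DAT callback reacting to incoming cue messages. It assumes the QLab machine sends something like /cue <number> over the network; the address pattern, the base_show_control component, and its custom Cue parameter are hypothetical stand-ins rather than our exact show-control network.

```python
# Sketch of an OSC In DAT callbacks DAT receiving cue numbers.
# The '/cue' address and the show-control base are assumptions for illustration.

def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    if address == '/cue' and args:
        cueNumber = int(args[0])
        # Hand the cue off to the rest of the network, e.g. a base that
        # recalls the stored layout for this cue (hypothetical custom parameter).
        op('/project1/base_show_control').par.Cue = cueNumber
    return
```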

With this in mind, here’s a sneak peek of what we would end up designing:


OBS also receives the audio from the audio desk. A soundboard operator controls OBS with an Akai APC MKII. Video and audio are routed to TouchDesigner on the same machine via NDI.
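As a rough illustration of that NDI hand-off, this is how the OBS program feed might be pointed at from a TouchDesigner script; the operator path and the NDI source name below are examples only and depend on how the OBS output is advertised on the network.

```python
# Sketch: pull the OBS program output into TouchDesigner over NDI.
ndi = op('/project1/ndiin_actors')         # an NDI In TOP (assumed path)
ndi.par.name = 'OBS-MACHINE (OBS Output)'  # example NDI source name from OBS
```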


Again, this is a bit dense, so I’ll break it down by section. On the left we have all of the controls that have to do with the actor input feeds. The actor feeds come in from OBS through the NDI In TOP. We found it useful to have a test feed for running tech in case the actors were not on camera. The cache is there to sync audio and video. The video player pulled pre-recorded cuts of the actors out of a folder for certain parts of the performance. Switching between these sources was done programmatically based on cue.
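To illustrate how that source switching might be scripted, here is a small sketch assuming a Switch TOP whose inputs are the live NDI feed (0), the test feed (1), and the video player (2); the cue numbers, operator names, and TEST_MODE flag are invented for the example.

```python
# Sketch of cue-driven source switching between live, test, and pre-recorded feeds.
PRERECORDED_CUES = {34, 35, 36}   # example cues that play back recorded actor cuts
TEST_MODE = False                 # flipped on when actors are not on camera

def setSourceForCue(cueNumber):
    sw = op('switch_source')                # assumed Switch TOP name
    if TEST_MODE:
        sw.par.index = 1                    # show the test feed
    elif cueNumber in PRERECORDED_CUES:
        sw.par.index = 2                    # show the video player
        # point the Movie File In TOP at the cut for this cue (example path)
        op('moviefilein_cuts').par.file = f'cuts/cue_{cueNumber}.mov'
    else:
        sw.par.index = 0                    # show the live NDI feed
```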

The actor feed is then sent to the two cue decks (Deck A and Deck B), which are programmatically loaded and called based on cue. These bases are clones of base_fx_Storage.
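A rough sketch of the A/B deck pattern is below. It assumes a Cross TOP mixing Deck A and Deck B and a LoadCue() method on the deck bases that recalls the stored layout; those names are placeholders for illustration, not the exact network we built.

```python
# Sketch: load the next cue into the offline deck, then fade to it.
cross = op('cross_decks')   # assumed Cross TOP between Deck A (input 0) and Deck B (input 1)

def goToCue(cueNumber):
    liveIsA = cross.par.cross.eval() < 0.5
    offline = op('deck_B') if liveIsA else op('deck_A')
    # Both decks are clones of base_fx_Storage; a hypothetical extension
    # method recalls the stored crop/position/effect values for this cue.
    offline.LoadCue(cueNumber)
    # Fade to the freshly loaded deck.
    cross.par.cross = 1 if liveIsA else 0
```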

Let’s step into base_fx_Storage. In here, the composite feed is cropped into a 3×3 grid and each feed is sent to an effects base. The feeds are then composited back together after the effects are applied.
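That 3×3 split can be expressed with nine Crop TOPs whose crop fractions carve out one cell each. The sketch below assumes operators named crop1 through crop9 and the Crop TOP's default fraction units.

```python
# Sketch: split a 3x3 composite of actor feeds back into nine individual feeds.
for i in range(9):
    col, row = i % 3, i // 3            # row 0 = top row of the grid
    crop = op(f'crop{i + 1}')           # assumed Crop TOP names
    crop.par.cropleft   = col / 3
    crop.par.cropright  = (col + 1) / 3
    # Crop fractions are measured from the bottom of the image.
    crop.par.cropbottom = (2 - row) / 3
    crop.par.croptop    = (3 - row) / 3
```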

In each effects base, we can control each actor feed’s crop, soft edges, monochrome, level effects, size, and placement on a 16:9 screen.
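As an example of what driving one of those effects bases could look like, here is a sketch that sets size, placement, monochrome, and level on assumed operators (transform1, mono1, level1) inside the base; the values dictionary stands in for whatever a given cue stores.

```python
# Sketch: apply one actor's stored look to an effects base.
def applyActorLook(fxBase, vals):
    # Size and placement on the 16:9 canvas (Transform TOP).
    xform = fxBase.op('transform1')
    xform.par.sx = xform.par.sy = vals['scale']
    xform.par.tx, xform.par.ty = vals['position']
    # Toggle the monochrome treatment by bypassing the Monochrome TOP.
    fxBase.op('mono1').bypass = not vals['mono']
    # Overall level/opacity via a Level TOP.
    fxBase.op('level1').par.opacity = vals['opacity']

# Example usage with invented values and paths.
applyActorLook(op('deck_A/fx_actor1'), {
    'scale': 0.33, 'position': (-0.25, 0.1), 'mono': False, 'opacity': 1.0,
})
```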
