
Theatre in the Time of Isolation: Building a Remote Production Desk with TouchDesigner

In the spring of 2020, Associate Professor of Practice Sven Ortel organized an introductory TouchDesigner workshop for the summer at The University of Texas at Austin. The pandemic and lockdown that ensued forced him to rethink what knowledge and skills would prove most relevant moving forward. Engaging Matthew Ragan, Zoe Sandoval and Ian Shelansky of SudoMagic, they settled on a focus on virtual production. The students were tasked to "imagine the screen as a malleable canvas," Ortel explains, "that could be manipulated and laid out to support the production they were assigned to, either as media designers or media design assistants."

"my intention was to turn the challenge of socially distanced performance into a design and innovation opportunity"
— Sven Ortel

Ortel frontloaded the semester’s teaching with a TouchDesigner intensive open to students in his studio class and the Texas Immersive program, as well as to interested faculty and staff. The workshop was a big success, both as an introduction to TouchDesigner's capabilities and distinct features and as a springboard for those who wanted to use the software for virtual production and specific projects, as media designer John Erickson and assistant media designer Ben Randall did in building a remote production desk for the production of La Ruta. We spoke to John Erickson, who was kind enough to talk to us about his design and production criteria, and to Ben Randall, who provided a detailed account of his workflow in building out a system to meet the needs of their remote production.

In Isaac Gómez's La Ruta, we are told the heartbreaking story of a border town, a bus route and the women of Ciudad Juárez. Inspired by real testimonies from women affected by the ongoing femicides along the border, La Ruta weaves together beautiful storytelling and music in a celebration of the resilience of Mexican women in the wake of tremendous loss. Read more about this stage play and its production here.

Derivative: John, as media designer for this production, can you describe your goals in adapting La Ruta to a virtual production, as well as the challenges you faced?

John Erickson: La Ruta was never meant to be a virtual production and had a nearly finalized stage design in March 2020. Director Anna Skidis Vargas and I began conceiving of the virtual version of the show in July 2020. Our top priorities were:

  • The production had to be as live as possible.
  • It could not look like a Zoom call.
  • The actors needed to be able to communicate with as little lag as possible.
  • We needed to be able to adjust and discover as quickly as we would in a regular tech process.
  • It had to have sound design, as well as the ability to adjust the volume of each actor.
  • It had to be a comfortable environment for the actors.

Those were the main edicts that led us to a lot of our early design choices, such as uniform backgrounds for each actor (created by our set designer Tara Houston), the adjustable feathered edges on each of the video feeds, the need for a system that could give us individual video AND audio (via OBS.Ninja), and the looping videos serving as backgrounds.

"How do you make a theatre piece when none of the actors can be in the same room?"
— John Erickson

My personal goal when beginning to build out the system was to create a show that could be run and designed in a way that felt like a typical live production. Thus the aim was to break out the system and the local video feeds so that each designer could work simultaneously while seeing and hearing the final product in real time. This also allowed us to have multiple board ops taking cues from our stage manager Skyler Taten, further mimicking the traditional theatre process. 

As a design challenge this was an amazing experience. It's honestly rare that you are given the opportunity to create something truly original, and I'm very grateful for it. I also want to give all the credit to the rest of the designers, who also found themselves in completely uncharted territory: special thanks to lighting designer Emily Novak (who joined the team roughly a week before tech after the original lighting designer dropped out); to director Anna Skidis Vargas, who was incredibly adaptable to everything we threw at her while also designing choreography and actor movement that enhanced each video layout; and obviously to Ben Randall, without whom the show would not have been possible.

Derivative: Ben can you tell us how you came to be involved in this production and also a bit about your studies, interests, the tools you use?

Benjamin Randall: I am studying integrated media design for live events. Previously, I worked as a software developer and freelance VJ/DJ. I do not come from a theatre background, but I find TouchDesigner very useful for live video applications such as VJing. I was brought into the production of La Ruta after the pivot to a remote production was decided. The media designer’s role in this play, which previously involved traditional media creation and stage projection, quickly grew into a massive, tangled job of systems engineering, media design, direction, and audio engineering. John Erickson, the media designer, brought me in as his assistant. In this role, I took over solving the technical aspects of the performance, and early on we decided that our remote cueing/compositing system would primarily live in TouchDesigner.

Derivative: What was the design brief? What sorts of things did you need to be able to do?

Benjamin Randall: Our baseline goal for La Ruta was a live-streamed table read. You can imagine this as a Zoom stream of the actors rehearsing their parts from their individual homes, with less emphasis on a polished performance. As the production began to develop, the idea evolved into a Zoom/Skype call between all the actors with customizable window placements, dynamic media backgrounds, and programmable cues. From this point, we drew up a list of technical challenges: working with remote performers in multiple locations, compositing and arranging their streams in real time, having a familiar cueing system for John and our operator, and individual audio routing for all the actors. Through the planning process, John’s design choices influenced our technical needs and my technical research influenced the end design.

Derivative: Please take us through the process by which you achieved these goals.

Benjamin Randall: Our technical objective, as it became more focused, was to be able to ingest and manipulate nine different live feeds from actors in 2D space and to recall each unique layout based on cues sent from a separate computer running QLab. We also wanted the actor feeds to have soft, blurred edges, rather than sharp rectangular windows. Finally, the media designer (John) wanted to be able to composite the actor feeds on top of content sent from QLab, and have that final video and audio be streamed to Vimeo.

Here’s a sneak peek of what we would end up designing, with this in mind:

So, with the goal outlined, let’s take a look at the final system diagram:

This is a little dense, so here’s the software system flow. The actor feeds are individually sent to the main PC via OBS.Ninja.

The OBS.Ninja browser sources are arranged in a 3×3 grid like so.

OBS also receives the audio from the audio desk. A soundboard operator controls OBS with an Akai APC MKII. Video and audio are routed to TouchDesigner on the same machine via NDI.

The entire TouchDesigner network looks like this:

Again, this is a bit dense, so I’ll break it down by section. On the left we have all of the controls that have to do with the actor input feeds. The actor feeds come in through the NDI In TOP from OBS. We found it useful to have a test feed to run tech in case the actors were not on camera. The cache is there to sync audio and video. The video player pulled pre-recorded cuts of the actors out of a folder for certain parts of the performance. The switching between these was done programmatically based on cue.
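As a rough sketch of how that per-cue switching might look in Python, a Switch TOP can sit after the live, test, and playback sources and have its index set by the cue logic. The operator names here (switch_inputs, moviefilein_playback) are placeholders for illustration, not the exact names from the show file.

# Sketch: choose the actor source per cue by driving a Switch TOP.
# Operator names are placeholders, not the ones in the real network.
SOURCES = {'live': 0, 'test': 1, 'playback': 2}

def set_actor_source(source_name, movie_path=None):
    """Switch between the live NDI feed, the test feed, and pre-recorded cuts."""
    op('switch_inputs').par.index = SOURCES[source_name]
    if source_name == 'playback' and movie_path is not None:
        # point the Movie File In TOP at the pre-recorded cut for this cue
        op('moviefilein_playback').par.file = movie_path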

The actor feed is then sent to the two cue decks (Deck A and Deck B), which are programmatically loaded and called based on cue. These bases are clones of base_fx_Storage.

Let’s step into base_fx_Storage. In here, the composite feed is cropped into its 3×3 grid of cells and each feed is sent to an effects base. The feeds are then composited back together after effects are applied.
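For reference, isolating each cell of the 3×3 composite can be done with one Crop TOP per feed, with the crop edges expressed as fractions of the frame. A minimal sketch, assuming Crop TOPs named crop_0 through crop_8 (placeholder names):

# Sketch: isolate each cell of the 3x3 actor grid with its own Crop TOP.
# Crop edges are fractions of the full frame (0..1), origin at bottom-left.
def set_grid_crops():
    for i in range(9):
        row, col = divmod(i, 3)          # row 0 = top row of the grid
        crop = op('crop_{}'.format(i))   # placeholder operator name
        crop.par.cropleft = col / 3.0
        crop.par.cropright = (col + 1) / 3.0
        crop.par.croptop = 1.0 - row / 3.0
        crop.par.cropbottom = 1.0 - (row + 1) / 3.0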

In each effects base, we can control each actor feed’s crop, soft edges, monochrome, level effects, size, and placement on a 16:9 screen.
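Those per-feed controls map naturally onto custom parameters on each effects base. A hedged sketch of how such a parameter page could be declared; the parameter names below are illustrative, not the exact ones from the production.

# Sketch: declare illustrative custom parameters on one actor-feed effects base.
def build_actor_pars(fx):
    page = fx.appendCustomPage('Actor Feed')
    page.appendXY('Pos', label='Position')        # placement on the 16:9 canvas
    page.appendFloat('Scale', label='Scale')
    page.appendFloat('Softedge', label='Edge Softness')
    page.appendFloat('Mono', label='Monochrome')
    page.appendFloat('Level', label='Level')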

So how are these settings stored and recalled? This system takes advantage of the internal storage of a TouchDesigner component. With a bit of Python, we are able to store and recall all of the custom parameters of each actor feed. This part of the project was directly sourced from two of Matthew Ragan’s wonderful tutorials: Case Study | Custom Parameters and Cues, and Presets and Cue Building.
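Following the pattern in those tutorials, a stored cue is essentially a Python dictionary of custom parameter values written into the component's storage. A minimal sketch of the storing side, assuming one effects base per actor feed inside base_fx_Storage (the function name save_preset is an assumption):

# Sketch: snapshot every actor feed's custom parameters into COMP storage.
def save_preset(preset_name):
    storage_base = op('base_fx_Storage')
    preset = {}
    for fx in storage_base.children:            # one effects base per actor feed
        if not fx.customPars:
            continue
        preset[fx.name] = {p.name: p.eval() for p in fx.customPars}
    storage_base.store(preset_name, preset)     # plain dict in internal storage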

Now for the actual cueing system. The QLab MacBook controls all our cueing. On each cue, QLab sends an OSC string over the network to the TouchDesigner computer. The media designer and I decided on a uniform cue format: "/Q/" + cue number + "/" + transition time. A sample OSC string looks like /Q/10/2.

In TouchDesigner, we would interpret this string with the code below. This code splits up the message into the relevant parts. A message such as /Q/10/2 is interpreted like this: msg_source is the source (“Q”), preset_name is the cue number (“10”), and trans_time is the transition time that the media designer would designate, in seconds (“2”).
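A minimal sketch of that parsing step, written as an OSC In DAT callback; the cue-handling component (base_cues) and its LoadCue method are assumptions for illustration.

# Sketch: parse the incoming QLab cue address in an OSC In DAT callback.
def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    parts = address.strip('/').split('/')    # '/Q/10/2' -> ['Q', '10', '2']
    msg_source = parts[0]                     # 'Q'
    preset_name = parts[1]                    # '10', the cue number
    trans_time = float(parts[2])              # 2.0, transition time in seconds
    if msg_source == 'Q':
        # hand the cue off to the deck-loading logic (hypothetical component)
        op('base_cues').LoadCue(preset_name, trans_time)
    return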

Next, we want to load the next cue and change decks based on which deck we’re currently in. 
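One way to express that A/B logic, assuming a Cross TOP blends the two decks and a Timer CHOP drives the fade (operator names are again placeholders); load_preset is sketched after the next paragraph.

# Sketch: load the new cue into the hidden deck, then crossfade to it.
def advance_cue(preset_name, trans_time):
    cross = op('cross_decks')                    # Cross TOP mixing Deck A / Deck B
    showing_a = cross.par.cross.eval() < 0.5     # Deck A currently on screen
    target = op('deckB') if showing_a else op('deckA')
    load_preset(preset_name, target)             # see load_preset sketch below
    op('timer_fade').par.length = trans_time     # fade duration in seconds
    op('timer_fade').par.start.pulse()           # kick off the crossfade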

In load_preset, we fetch the stored preset from the storage base’s internal memory, and load each preset into the target deck. Again, this part of the system pulls directly from Matthew Ragan’s tutorial about using custom parameters as cues here: Case Study | Custom Parameters and Cues
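A corresponding sketch of load_preset, following the same dictionary-per-cue pattern from that tutorial (exact operator and parameter names are assumptions):

# Sketch: fetch a stored cue from base_fx_Storage's internal storage and
# push its values onto the matching effects bases inside the target deck.
def load_preset(preset_name, target_deck):
    preset = op('base_fx_Storage').fetch(preset_name, None)
    if preset is None:
        return
    for fx_name, pars in preset.items():
        fx = target_deck.op(fx_name)
        for par_name, val in pars.items():
            setattr(fx.par, par_name, val)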

Once the new deck has been loaded and crossfaded appropriately, we can composite the media video feed from the QLab computer onto this actor feed setup. We have two feeds below: a live feed (comp2) and a tech feed (comp3). 

The media designer and I could adjust and compare cues and not worry about disrupting the final feed during tech rehearsals. Below, you can see the screen I had set up to see the final feed, the tech/storage feed, audio signal, a list to load cues into the storage feed, and a settings panel for adjusting the individual actor feeds.

Finally, a few custom cues had videos that played on top of the actor feeds, so another video player was added. The final products are sent to a director/stage manager monitor and a tech monitor, as well as the streaming computer via NDI. 

The director’s feed was set up as a TV a couple yards away from the TouchDesigner computer. Because of the latency added by each step of the process, our stage manager needed to call tech cues based on the final product, but also needed to call staging cues based on the actors' live feeds in OBS, a difference of about 5 to 15 seconds, depending on the internet that day. Below, the stage manager Skyler is viewing the final feed on the left and the OBS screen on the right.

Finally, on the streaming computer, we had an OBS window that accepted the TouchDesigner feed via NDI and streamed that feed to Vimeo. 

And, voila! We had a fully remote show.

Derivative: It's a fantastic achievement and case study in working within constraints to innovate, congratulations to you and John and the entire team! Has this experience led to any new ideas or plans for using TouchDesigner in your projects?

Benjamin Randall: This experience has certainly given me new ideas on how to use TouchDesigner for live cueing systems. A breakthrough moment came when I first learned about the internal storage attached to COMPs. The process of storing, transforming, and recalling Python dictionaries made the development of a custom cueing system simple to understand. As computer processors and graphics cards get more powerful, I see appeal in routing as much as I can virtually in one app, rather than relying on many pieces of disconnected hardware.

Derivative: We like to ask this question - is there anything you found to be missing in TouchDesigner, even small things that would make your work easier?

Benjamin Randall: In the same way some programs come with use case templates, TouchDesigner could benefit from some pre-built UI templates for a few base use cases.

Derivative: Can you briefly explain the process of learning TouchDesigner in Sven’s class, SudoMagic’s involvement, and any breakthroughs or “aha moments”?

Benjamin Randall: Mashing together a few of Matthew's tutorials with his help got me through this process. The most fundamental aha moment was the discovery of where the cues could live - whether that was internal COMP storage or a JSON file.

 

Download the source code (Benjamin Randall, 2020)

La Ruta Credits

Anna Skidis Vargas | Director
Isaac Gómez | Playwright
Skyler Taten | Stage Manager

Roberto Soto | Assistant Director 
John Erickson | Video Design
Ben Randall | Associate Video + Programming
Emily Novak | Lighting Design

Amber Whatley  | Lighting Design
Sam Wigington | Associate Lighting Design
Tara A. Houston | Scenic Design 
Lowell Bartholomee | Sound
Harold Horsely | Costumes

Jesse J. Sanchez | Composer
David Tolin | Technical Director
Yasmin Zacaria Mikhaiel | Dramaturg
