Derivative: Who instigated the show and how did it come about?
Noah Norman: Venues and musicians are struggling. The events business has been all but dead for months. NYC’s live music scene is hanging on by a thread. But the community around Elsewhere and the emerging artists they champion are as hungry as ever for connection, for new experiences, and for live music.
With that in mind the production team - Elsewhere, Peter, Dark Igloo and myself - stepped back to consider which aspects of the experience of seeing music live could be brought into a 'virtual', fully-digital experience at home, and which approaches would open up genuinely new opportunities. We wanted to make sure we were creating something that played to the strengths of our chosen medium, not just an attempt at a facsimile of being in-person at a venue, however realistic or stylized.
We wanted to create something entirely new, and to take advantage of the opportunity presented by the lockdowns - the uniquely high tolerance for screen time and the attention to detail that comes from being deprived of novel experiences and spectacle - to help transport viewers somewhere else with the help of their participation.
We think Twitch represents a new medium in this regard - it’s a platform that truly enables bidirectional communication and one where the conversation and activities of the viewers aren’t just a sideshow but are really the point of what’s happening. That’s a great fit for Elsewhere, because Elsewhere has always been, and continues to be, a community.
Derivative: Can you take us through the staggering system you built and explain why all the development is in TouchDesigner?
Our show application is a growing TouchDesigner beast. It’s a video pipeline with an A/B switcher as its dominant metaphor, but that core component obscures a lot of complexity and processing.
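The A/B switcher metaphor can be reduced to a toy sketch: two scene buses and a crossfader that mixes between them. In TouchDesigner this job would fall to something like a Cross TOP operating on textures; the pure-Python version below is only illustrative and is not the show's code.

```python
# Illustrative A/B switcher core: mix two scene buses with a crossfader.
# In practice this happens per-pixel on the GPU (e.g. a Cross TOP);
# here each "bus" is just a list of channel values.

def ab_mix(a, b, fader):
    """Mix two equal-length buffers; fader 0.0 = all A, 1.0 = all B."""
    t = min(max(fader, 0.0), 1.0)  # clamp the fader to its travel
    return [av * (1.0 - t) + bv * t for av, bv in zip(a, b)]

# Halfway between an all-black and an all-white pixel pair:
print(ab_mix([0.0, 1.0], [1.0, 0.0], 0.5))  # → [0.5, 0.5]
```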
At the top of the pipeline are two Notch TOPs that do our chroma keying and use NVIDIA's AR SDK to do face tracking, slinging that telemetry out over OSC back into the Touch context so we can punch in on faces to vary the composition of our shots.
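As a rough sketch of the punch-in idea described above (not the show's actual code), normalized face-tracking telemetry arriving over OSC could be turned into a crop window like this; the function name, padding factor, and coordinate conventions are all assumptions.

```python
# Hypothetical helper: turn normalized face-tracking telemetry
# (center + size, 0..1 range, as it might arrive over OSC from the
# AR SDK stage) into a crop rectangle for punching in on a face.

def face_to_crop(cx, cy, face_w, face_h, pad=2.5):
    """cx, cy: normalized face center; face_w/face_h: normalized size.
    Returns (left, bottom, width, height), clamped inside the frame."""
    w = min(1.0, face_w * pad)   # widen the crop beyond the face itself
    h = min(1.0, face_h * pad)
    left = min(max(cx - w / 2, 0.0), 1.0 - w)     # keep crop in frame
    bottom = min(max(cy - h / 2, 0.0), 1.0 - h)
    return (left, bottom, w, h)

# A face dead-center, a quarter of the frame wide, padded 2x:
print(face_to_crop(0.5, 0.5, 0.25, 0.25, pad=2.0))  # → (0.25, 0.25, 0.5, 0.5)
```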
From there every scene in the A/B engine is its own modular compositor with its own set of rules. Some are simply flat compositing of sources over backgrounds, but some are 3D, some are in a weird middle, and all are composed using colorful, surreal original graphics and animations made by our friends at Dark Igloo and Felt.
On top of the textures composed by the main switcher/scene composer pipeline are a number of augments and ‘gags’ connected to the Twitch API. There are screen-space effects, interactive simulations, a particle system disguised as a ticker, a voting engine, network control of an Ableton Live instance, up to 7 cameras at once, live video calling, multiple record decks, and control of physical gags within the studio, all driven by the main TouchDesigner process.
Our connection to Twitch listens for PubSub events like Bits, Channel Points, and Subscribers, and we've made custom Channel Points events for the show. We hit the Twitch API for stats that show up in places like the ticker and the credits, and to inform how we level-set on things that rely on certain amounts of participation like voting and group bits goals. Lastly, we're of course connected to the Twitch chat so we can listen for '!' commands (which are not a built-in Twitch thing but are a convention among interactive channels) and so we can reply as Elsewhere in the chat from Touch. Twitch handling is in its own Touch process and pipes back and forth all the data and events via strings over OSC.
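The '!' command convention mentioned above can be sketched in a few lines. Twitch chat is delivered over IRC, so a minimal parser just pattern-matches PRIVMSG lines; the channel, user, and command below are made up, and in the show the parsed result would then travel between processes as strings over OSC rather than being handled in place.

```python
# Minimal sketch of parsing '!' commands from raw Twitch IRC chat lines.
# The '!' convention is community practice, not a Twitch API feature.
import re

# ':<user>!<user>@<host> PRIVMSG #<channel> :<message>'
PRIVMSG = re.compile(r"^:(\w+)!\S+ PRIVMSG #\w+ :(.*)$")

def parse_bang_command(raw_line):
    """Return (user, command, args) for a '!' chat command, else None."""
    m = PRIVMSG.match(raw_line.strip())
    if not m:
        return None
    user, message = m.group(1), m.group(2)
    if not message.startswith("!"):
        return None
    parts = message[1:].split()
    if not parts:
        return None
    return (user, parts[0].lower(), parts[1:])

line = ":dancer42!dancer42@dancer42.tmi.twitch.tv PRIVMSG #elsewhere :!vote b"
print(parse_bang_command(line))  # → ('dancer42', 'vote', ['b'])
```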
In the studio itself, that same TouchDesigner instance drives two screens for the operator (me); a UHD screen for our host Peter, packed with heads-up info and video feeds as well as a Web Browser COMP render of the Twitch chat; and four screens in the guests' space, each presenting a different view of what's happening in the show, messaging from backstage, and prompts from the viewers.
Peter and I each have a Stream Deck controller: a little grid of buttons with LCD screens behind them that I can rapidly remap to include new ideas as we come up with them. That's our cheapo control deck, and it's way better than control surfaces costing ten times as much, because the backlit screens make it easy to read and the software makes it easy to rearrange on the fly.
Having done quite a few shows that aspired to this level of juicy digital interactivity and aliveness, I can say that attempting this kind of thing with hardware would have meant a budget orders of magnitude greater than we had to spend, and a small army onsite during the shoot, which would have been entirely impractical during COVID.
Trying to do this entirely in software would have required either an ongoing and increasingly complex series of crazymaking compromises and brittle connections between off-the-shelf software, or far more time than we had to build it in something less nimble than TouchDesigner, like openFrameworks or Cinder.
I’m the only developer on the project. We’re using one machine. We have one Touch license. And on-set we’ll have a COVID-friendly production of one person per room in five rooms.
It’s been refreshing to show this to the team on this job. We’re all old friends but they’ve never gotten to see what TouchDesigner is really capable of in realtime, so I’ve been able to surprise them over and over again by bringing to life the craziest stuff we can collectively come up with and by repeatedly saying ‘the answer is almost always yes.’
ELSEWHERE SOUND SPACE CREDITS
Created by: Elsewhere Studios
Hosted by: Peter Smith
Technical Direction: Noah Norman (Hard Work Party)
Art Direction: Dark Igloo
Creative Direction: Jake Rosenthal (Elsewhere) & Noah Norman
Produced by: Jake Rosenthal (Elsewhere)
Creative Writing: Ashok "Dap" Kondabolu, Dark Igloo, Peter Smith
Studio Direction: Chris Madden
Sound Design: Chris Madden
Live Editing: Noah Norman
Talent Booking: Rami Haykal (Elsewhere)
Video Art: Devon Moore (FELT Zine)
Production Consult: Chris Willmore
Special Thanks: Will Adams, Meredith Suzuki