
Control Room 42 Ushers in the Future of Broadcasting

 

Control Room 42 (CR42), a project from RTBF, the public broadcaster for the French-speaking part of Belgium, gives broadcasting's traditionally hardware-based control room a radical makeover, enabled by TouchDesigner in ways its designer Hugo Ortiz thought impossible only a few years ago. Recipient of the European Broadcasting Union's Technology and Innovation Award 2020, this new software-based control room prototype, which also integrates Artisto for audio and Smode for real-time graphics, brings game-changing innovation to the broadcasting industry.
In the broadcast environment CR42 is lighter, faster and costs up to 10 times less than its hardware-based equivalent. It allows the creation of a workflow that can reduce the number of people required to produce news content to only four, and empowers creatives with a much more intuitive and agile way of working on the fly. We spoke to RTBF Innovation Officer and live-production nerd (his words!) Hugo Ortiz about how what started as a user-experience project, with the idea of uniting all control room technicians behind one universal user interface, became a reality with TouchDesigner in less than a year. "We wanted the control room itself to be the tool. You enter the control room, you interact with one tool - it’s the CR." Impressive and exciting, CR42 may well be the answer to everything!
CR42 is built in TouchDesigner, which serves to integrate UIs, controls, the backend, video switching, PiP effects, and video and audio playout. It drives live audio mixing and FX in On-Hertz Artisto, and real-time artistic graphics, titles and studio screen generation in Smode.

 

Hardware, software and protocols that CR42 talks to:
On-Hertz Artisto: Audio Core software sound engine: WebSockets
Smode Tech Smode: real-time graphics software: OSC
CGI OpenMedia: broadcast newsroom software: MOS protocol (rundown)
Netia DDO: radio playout software: SignalR API
ROSS Furio: robotic camera: Telnet API
Yamaha TIO1608: Dante audio stagebox, preamp gain control: UDP API (reverse engineered)
MA Lighting grandMA2 onPC: lighting control software: Telnet API
SKAARHOJ control panels: hardware: TCP API
Fader banks (Asparion & Behringer): RTP-MIDI
Blackmagic ATEM Constellation: hardware switcher: ATEM API
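Most of these integrations are plain network protocols, so the glue code can stay small. As a rough, hypothetical sketch (not CR42's actual code), here is how a Python process might send an OSC cue to Smode and a JSON command to Artisto over a WebSocket; the hosts, ports, OSC address and message schema are placeholders that would depend on each product's configuration.

```python
# Minimal sketch of talking to two of the listed endpoints from Python.
# Hosts, ports, OSC addresses and the JSON payload are illustrative only.
import json

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc
import websocket                                  # pip install websocket-client

# OSC cue to the graphics engine (address pattern is hypothetical)
smode = SimpleUDPClient("10.0.0.20", 8000)
smode.send_message("/cue/lower_third/take", 1)

# JSON command to the audio engine over a WebSocket (schema is hypothetical)
ws = websocket.create_connection("ws://10.0.0.10:9000")
ws.send(json.dumps({"target": "fader/presenter1", "gain_db": -6.0}))
print(ws.recv())  # read an acknowledgement, if the engine sends one
ws.close()
```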

 

Derivative: Hugo, first of all, massive congratulations! Give us a little background as to how CR42 came about.

Hugo Ortiz: Broadcast is a challenged industry and we need to drastically move forward in the way we approach the production of content. Today it’s mostly a very heavy process involving a lot of very expensive fixed-function hardware boxes and tools, with workflows that are fixed for 10 years as a result…

The power of computers and GPUs can now challenge this. I’ve spent years searching for a software platform to experiment quickly with new ideas for live production using software processing and UIs. The broad vision was to create a whole TV production control room with 100% custom touchscreen-based UIs that are vastly simpler and easier for the creatives than what they are used to, by abstracting away the notion of different pieces of equipment. TouchDesigner enabled this project to an extent I didn’t think was possible a few years earlier. It’s funny how something that doesn’t come from our industry can be so useful to it.

Derivative: It's a huge undertaking... what was your process?

Hugo Ortiz: In less than a year, in 2019-2020, I went from pulling my hair out trying to make TouchDesigner do what I wanted and getting fluent in Python, to loving it. I created multiple backend TouchDesigner processes that run on servers for video switching, video and audio playout, image processing, control, APIs and production automation. These processes talk to the users’ UIs (TouchDesigner touch UIs on Microsoft Surface Studio 2 machines and big 4K TVs) through multicast NDI and OSC. The project took off with creative users because changing anything to their liking only takes a few hours. We’d do a test production, debrief and list the changes they would like, then repeat the next day with all the changes implemented.
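To picture that split between backend processes and touch UIs, here is a hedged, standalone sketch of the receiving side: a tiny OSC server standing in for one backend process, mapping an incoming address from a UI surface to a switcher action. The address pattern and the handler are hypothetical, and in CR42 this logic lives inside TouchDesigner processes rather than a separate script.

```python
# Hypothetical stand-in for one backend process: listen for OSC commands
# coming from the touch UIs and dispatch them to switcher logic.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer  # pip install python-osc


def take_camera(address, *args):
    """Placeholder for the real video-switching call (e.g. to the ATEM)."""
    cam = int(args[0])
    print(f"{address}: cutting program to camera {cam}")


dispatcher = Dispatcher()
dispatcher.map("/switcher/take", take_camera)  # address pattern is made up

server = BlockingOSCUDPServer(("0.0.0.0", 7000), dispatcher)
server.serve_forever()  # each backend process runs its own loop like this
```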

Derivative: Can you explain the parts of CR42 and how a typical news show plays out in this system? 

Hugo Ortiz: There are “production assets”, which include videos to be played, pictures, titles, audio jingles, live feeds etc., and there is a rundown/playlist that is extracted from a “newsroom system” where the rundown is created by the journalists. The show is then a mix of automation driven by the rundown and live improvisation to keep up with news that we can’t really predict.
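One hypothetical way to picture that mix: the rundown is an ordered list of items, each pointing at its assets, and the automation steps through it while the director stays free to jump to any story. A minimal sketch, with invented field names:

```python
# Hypothetical model of a rundown: the automation walks it in order,
# but the director can jump to any story when the news dictates.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class RundownItem:
    slug: str                          # story identifier from the newsroom system
    video: Optional[str] = None        # clip to play out, if any
    title: Optional[str] = None        # lower third to fire in the graphics engine
    live_source: Optional[str] = None  # e.g. a studio camera or remote feed


@dataclass
class Rundown:
    items: list = field(default_factory=list)
    position: int = -1

    def take_next(self) -> RundownItem:
        """Automation path: advance to the next story in order."""
        self.position += 1
        return self.items[self.position]

    def jump_to(self, slug: str) -> RundownItem:
        """Improvisation path: the director jumps straight to a story."""
        self.position = next(i for i, item in enumerate(self.items) if item.slug == slug)
        return self.items[self.position]
```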

Derivative: You talked in a Q&A session about real-time processing in the cloud. Is that key to the future?

Hugo Ortiz: We used TouchDesigner in GPU-powered Windows 10 virtual machines in the cloud (provider: Paperspace) for development purposes. In the current Control Room 42 all the I/O video feeds are local and SDI-based, but we've done successful experiments with SRT video streaming. The addition of the Web Server DAT is amazing for this. Now I dream of a WebRTC TOP to be able to easily provide very low latency video and audio feeds from the cloud to browsers and back :) [ed: stay tuned]
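For context, the Web Server DAT lets a TouchDesigner process answer HTTP and WebSocket requests straight from Python callbacks, which is what makes it so useful for cloud-hosted control. The sketch below follows the default callback shape; the /status route and its JSON payload are invented for illustration and are not CR42's API.

```python
# Sketch of a Web Server DAT callback exposing a tiny control endpoint.
# The /status route and its JSON payload are hypothetical, not CR42's API.
import json


def onHTTPRequest(webServerDAT, request, response):
    if request['uri'] == '/status':
        response['statusCode'] = 200
        response['statusReason'] = 'OK'
        response['data'] = json.dumps({'programCamera': 2, 'onAir': True})
    else:
        response['statusCode'] = 404
        response['statusReason'] = 'Not Found'
    return response
```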

Derivative: The open knowledge sharing around development in your organization is commendable. We met Floris Daelemans of VRT at the TouchDesigner Professional Alliance 2020 in Amsterdam, who impressed us with the work they had been doing with TouchDesigner in broadcasting.

Hugo Ortiz: I think doing innovation alone would be sad… We try to share as much of our experience and data as we can. VRT is RTBF's sister company, the public broadcaster for the Flemish side of Belgium. We share the same building! Floris Daelemans and I are collaborating under the VideoSnackbarHub initiative; we try to share our experiments with the world :-) We’ve created LIVA, a GitHub repository where we share small components made for broadcast production. We also share our experiments and TouchDesigner findings with the broadcast industry in cross-media seminars like the ones from the EBU.

Derivative: What is the general architecture of CR42?

Hugo Ortiz: CR42 has evolved a lot. Basically it's now MVC (Model-View-Controller): the system is strictly composed of UIs separated from a central, UI-less control system. It's represented by the schematic in the video.
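One way to read that: the control core owns all production state and pushes changes out, while the UIs are thin views that render what they are told and send back user intents. The class below is a generic illustration of that pattern, not CR42 code.

```python
# Generic illustration of the UI-less control core: it owns the state and
# notifies registered views (the touch UIs) whenever something changes.
from typing import Callable


class ControlCore:
    def __init__(self):
        self.state = {"programCamera": 1, "onAir": False}
        self._views: list = []

    def register_view(self, callback: Callable[[dict], None]) -> None:
        self._views.append(callback)

    def handle_intent(self, key: str, value) -> None:
        """UIs send intents; only the core mutates state."""
        self.state[key] = value
        for view in self._views:        # push the new state to every UI
            view(dict(self.state))


core = ControlCore()
core.register_view(lambda s: print("UI render:", s))
core.handle_intent("programCamera", 3)
```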

Derivative: And what about LIVA? Is it a set of components that can be assembled in a base architecture and can it be seen as a scalable environment applicable to home and professional studio setups alike?

Hugo Ortiz: LIVA is made up of smaller and simpler components that can work on their own, with processing and UI integrated. This way they are easily reusable and shareable. They are meant to help create broadcast applications. This includes components like VU meters, audio faders, a video switcher system... They run on local systems as well as cloud GPU VMs, so they are quite versatile.
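In TouchDesigner terms, a component like that is typically a COMP with its UI inside and a small Python extension class for behaviour. The sketch below shows that general pattern for a hypothetical audio fader; the parameter and method names are invented and do not come from the actual LIVA repository.

```python
# Hypothetical extension class for a self-contained audio fader component.
# In TouchDesigner this class would be attached to the component's COMP;
# the 'Gain' custom parameter and the forwarding note are invented examples.


class AudioFaderExt:
    def __init__(self, ownerComp):
        self.ownerComp = ownerComp  # the COMP this extension lives on

    @property
    def Gain(self) -> float:
        return self.ownerComp.par.Gain.eval()

    def SetGain(self, value: float) -> None:
        """Clamp and apply a new gain, then let the host system react."""
        value = max(-60.0, min(10.0, value))
        self.ownerComp.par.Gain.val = value
        # A real component might forward this to the audio engine here,
        # e.g. over OSC or a WebSocket, as in the earlier sketches.
```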

Derivative: What are you thinking of exploring in the future? 

Hugo Ortiz: After one and a half years of development, Control Room 42 is gaining recognition externally and internally, where it is starting to serve as a source of design inspiration for future production control rooms at RTBF. Currently we are broadening its usage studies to radio production workflows. We are also looking at things like AI for automatic framing of people, automatic color shading and automatic directing. New SDKs like the Nvidia Broadcast Engine, which uses RTX cores to provide AI video upscaling, audio noise removal and face detection, are things we plan to play with.

Derivative: CR42 could be quite empowering in allowing, as you mention, people to work remotely during times of Covid. This type of control room could also, more generally speaking, give a voice to people with fewer resources or in more remote parts of the world. Sort of a democratization of broadcasting?

Hugo Ortiz: Yes, it is! This trend has been going on for years with technologies from the live streaming world for smaller productions. Here we can offer the same advantages to large, scalable production infrastructures.