After years of developing customized scheduling and content management systems for exhibitions and mediatecture (media surfaces integrated into architecture, e.g. media facades), we have developed the Intermedia Prism as a product to fit all those “special needs”. With Touchdesigner as the backbone for complex media playout scenarios, our goal is to create an abstraction for complex media surfaces and enable users with minimal technical knowledge to manage content and create schedules with ease – without the chance of breaking anything.
For artists we create new possibilities: various media surfaces can be controlled in a holistic manner, while there is still an easy way to schedule different content depending on the time, season or spontaneous events. Creating room-filling, interchangeable experiences becomes easy!
How is the system designed?
Our central piece is a web application written in Elixir. Its easy-to-use interface aims to fill a gap between signage systems and media servers: in permanent installations with complex media surfaces (e.g. LED walls), signage systems are often not flexible enough, while media servers, game engines and real-time renderers often require specially trained personnel to manage the system. With Prism we aim to make it easy and safe to upload new media and change schedules – without having to constantly think about how a concrete playout system might work. We can even prevent mistakes, such as scheduling pre-distorted content (e.g. for video mappings) on a surface with the wrong technical format. But how do we tackle this, and how is it connected to Touchdesigner?
Each asset a user uploads is validated and can be assigned a specific format for “special” screens. Via a shared storage, the file can then be accessed by the playout servers.
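The upload-time check could be sketched like this. A hypothetical illustration only: the `SurfaceSpec` type, field names and the example resolution are made up, not the actual Prism implementation.

```python
# Hypothetical sketch of upload-time asset validation: an asset may only
# be assigned to surfaces whose resolution and technical format it was
# validated for. All names and numbers here are illustrative.

from dataclasses import dataclass

@dataclass
class SurfaceSpec:
    name: str
    width: int
    height: int
    format_tag: str  # e.g. "predistorted-mapping" for video-mapped surfaces

def can_assign(asset_width, asset_height, asset_format_tag, surface):
    """Reject assets whose resolution or format doesn't match the surface."""
    return (asset_width == surface.width
            and asset_height == surface.height
            and asset_format_tag == surface.format_tag)

facade = SurfaceSpec("facade", 1088, 1920, "predistorted-mapping")
print(can_assign(1088, 1920, "predistorted-mapping", facade))  # True
print(can_assign(1920, 1080, "standard", facade))              # False
```

This is the mechanism that lets the web interface refuse a wrong assignment before anything reaches a playout server.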
The playout for complex surfaces is handled by one or multiple servers running Touchdesigner (either in a main/backup setup or as an expansion/cluster). Our base patch looks quite empty before startup, containing only an API tox for communication with the backend as well as startup and logging functionality. Everything needed for playout is loaded dynamically from config files, which allows us to configure, adapt and scale the application quickly. For custom scenarios we can build an “Output” tox that might implement layers, color corrections, crops and whatever else is needed to adapt content for playout on one or more video outputs. The output type (e.g. windowCOMP or directdisplayoutTOP) can also be specified in the config.
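To make the config-driven startup idea concrete, here is a minimal sketch. The JSON shape is an assumption for illustration; inside Touchdesigner the loop body would load and wire the actual Output tox, which we only mark with a comment here.

```python
# Illustrative config format for dynamically loaded outputs. The schema
# is made up; in the real patch each entry would result in an Output tox
# being loaded and attached to its video device.

import json

CONFIG = """
{
  "outputs": [
    {"name": "facade",    "tox": "outputs/facade.tox",
     "type": "windowCOMP",          "layers": 3},
    {"name": "foyer_led", "tox": "outputs/foyer_led.tox",
     "type": "directdisplayoutTOP", "layers": 2}
  ]
}
"""

def parse_outputs(raw):
    cfg = json.loads(raw)
    outputs = []
    for entry in cfg["outputs"]:
        if entry["type"] not in ("windowCOMP", "directdisplayoutTOP"):
            raise ValueError(f"unknown output type: {entry['type']}")
        outputs.append(entry)
    return outputs

for out in parse_outputs(CONFIG):
    # In Touchdesigner: load out["tox"] here and configure its output type.
    print(out["name"], "->", out["type"])
```

Keeping the patch empty and driving everything from such a file is what makes scaling to a new venue a config change rather than a patching session.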
Outputs can also implement multiple layers to create overlays etc. and handle data I/O (e.g. DMX-output via ArtNet or brightness-control via MQTT). This is basically our playground to quickly adapt the system to different needs.
Most API commands (e.g. preloading content) trigger actions via Python scripting, while we went for Touchdesigner OPs for everything time-critical (like fading in content). The state of the application (playing and preloaded assets for each layer of each output) can be read via our REST API, which is not only used in Elixir to confirm the playout of the correct file, but also comes in handy when debugging.
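The confirmation step might look roughly like this. The JSON shape of the state response and all file names are assumptions for illustration, not the actual API.

```python
# Hedged sketch of playout confirmation against the REST state endpoint.
# The response structure (outputs -> layers -> playing/preloaded) is an
# assumption about what such an endpoint could return.

def confirm_playout(state, output, layer, expected_file):
    """Return True if the given layer of the given output reports the
    expected file as currently playing."""
    layer_state = state["outputs"][output]["layers"][layer]
    return layer_state["playing"] == expected_file

# What a GET on the state endpoint might have returned:
state = {
    "outputs": {
        "facade": {
            "layers": {
                "base": {"playing": "art_loop_04.mov",
                         "preloaded": "ad_spot_12.mov"},
            }
        }
    }
}

print(confirm_playout(state, "facade", "base", "art_loop_04.mov"))  # True
```

The same read-only endpoint doubles as a debugging window: one request shows what every layer of every output is actually doing.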
Example 1: Media Façade Klubhaus St. Pauli
While we were already heavily involved in planning the media façade of the Klubhaus St. Pauli and writing a custom CMS 10 years ago, we are happy about how much we could improve the user experience with Prism when we replaced the old Pandoras Box servers a few weeks ago. Even though we are (right now) only scheduling “one screen”, the task is quite complex: four different LED products, each with different pixel pitches, color and brightness properties, have to be matched into one large canvas. One of the products didn’t even accept a video signal, which made it necessary to implement a pixel mapping across four DMX universes. To keep the performance impact on the CPU minimal, we went for a solution using a GLSL shader to calculate all DMX values and then use a toptoCHOP. A pity that we finished this project just one month before the first build including POPs arrived…
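The addressing math behind such a pixel mapping can be sketched in a few lines. This is plain Python for illustration, not the GLSL shader itself: it flattens RGB pixels into DMX channels and splits them across 512-channel universes.

```python
# Illustrative pixel-mapping math: flatten RGB pixels into a DMX channel
# stream (r, g, b, r, g, b, ...) and slice it into 512-channel Art-Net
# universes. The shader does the same mapping per pixel on the GPU.

def pixels_to_universes(rgb_pixels):
    channels = [c for px in rgb_pixels for c in px]
    return [channels[i:i + 512] for i in range(0, len(channels), 512)]

# 600 RGB pixels -> 1800 channels -> 4 universes, the last one partial.
pixels = [(255, 128, 0)] * 600
universes = pixels_to_universes(pixels)
print(len(universes))      # 4
print(len(universes[-1]))  # 264
```

Note that 512 is not divisible by 3, so a naive slice splits some pixels across universe boundaries; a real mapping would decide per product whether that is acceptable or pad each universe to a pixel boundary.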
A media façade must look good under various weather conditions, so we needed to implement some custom brightness mappings. The LED controllers are managed directly by our admin and automation dashboard for media technology, but the ArtNet signal had to be dimmed inside of Touchdesigner. We chose MQTT to couple our main and backup servers to the admin dashboard, which implements custom mapping automations for each LED product while reading from one sensor and checking its values against a theoretical maximum.
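A minimal sketch of such a per-product mapping, assuming a single ambient-light sensor: the reading is clamped against a theoretical maximum and run through a product-specific curve. All numbers and the curve shape are illustrative, not the deployed automation.

```python
# Hypothetical brightness mapping: clamp the sensor value against a
# theoretical maximum, then map it through a per-product response curve
# to a dimming factor. gamma differs per LED product; values are made up.

def map_brightness(lux, max_lux, min_factor=0.05, gamma=1.0):
    lux = max(0.0, min(lux, max_lux))   # never trust a raw sensor value
    t = (lux / max_lux) ** gamma        # per-product response curve
    return min_factor + (1.0 - min_factor) * t

# Two products reacting differently to the same sensor reading:
print(map_brightness(50_000, 100_000, gamma=1.0))
print(map_brightness(50_000, 100_000, gamma=2.0))
```

The resulting factor would then be applied to the ArtNet output inside Touchdesigner, while the dashboard publishes updated values over MQTT to both main and backup.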
Content uploaded to Prism via the web interface goes to a central storage (NAS), from where it can be loaded into Touchdesigner. However, our daily users won’t ever interact with Touchdesigner directly but only access the web interface, which not only makes the system easy to work with and accessible from almost everywhere, but also really reliable, as users don’t really have a chance of breaking anything. And if for whatever reason you somehow got the system to play a wrong video, it will fix itself within seconds without any manual action required.
By the way: as media art gets mixed with ads on the Klubhaus façade, we have a proof-of-play feature to keep track of actual, API-confirmed playouts, not just the calendar schedule.
Example 2: Modern Business Foyer
In the modern foyer of a German global business player we faced an interesting challenge: 100+ displays throughout the building were accompanied by LED walls and custom-made “brickwalls”. The artistic approach was to combine the media surfaces to play “the same” content at the same time – in an artistic, not a technical sense. To embrace the digital identity and uniformity of large open rooms and the whole building, different media surfaces need to play content together. With screen formats, orientations and locations differing widely, every content consists of different videos with different resolutions, crops and aspect ratios. This would make changing the content a major task, but it was clearly a task that had to be done, as no one wanted to even imagine a great and innovative new building playing a boring video loop 24/7. Our users want to schedule content without having to think about which video file goes where. If you schedule “Focus 02”, you should see “Focus 02”. If the LED wall needs a different file, that should not matter to you while scheduling. However, when you upload an asset, you probably want to make sure that it can only be assigned to the right media surfaces.
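The "one logical content, many files" idea can be sketched as a simple resolution step. Every name and file below is made up for illustration; only the principle – schedule a label, resolve a file per surface – mirrors what the text describes.

```python
# Sketch of resolving a scheduled content name to the correct file per
# media surface. Content names, surfaces and file names are hypothetical.

CONTENT_SETS = {
    "Focus 02": {
        "led_wall":  "focus02_ledwall_3840x1280.mov",
        "brickwall": "focus02_brick_896x448.mov",
        "displays":  "focus02_display_1920x1080.mov",
    },
}

def resolve(content_name, surface):
    files = CONTENT_SETS[content_name]
    if surface not in files:
        raise KeyError(f"'{content_name}' has no file for surface '{surface}'")
    return files[surface]

print(resolve("Focus 02", "led_wall"))
```

The scheduler only ever sees “Focus 02”; the per-surface lookup (and the refusal when a file is missing) happens underneath, which is exactly what keeps scheduling safe for daily users.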
Profiting from the scalability and customizability of Touchdesigner, we could even create 16:9 overlays on the large LED surfaces, which come in handy for events etc., as you can show content in a standard aspect ratio on a beautiful background that connects to and follows you through the whole building.
Even dynamic interactions between media surfaces are possible, as we can use selectTOPs to send video data across different outputs. Several videos for the brickwalls, for example, use the content from the LED wall next to them as a base “ambient light” layer. On top of that, you can schedule videos with a specific embedded blend mode, which allows for an incredible number of content combinations and looks and even improves scalability, as the brickwall can adapt to other content easily. This again underlines the approach of thinking of media surfaces holistically for a whole room, where one change reflects across everything to create a coherent overall picture.
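For readers unfamiliar with blend modes, the per-pixel math is small. The mode names below mirror common compositing formulas (values normalized to 0..1); they are a generic illustration, not a list of the modes actually embedded in the foyer content.

```python
# Minimal per-pixel blend sketch: the scheduled brickwall video ("top")
# is combined with the ambient layer fed from the neighbouring LED wall
# ("base"). Standard compositing formulas, shown here for illustration.

def blend(base, top, mode):
    if mode == "multiply":
        return base * top
    if mode == "screen":
        return 1.0 - (1.0 - base) * (1.0 - top)
    if mode == "add":
        return min(1.0, base + top)
    raise ValueError(f"unknown blend mode: {mode}")

print(blend(0.5, 0.5, "multiply"))  # 0.25
print(blend(0.5, 0.5, "screen"))    # 0.75
```

Because the blend happens per pixel at playout time, the same brickwall video produces a different look over every ambient layer – that is where the combinatorial variety comes from.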
The result is a room that can be switched between different modes (Lounge, Focus, Daily, Party, …) that have been curated by a well-known media artist and can be expanded easily without breaking the artistic intent.