The purpose of TouchEngine is to allow people to tightly integrate all of the functionality of TouchDesigner and projects made with TouchDesigner into a larger system or framework. Many companies have playback systems that they already use in-house. TouchEngine allows those playback systems to be easily augmented with all of the functionality of TouchDesigner by running .tox (TouchDesigner component) files via the TouchEngine API.
The existing Engine COMP in TouchDesigner is an example of the TouchEngine API in use. It allows TouchDesigner to be augmented by other instances of "headless" TouchDesigner running on the same machine. With the Engine COMP, the host application is TouchDesigner itself: it spawns TouchEngine processes and runs them as sub-processes of TouchDesigner.
Where OSC and Spout were workable solutions for using TouchDesigner with another piece of software, TouchEngine greatly streamlines the whole process. First, with the TouchEngine API the host application has more control: it can start, cook, monitor, reload, terminate, and restart the TouchEngine process. Second, it can duplicate, parameter for parameter, the user's .tox component in the host application's user interface. Third, the input/output CHOP, TOP and DAT data streams of your .tox component are set up automatically between the host and the engine, so you can link them directly with their counterparts in the host software.
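The host-side lifecycle described above — start, cook, monitor, reload, terminate, restart — can be modeled with a minimal sketch. This is an illustrative Python state machine, not the real TouchEngine C API; the class and method names here are made up for clarity.

```python
from enum import Enum, auto

class EngineState(Enum):
    STOPPED = auto()
    READY = auto()

class EngineHost:
    """Hypothetical host-side wrapper modeling the TouchEngine lifecycle
    a host application controls: start, cook, reload, terminate."""
    def __init__(self, tox_path):
        self.tox_path = tox_path
        self.state = EngineState.STOPPED
        self.cook_count = 0

    def start(self):
        # In a real host this would spawn the TouchEngine process
        # and load the .tox component before becoming ready.
        self.state = EngineState.READY

    def cook(self):
        # Ask the engine to cook one frame; only valid when ready.
        if self.state is not EngineState.READY:
            raise RuntimeError("engine not ready")
        self.cook_count += 1

    def reload(self):
        # Tear down and restart with the same .tox file.
        self.terminate()
        self.start()

    def terminate(self):
        self.state = EngineState.STOPPED

host = EngineHost("project.tox")
host.start()
host.cook()
host.reload()
print(host.state.name)  # READY
```

The point of the sketch is the control flow: the host, not the engine, decides when the component loads, cooks, and restarts.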
You can learn more and download the TouchEngine API here.
Let’s look at Front Pictures’ experience in their implementation of TouchEngine for Screenberry.
Derivative: Please give us a little background about Front Pictures and the work you do.
Front Pictures: With 40 people on board, Front Pictures is a team of digital artists, creative technology experts, and software developers. The company was founded in 2003 as a 3D graphics and visualization studio. In 2005, we got our first project in show business. The goal was to create content for a record-breaking 60x8m screen at the Djuice Nokia Video Drive party. That project became a turning point for us. After seeing our visuals on such a huge canvas, we knew there was no turning back. No conventional screens anymore!
From that moment on, our focus shifted to stage graphics production, interactive installations, and live show control for concerts, parties, and corporate events. And it was a perfect moment: experiential marketing was booming, clients were looking for new forms of entertainment and were eager to experiment with new media technologies. As our expertise extended beyond stage graphics to include software development and AV engineering, our company evolved into a creative lab for some of the most incredible projects.
Among the most memorable works we'd like to mention are a multimedia gallery with a record-breaking 52 projectors connected to just one server; a unique Rain Dance performance at America’s Got Talent 2015; stage graphics for Jamala’s winning performance at the Eurovision Song Contest 2016; and a projection system for Space 360, South Korea’s first spherical projection theater.
Recently, we teamed up with Red Rabbit Entertainment and Profi Innovations to create The Escape stage performance, which seamlessly blends the real and virtual worlds. The show went viral, scoring over 100 million views online. We named this project Antigravity Show and traveled the world with it. The performance at CES 2020 was our last stop before the pandemic hit.
We are also the co-organizers of the Kyiv Lights Festival, an international festival of light and media arts, which was founded in 2017.
Another area of our expertise is dome theaters and planetariums. Back in 2010, we started to work with fulldome projection and caused a small revolution in that market with our single server ideology and development of precise auto-calibration. Today, our technology is at the heart of more than 200 dome systems worldwide.
Since 2007, we have been developing our Screenberry media server as a universal show production tool that has absorbed technologies which we have developed over the years. In 2017 we launched it as a stand-alone product.
Now we are thrilled to announce the release of Screenberry version 2.8.0 with TouchEngine integration. The demo version is available for download.
Derivative: Can you tell us about your impetus for making Screenberry and talk about its capabilities?
Front Pictures: Back in 2005, when we started our journey into the world of shows and events, we discovered that most of the existing media servers were cluster-based systems using a ‘one computer per one output device’ approach. In many cases, this resulted in bulky and hard-to-configure solutions, especially when it came to multi-display setups.
We had a very different vision of what a media server should be able to do. We needed a powerful, and at the same time, compact and convenient tool for our events and installations. So we had no choice but to develop it ourselves.
We had the first version of Screenberry in 2007, and by 2009 the software was completely redesigned, inspired by our ‘single server ideology’ and driven by real-life experience.
Since then, we’ve delivered thousands of events and installations using our Screenberry platform, which has evolved organically with each new project. We mostly used a small form-factor server that could be taken on a plane as hand luggage to support an event anywhere in the world the next day. Such mobility provided a real advantage, enabling us to take on projects that would otherwise be impossible due to time constraints.
As the media server was being developed by the same people who were operating it, we were able to add new features quickly, understanding exactly what we needed in terms of real-world solutions and results.
In 2010 we started getting more projects that required mapping on complex surfaces like domes using multi-projector setups. Initially, we considered using manual projection alignment but soon realised that it would be too complicated and too time-consuming. Based on our previous experience in computer vision, we developed our own camera-based automatic calibration system, which is now an integral part of Screenberry.
Our work over the years has gained the attention of other media artists who have begun asking us to use Screenberry in their shows and installations. This encouraged us in 2017 to transform our in-house technology into a product which is now available to the entire industry.
Derivative: What are some of Screenberry's key advantages?
Front Pictures: Screenberry is famous for its single-server ultra-high-resolution playback. Depending on the configuration, you can play back video at up to 16K x 16K @ 30 fps on a single machine. At the heart of this performance is a hybrid graphics engine that makes the most efficient use of the available hardware.
Obviously your projects will not always require such extremely high resolutions, but having this extra performance margin means that you can use more video layers, more processing, more effects, and can also feed more signals into the system. This media pipeline bandwidth capacity margin also plays an important role when you do more advanced compositing.
Screenberry has a rich toolset for manual and automatic projection calibration allowing operators to warp and blend projections onto flat, curved, and dome-shaped screens, panoramas, buildings, and other complex shapes and objects.
Our auto-calibration normally takes only 10-15 minutes to align projectors on complex surfaces with high precision. The system is designed to be flexible and intuitive, allowing operators to quickly re-calibrate the projection when the number of projectors and their positions have changed. This feature is extremely beneficial when setup time is limited.
Screenberry is also a multi-platform media server. It supports Windows and Linux, and can be run on virtually any x86-compatible hardware – from microcomputers to powerful top-notch workstations capable of serving dozens or even hundreds of output devices.
There is a built-in DRM system for playing encrypted content. This feature is very important for commercial theatres and planetariums.
Screenberry is an open platform for the integration of different visual technologies such as Notch, Unreal Engine, Unity 3D, GLSL Shaders, etc. And now via TouchEngine support, we are happy to natively integrate TouchDesigner — the most powerful and flexible real-time graphics tool used by thousands of visual artists and creative technicians around the world.
Derivative: What was your first exposure to TouchDesigner?
Front Pictures: We started using TouchDesigner back in 2009 — it was love at first sight. In our opinion, there are no other tools that could help media artists and technologists prototype and implement even the most advanced ideas so quickly! Over the years, with the help of TouchDesigner, we have created multiple media installations and used it for real-time concert graphics and content production. We have also developed a few stand-alone interactive applications with TouchDesigner at the core, like Meduza360 and Presenter360, which are used by dome theatres around the world.
Derivative: How is TouchEngine improving your workflow?
Front Pictures: Previously we could run Screenberry and TouchDesigner on separate machines and grab the signal via a capture card or via NDI. This approach introduced latency and sometimes reduced quality.
Another option was to run Screenberry and TouchDesigner on one machine, but this required shutting down one app to launch another one. In this way we could not use Screenberry functionality while TouchDesigner was running and vice versa. Also, each time we needed to re-calibrate complex projection (e.g., in a dome), we had to export calibration maps from Screenberry to TouchDesigner.
Now with TouchEngine you can run TouchDesigner projects right inside Screenberry with maximum efficiency and minimum latency. We have integrated TouchDesigner as a TouchEngine node, and Screenberry can now run TouchDesigner files natively.
From now on it will be possible to use render and data outputs of the TouchEngine node in the Screenberry pipeline, feed Screenberry content into the TouchEngine node, and even pass it through the TouchEngine node for processing. The number of combinations and scenarios is virtually unlimited!
For example, you can apply projector calibration maps and transformations to TouchDesigner output on the fly. TouchDesigner render outputs can be added as media items to playlists, placed on timelines together with other media, cross-faded, and scheduled along with other events.
This is essential for virtual production studios to achieve in-camera visual effects, where real-time graphics are combined with camera tracking within an LED volume. And again, the number of possibilities is virtually unlimited: you can combine and sync TouchDesigner, Notch, Unreal, scientific visualization apps, and more.
This perfectly fits into our “single server ideology” where one machine can output video to multiple devices without getting into complicated clustered systems.
Derivative: How easy was it to integrate?
Front Pictures: The thorough documentation and detailed comments describing the internal mechanics of library functions were very helpful and so it was a relatively easy task for our team.
Key functions such as TouchEngine instance creation, TOX file loading, and parameter value retrieval feature descriptive error logging, which makes troubleshooting internal errors quick and easy.
The API has convenient methods for getting and setting parameters. This makes it easy to create a parameter, set its value, configure additional options (e.g. extend-before/after behavior in CHOPs), and send the parameter.
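The pattern described — create a parameter, set its value, and configure options such as extend behavior in one place — can be sketched as follows. This is a hypothetical Python model for illustration; the class, method, and mode names are made up, not the real TouchEngine API.

```python
class ChopParameter:
    """Hypothetical model of a CHOP parameter: a value plus extra
    options such as extend-before/after behavior. Illustrative only."""

    EXTEND_MODES = ("hold", "zero", "cycle")  # assumed mode names

    def __init__(self, name, value=0.0,
                 extend_before="hold", extend_after="hold"):
        self.name = name
        self.value = value
        self.extend_before = extend_before
        self.extend_after = extend_after

    def set(self, value, **options):
        # Update the value and any extend options in one call,
        # validating the options before accepting them.
        for key, mode in options.items():
            if key not in ("extend_before", "extend_after"):
                raise KeyError(f"unknown option: {key}")
            if mode not in self.EXTEND_MODES:
                raise ValueError(f"unknown extend mode: {mode}")
            setattr(self, key, mode)
        self.value = value
        return self

    def get(self):
        return self.value

speed = ChopParameter("speed")
speed.set(2.5, extend_after="cycle")
print(speed.get(), speed.extend_after)  # 2.5 cycle
```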
Inheritance is used to build a hierarchy of objects and define the relationships between them: all values coming from TouchEngine descend from the same base class, with uniform management, which gives a clear picture of the object hierarchy.
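The single-base-class idea can be illustrated with a tiny sketch. The class names below are invented for this example; only the pattern — one common base so every engine value is handled uniformly — comes from the description above.

```python
class EngineValue:
    """Illustrative common base class: every value arriving from the
    engine shares one management interface (names are hypothetical)."""
    def __init__(self, name):
        self.name = name

    def describe(self):
        # Uniform introspection available on every concrete subtype.
        return f"{type(self).__name__}({self.name})"

class TextureValue(EngineValue):
    pass  # e.g. a TOP output

class ChannelValue(EngineValue):
    pass  # e.g. a CHOP output

values = [TextureValue("out1"), ChannelValue("lfo")]
# The host can treat all incoming values alike, regardless of type:
print([v.describe() for v in values])  # ['TextureValue(out1)', 'ChannelValue(lfo)']
```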
It is also worth noting the library's async callback system, which notifies the host of parameter changes and other events.
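An async callback system of this kind typically follows a post-and-drain pattern: the engine side posts events, and the host drains the queue and invokes registered handlers. The sketch below is a generic Python illustration of that pattern, not the library's actual implementation.

```python
from queue import Queue

class CallbackDispatcher:
    """Illustrative async callback pattern: the engine thread posts
    events, the host drains them and invokes registered handlers."""
    def __init__(self):
        self._events = Queue()
        self._handlers = {}

    def on(self, event_type, handler):
        # Register a handler for a named event type.
        self._handlers.setdefault(event_type, []).append(handler)

    def post(self, event_type, payload):
        # Called from the engine side; thread-safe via Queue.
        self._events.put((event_type, payload))

    def drain(self):
        # Called from the host side, e.g. once per frame.
        while not self._events.empty():
            etype, payload = self._events.get()
            for handler in self._handlers.get(etype, []):
                handler(payload)

changes = []
dispatcher = CallbackDispatcher()
dispatcher.on("parameter_changed", lambda payload: changes.append(payload))
dispatcher.post("parameter_changed", ("speed", 2.5))
dispatcher.post("frame_done", 42)  # no handler registered: dropped
dispatcher.drain()
print(changes)  # [('speed', 2.5)]
```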
Derivative: Is it making things easier for the end user and how so?
Front Pictures: Absolutely! As Screenberry has a high-level node graph under the hood, TouchEngine is integrated into it as a node. A TouchEngine node can have TOP and CHOP inputs/outputs depending on the TouchDesigner project setup. DAT support is coming shortly in Screenberry 2.8.1. We have tried to provide users with the simplest and most intuitive pipeline possible. All you have to do is create a TouchEngine node, select a TOX file with your TouchDesigner project, and provide the names of its TOP/CHOP inputs and outputs to expose them as node pins. Just a few clicks and it works!
Video streams obtained from TouchEngine can be used for displaying as a texture, or as a composition layer. Incoming data streams can be used to provide Screenberry with information like tracking coordinates, object properties, DMX values, touch events, and any other real-time data.
It is also easy to send a video stream from Screenberry nodes to TouchEngine for processing and to get the output with the result back. The same logic applies for numerical data.
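The round trip described — feed data from a Screenberry node into the TouchEngine node, let the loaded component process it, and read the result back — can be sketched abstractly. Everything below is a toy stand-in: the frame is a list of pixel values and the component's processing is a plain function.

```python
def te_round_trip(frame, process):
    """Hypothetical stand-in for sending a frame into a TouchEngine
    node and reading the processed result back; 'process' plays the
    role of the loaded .tox component's logic."""
    return [process(px) for px in frame]

frame = [0.2, 0.5, 0.9]  # toy 'video frame' of pixel brightness values
# e.g. the component applies a 2x brightness gain, clamped to 1.0:
gained = te_round_trip(frame, lambda p: min(1.0, p * 2.0))
print(gained)  # [0.4, 1.0, 1.0]
```

The same shape applies to numerical data: values flow into the node, the component transforms them, and the result re-enters the Screenberry pipeline.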
Let's take a look at a basic project done in TouchDesigner to test the features and operation of TouchEngine in Screenberry. It was loaded in Screenberry and went through 3 weeks of non-stop operation. :)
In our opinion, TouchEngine integration is a turning point, as the symbiosis of Screenberry and TouchDesigner combines the strengths of both: the low-level node graph and generative capabilities of TouchDesigner, and the high-level paradigm of Screenberry with features like automatic projection calibration, a flexible media player, the timeline, and more.