Produced by Sydney-based Mod Productions in collaboration with the ACO, ACO VIRTUAL invites audiences to step inside a 360-degree cinematic experience of a performance where fifteen ACO musicians appear as projections around the space, their respective sound heard from the direction of their image.
A companion touch-screen app on a tablet in the centre of the space enables the audience to "spotlight sound and visuals" to highlight a musician, a section of instruments or their own mix of players. Visitors can also choose to display or hide the musical score overlay for each musician. A free mobile app extends the experience with visuals and data about each musician, their instruments, the music and composers and even tracks of the featured music.
In a nutshell, here's a rundown from Derivative's Ben Voigt, who worked closely with Mod, on how ACO VIRTUAL uses the advances in TouchDesigner 088 to set a new baseline for high-performance video delivery on a single 12-core computer with two NVIDIA Quadro K5000s:
There are up to 13 musicians per piece, and the output resolution is 960x1080 for each musician. Each musician's video is a mix of several source videos (spotlight, normal, silhouette), so up to 26 960x1080 video streams may be composited at any one time. The musical score is layered over each musician in realtime to aid the education of students and musicians.
Eight simultaneous 1920x1080 video streams are output from the Quadros. The show also has a 3D stereoscopic mode for some venues.
Mod Productions built a custom mobile app that allows viewers to remix the show and focus on their own points of interest. It communicates with TouchDesigner via OSC to control and run the show, make a musician solo, pick the instruments they are interested in hearing, change the piece of music and fade-in the score layer.
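For a sense of what those OSC cues might look like on the wire, here is a minimal, standard-library-only sketch of building and sending an OSC message. The `/aco/...` address patterns and the port number are invented for illustration; the show's actual OSC address space is not documented here:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad a byte string to a 4-byte boundary, per the OSC spec."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, *args: int) -> bytes:
    """Build a minimal OSC message carrying int32 arguments only."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "i" * len(args)).encode("ascii"))  # type tag string
    for a in args:
        msg += struct.pack(">i", a)  # big-endian int32
    return msg

# Hypothetical cues a tablet controller might send:
solo_cue = osc_message("/aco/solo", 7)    # solo musician number 7
piece_cue = osc_message("/aco/piece", 2)  # switch to piece number 2

# Sending over UDP to a TouchDesigner OSC In listening on port 9000:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(solo_cue, ("127.0.0.1", 9000))
```

In production one would normally use a tested OSC library rather than hand-encoding, but the packet layout above shows how little is on the wire for each remix cue.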
A Python back-end show management system built into the TouchDesigner ACO VIRTUAL application allows the show to run and log remotely, as well as download and update different configurations remotely for various venues and show requirements.
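A back-end like that might fetch per-venue configuration as JSON and merge it over show defaults before launch. The following is a hedged sketch, not Mod's actual API: the URL scheme, the `fetch_venue_config`/`apply_overrides` helpers and the config fields are all assumptions.

```python
import json
import urllib.request

def fetch_venue_config(base_url: str, venue: str) -> dict:
    """Download the JSON configuration for one venue from a show-management server.

    The URL layout here is illustrative only.
    """
    url = f"{base_url}/venues/{venue}/config.json"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def apply_overrides(base: dict, overrides: dict) -> dict:
    """Merge per-venue overrides (e.g. stereo mode, output count) over defaults."""
    merged = dict(base)
    merged.update(overrides)
    return merged

# Hypothetical defaults matching the specs mentioned in the article:
DEFAULTS = {"fps": 25, "stereo": False, "outputs": 8}

# e.g. a venue that runs the 3D stereoscopic mode:
venue_cfg = apply_overrides(DEFAULTS, {"stereo": True})
```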
ACO VIRTUAL is a remarkable and complex achievement that evolved over five years of extensive R&D, riding the bleeding edge of available technologies and their combination. Michela Ledwidge, artist and director of ACO VIRTUAL and co-founder of Mod Productions, graciously provides a detailed, step-by-step overview of the making of the show below. Thank you, Michela!
Michela: This is our third production shipped using TouchDesigner, but the first with 088. The inclusion of Python has been a game-changer, allowing us to re-use parts of our production toolchain and web apps.
The inception of the project was a week of research I did to look at new distribution opportunities for the ACO. We started prototyping the concept with the Unity game engine and moved to TouchDesigner as it became clear we needed more video playback capabilities than Unity provided. We were fortunate to land a couple of smaller gigs in the lead-up to ACO VIRTUAL which gave us the opportunity to evaluate the OPs (Operators) we would need for this show. I'd been tracking TouchDesigner for many years looking for the right moment to dive into it.
One of the biggest challenges for the production was maintaining sync between the performances recorded on a gigantic sound stage at Fox Studios and the real-time performance engine. For the former, sound designer Simon Lear (bsound) brilliantly mastered a set of recordings. He had to deal with the speed of sound: the on-set clap was recorded at noticeably different points depending on which mic you listened to. He then built a real-time audio processing system that can be tuned for each venue and respond to remix cues passed from our tablet 'controller stand' to TouchDesigner and then to Plogue Bidule via OSC.
With invaluable support and guidance from Ben Voigt at Derivative, I spent almost six months refining the show, starting with an extremely rudimentary experience prototype that could manage 2 FPS and ending with a cluster of three TouchDesigner instances that made heavy use of the Pro-only Sync CHOP to deliver an interactive video running at around 30 FPS, one that our orchestra partners were happy with as a representation of their performance.
Having multiple TouchDesigner instances on a show working in tandem to deliver large-scale visuals and high-end audio was a revelation. The project involved learning many new features of TouchDesigner and feeding beta-test experiences back to the Derivative team. Months were spent on performance tuning. To hit our target frame rate of 25 FPS with up to 39 videos composited at a time (on the Roger Smalley piece: 2 videos for each of 13 musicians, plus 13 scrolling scores), we relied on the GPU Affinity feature to bind a separate TouchDesigner instance to each graphics card, and on very careful mapping of cables to displays to ensure the load was spread equally across the two Quadros.
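That load-splitting can be sketched in a few lines. The round-robin assignment below is illustrative only; the show's actual instance-to-card mapping is not documented here.

```python
def assign_to_gpus(n_musicians: int, n_gpus: int = 2) -> dict:
    """Round-robin musician composites across GPU-bound TouchDesigner instances.

    Returns a mapping of GPU index -> list of musician indices, keeping the
    per-card stream counts as even as possible.
    """
    return {
        gpu: [m for m in range(n_musicians) if m % n_gpus == gpu]
        for gpu in range(n_gpus)
    }

# For the 13-musician Roger Smalley piece on two Quadros:
# GPU 0 handles 7 musicians' composites, GPU 1 handles 6.
split = assign_to_gpus(13)
```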
The project goal of producing a tour-able show made everything more challenging. So far we've staged ACO VIRTUAL in four venues, so developing effective bump-in, remote-support and bump-out processes has been critical. We made heavy use of git (version control) to manage not just the source code and project files but also wiki documentation recording troubleshooting and show-setup procedures, which has so far been a lifesaver on some of the finer points of how we've used TouchDesigner, from "don't simply drag and drop Window displays without reference" to "which graphics cards do what"!
To streamline and de-risk TouchDesigner development we opted to push a number of functions outside of that environment. The bulk of aesthetic rendering decisions that could have potentially been done real-time in TouchDesigner with shaders were handled by a fairly traditional non-real-time VFX render pipeline in parallel (Fusion & Lightwave for the musicians, After Effects for the animated score). We also shunted all audio DSP processing to a separate machine so that the audio crew could work in parallel tuning the mixes to each venue. This helped get the end-to-end system up and running on schedule so we had space to refine and polish the experience before launch.
I was acutely aware that our specification had not been tested before (e.g. the number of synced audio and video files), so we planned for worst-case performance, and it was a huge relief when we exceeded the performance benchmarks required. No doubt version two will take further advantage of real-time capabilities.
The new Python script workflow has been of great value. I was pleasantly surprised how straightforward it was to incorporate third-party Python modules, which sped up our workflow in several areas. For example, the ACO VIRTUAL data model, including mappings between each musician, the musical parts they play and their position on screen, was made available to the team via a web CMS. TouchDesigner instances, the tablet controller and the mobile companion apps for iOS and Android all had API access to the same show-control web app running on our production cloud platform, Rack&Pin. This made editorial updates and layout configuration a doddle. It can be fun zooming around TouchDesigner networks with a scroll wheel, but auto-configuration via JSON feeds no doubt saved me some repetitive strain injury. As of February 2014 we will have two ACO VIRTUAL shows on tour at the same time, all supported by our platform.
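As a rough illustration of that auto-configuration step, a JSON feed of musician/part/position mappings can be turned into a screen layout in a few lines of Python. The feed shape, field names and the `layout_from_feed` helper below are hypothetical; the real Rack&Pin schema is not described in the article.

```python
import json

# Hypothetical feed: one entry per musician, with an on-screen slot index.
FEED = json.loads("""
[
  {"musician": "violin_1", "part": "Violin I", "slot": 0},
  {"musician": "cello_1",  "part": "Cello",    "slot": 5}
]
""")

SLOT_WIDTH = 960  # each musician's video strip is 960x1080 wide

def layout_from_feed(feed: list) -> dict:
    """Map each musician entry to a part name and an on-screen x offset."""
    return {
        entry["musician"]: {
            "part": entry["part"],
            "x": entry["slot"] * SLOT_WIDTH,
        }
        for entry in feed
    }

layout = layout_from_feed(FEED)
```

In a real show the feed would come from an HTTP request to the CMS rather than an inline string; driving the layout from data like this is what removes the need to reposition operators by hand whenever the roster changes.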
All in all it was a great experience working with TouchDesigner, and I couldn't recommend more highly the support we received from Derivative. Having shipped our virtual tour system, we're now looking forward to creating new interactive experiences and storytelling opportunities that you can mess with on the fly.
More here at Mod Productions