MODE Architectural Builds AI-Driven Interactive Content Engine For GM World

In January this year General Motors unveiled a remarkable permanent experiential installation in the Atrium of its global headquarters in Detroit.
Spanning five stories, seventeen massive-scale LED surfaces are fed by an artificial intelligence-driven Interactive Content Engine (ICE) prototyped and programmed with TouchDesigner. Adding to the complexity of the deliverables was GM's mandate that the installation be capable of delivering a virtually infinite array of customized visual sequences, daily and even hourly, to keep the experience novel and topical.

On demand, the system autonomously sorts through its vast content library to select clips matching the parameters it is fed, then applies a variety of rules to artfully edit the clips together, creating a live mosaic of sequences that plays out across the 17 screens.
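
At a high level, that workflow is a select-then-sequence pipeline: filter a tagged library against the current parameters, then apply composition rules to lay the results out across the screens. The sketch below is purely illustrative plain Python, not ICE code (which was built in TouchDesigner); every name, tag, and rule in it is hypothetical.

```python
# Illustrative sketch only: a minimal select-then-sequence pipeline in plain
# Python. The real ICE was built in TouchDesigner; all names, tags and rules
# here are invented for this example.
from dataclasses import dataclass, field

@dataclass
class Clip:
    path: str
    tags: set = field(default_factory=set)   # e.g. {"winter", "truck", "detroit"}
    duration: float = 10.0                    # seconds

def select_clips(library, required_tags, count):
    """Pick clips whose metadata matches all requested parameters."""
    matches = [c for c in library if required_tags <= c.tags]
    return matches[:count]

def build_sequence(clips, screen_ids):
    """Apply a simple editing rule: round-robin the chosen clips across screens."""
    timeline = {screen: [] for screen in screen_ids}
    for i, clip in enumerate(clips):
        timeline[screen_ids[i % len(screen_ids)]].append(clip.path)
    return timeline

# Example: ask the "engine" for a winter-themed mosaic across three surfaces.
library = [
    Clip("truck_snow.mov", {"winter", "truck"}),
    Clip("family_park.mov", {"summer", "family"}),
    Clip("ev_night.mov", {"winter", "ev", "detroit"}),
]
print(build_sequence(select_clips(library, {"winter"}, 2), ["north", "east", "atrium"]))
```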

Last month in Las Vegas, MODE Studios' Chief Creative Officer Bob Bonniol was presented with the first-ever Knight of Illumination USA Disguise Award for Video Content for his contributions to GM World. We spoke to him to learn more about the creative development of the media and interactive features behind this stunning feat of problem-solving, engineering and design.

Derivative: For GM World, what did the project brief entail? 

Bob Bonniol: The brief for the installation was exceptionally large in scope: install 17 screens, with a combined raster over 28,000 pixels wide by more than 8,000 pixels high, and create compelling content to provide location, spectacle, and messaging. Even more difficult was the client mandate: GM needed the installation to serve a different experience weekly, daily, and hourly. The specific ask was to incorporate interactive variability, sampling weather data, traffic data, calendar rules, etc., and to construct a software system capable of thoughtfully producing custom sequences every day, so the experience always stays fresh and topical. The GM World space is General Motors' permanent experiential voice to the world. It is also an important gathering place for the community that surrounds it. The installation had to be able to reflect brand values and messages, but also create surprise, delight, and wonder for the people of Detroit who would gather there. All of this, ultimately, was realized in our use of TouchDesigner to create an expert system known as the Interactive Content Engine (ICE).
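
As a rough illustration of that interactive variability, the hypothetical sketch below folds live data feeds into a parameter set that could then drive clip selection. The interview only names weather data, traffic data, and calendar rules as inputs; every field name, threshold, and rule here is invented.

```python
# Hypothetical sketch: fold live data feeds into a parameter set that a
# content engine could use when choosing clips. Field names, thresholds and
# rules are invented for illustration; only the categories of input (weather,
# traffic, calendar) come from the interview.
import datetime

def gather_parameters(weather, traffic, calendar_events):
    """weather/traffic are dicts from some live feed; calendar_events is a list of strings."""
    params = set()

    # Weather rules: bias content toward the conditions outside the Atrium.
    if weather.get("temperature_c", 20) < 0:
        params.add("winter")
    if weather.get("condition") == "rain":
        params.add("indoor")

    # Traffic rules: heavy congestion at rush hour might favor commuter-themed clips.
    if traffic.get("congestion", 0.0) > 0.7:
        params.add("commute")

    # Calendar rules: special brand or model events override the usual mix.
    for event in calendar_events:
        params.add(event.lower().replace(" ", "_"))

    # Time-of-day rules keep the hourly experience distinct.
    hour = datetime.datetime.now().hour
    params.add("evening" if hour >= 18 else "daytime")
    return params

print(gather_parameters({"temperature_c": -5, "condition": "snow"},
                        {"congestion": 0.8},
                        ["EV Launch"]))
```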

Derivative: Describe the creative solution you are responsible for that visitors to GM World can now experience (please include all parties involved in conceptualizing and putting your solution into place).

Bob Bonniol: I was Co-Creative Director on the development of the overall project to create the space, and then served as Creative Director and Executive Producer for all media and interactive elements. This included video systems, audio systems, video and audio content, and the integration of audio, video, and lighting through the supervision of show control programming. MODE worked closely with the design team at Ant Farm to develop the Interactive Content Engine and to handle content authoring and shooting. Ultimately, under MODE's creative direction, Ant Farm produced eight spectacular CG sequences called Marquee Moments, which function a bit like a clock, creating a sequence that immerses the whole space every 15 minutes. Additionally, Ant Farm shot over 150 hours of footage highlighting GM's design, community, innovation, and environmental brand pillars.
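
To picture the clock-like cadence of the Marquee Moments, here is a tiny, purely hypothetical sketch that computes the wait until the next quarter-hour boundary, the kind of trigger a playback system could use to launch one of the eight sequences. The interview does not describe the ICE's actual scheduling logic.

```python
# Illustrative only: time until the next quarter-hour boundary, the kind of
# trigger that could launch a "Marquee Moment". Not actual ICE code.
import datetime

def seconds_until_next_marquee(now=None, interval_minutes=15):
    now = now or datetime.datetime.now()
    elapsed = (now.minute % interval_minutes) * 60 + now.second
    return interval_minutes * 60 - elapsed

print(f"Next Marquee Moment in {seconds_until_next_marquee()} seconds")
```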

Bob Bonniol: In our initial discussions of possible strategies for keeping the installation constantly evolving, I showed Ant Farm's Chief Creative Officer Rob Toy the power and flexibility of Derivative's TouchDesigner. Rob was immediately impressed by what we could deploy with TouchDesigner as a creative coding environment: it would allow us to build a system that needed to function outside the normal capabilities of off-the-shelf media-serving solutions. Derivative put us in touch with Keith Lostracco, who then led the coding effort for the ICE.

Derivative: With a focus on how TouchDesigner was integral, please discuss how your solution was implemented.

Bob Bonniol: Keith worked hand-in-hand with Ant Farm to develop an engine that uses an extensive rules matrix, as well as interactive inputs, to create custom sequences daily. The ICE samples from a large pool of GM media that continues to grow daily. The content in this media pool is all meta-tagged extensively, allowing the system to understand a lot of detail: car models, trim packages, the season in the shot, the presence of family, specific locations, etc. This is combined with interactive variables, as well as client inputs such as the current marketing focus or any special brand or model events. But it's not enough for the software to simply put together correctly contextualized clips - we had to 'teach' it how to assemble compositions thoughtfully. Some clips look great on a particular screen (the screens have different aspect ratios) but not as good on others. Some clips look great spanning several screens, while others do not. It was a process of trial and error to put the systems in place that let the ICE make these decisions autonomously.
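
One way to picture those "taught" composition rules is as a fit score layered on top of the metadata filter: every candidate clip is scored against each screen's geometry before the engine commits to a layout. The sketch below is hypothetical; the screen dimensions, tags, and weights are invented, and the real decision logic lives inside the TouchDesigner-based ICE.

```python
# Hypothetical sketch of per-screen fit rules layered on top of metadata
# filtering. Screen geometry, tags and weights are invented for illustration.
SCREENS = {
    "tower":  {"width": 1080, "height": 7680},   # tall, portrait surface
    "ribbon": {"width": 9600, "height": 540},    # long, ultra-wide surface
    "atrium": {"width": 3840, "height": 2160},   # conventional 16:9 wall
}

def fit_score(clip_tags, clip_aspect, screen):
    """Score how well a clip suits a given screen's aspect ratio and role."""
    screen_aspect = screen["width"] / screen["height"]
    # Penalize large aspect-ratio mismatches; reward clips tagged as spannable
    # or vertical when the target surface is extreme in either direction.
    score = 1.0 / (1.0 + abs(clip_aspect - screen_aspect))
    if screen_aspect > 4 and "panoramic" in clip_tags:
        score += 0.5
    if screen_aspect < 0.5 and "vertical" in clip_tags:
        score += 0.5
    return score

def assign(clip_tags, clip_aspect):
    """Pick the screen where this clip scores best."""
    return max(SCREENS, key=lambda name: fit_score(clip_tags, clip_aspect, SCREENS[name]))

print(assign({"panoramic", "truck"}, 32 / 9))   # an ultra-wide clip lands on the ribbon
```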

Derivative: Any additional details on the use of TouchDesigner, and/or insider tips on where the use of our technologies especially shines through?

Bob Bonniol: In addition to the amazing power and flexibility that TouchDesigner gave us in defining the FUNCTION of the system, it also made it intuitive to create a UI for its continued use. Obviously this was something we had to hand off to a client who needed the system to be stable and straightforward to use - and to grow with. Keith did an amazing job of both structuring the system and making it usable and malleable. I'm not sure how we would have created such a powerful and flexible system without TouchDesigner!
