"Using TouchDesigner, a real-time node-based 3D and visual composition software, we created a virtual "M" that can represent the sequenced light patterns exactly as our physical "M" can, with the added ability to display video on the face of the "M". By compositing the 3D generative "M" with fully rendered scenes and camera movement created by our "Metropolis" teaser artist Scott Pagano, we created an engine that allows us to zoom in and out of the visual world displayed on the face of the "M" in real time.
Using video to create visual sequences adds an incredibly powerful new dimension to our live show. In this example, I've created a video remix of the music video for our track "Tiny Anthem", re-envisioning it in the context of Shinichi Osawa's frantic remix of the track. After chopping the music video into numerous clips, I load them into Resolume Avenue and "sequence" them into a cohesive narrative from Ableton via OSC. By automating clip speed, video effects, and masking in Ableton, I can create synchronized video mixes that expressively represent the audio in nearly limitless ways. And by mapping the video onto the face of our virtual "M" in TouchDesigner, I can zoom in and out of the "M" in real time, creating a truly immersive visual experience that more fully represents our sound and our approach of pushing creative technology as far as we can.
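To make the Ableton-to-Resolume link concrete, here is a small stand-in sketch in Python: in the actual rig the OSC comes out of Ableton (typically via a Max for Live device), but a script playing timed cues shows the same message flow. It needs the python-osc package (`pip install python-osc`). The port (7000, Resolume's default OSC input) and the address patterns, which follow Resolume Avenue 4's layout, are assumptions; check Preferences > OSC and your version's OSC map.

```python
import time
from pythonosc.udp_client import SimpleUDPClient

# Resolume listening for OSC on this machine (default input port 7000).
client = SimpleUDPClient("127.0.0.1", 7000)

# A toy "sequence": (cue time in seconds, layer, clip) triples standing in
# for the chopped-up music-video clips arranged on Ableton's timeline.
SEQUENCE = [(0.0, 1, 1), (2.0, 1, 2), (4.0, 2, 1), (6.0, 1, 3)]

start = time.time()
for cue_time, layer, clip in SEQUENCE:
    # Wait until this cue's time, then trigger the clip.
    time.sleep(max(0.0, cue_time - (time.time() - start)))
    client.send_message(f"/layer{layer}/clip{clip}/connect", 1)
    # Automate a parameter alongside the trigger, the way an Ableton
    # envelope would -- here, the layer's video opacity (0.0-1.0).
    client.send_message(f"/layer{layer}/video/opacity/values", 0.8)
```

In practice the timing comes from Ableton's transport rather than a sleep loop, so the clip triggers and parameter envelopes stay sample-locked to the audio.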
See also: The M Machine | Nest HQ