Here we look at our creative approach and some of the technical challenges faced while working on a production of this scale and complexity.
NVIDIA AND DERIVATIVE
Rush approached us in April 2002 to contribute to the visuals for their upcoming tour. Among the band members, it was primarily Geddy Lee who was driving the look of the show, though Alex Lifeson made sure we didn't get too serious. The lighting design was orchestrated by show director Howard Ungerleider of Production Design International, a veteran Rush collaborator who has worked with the band for the past 29 years. Toronto’s Spin Productions had also made several videos for past Rush tours, so together we formed a four-way team.
From the start it was clear that 3D animation with Touch would be a synergistic fit with the rich palette of Rush's music, lighting, lasers and pyrotechnics. Geddy was especially intrigued by some of the abstract visuals that we designed and performed for the electronic dance music scene, and he was ready to channel them into the band’s vision of their show. We undertook the task of maintaining the visuals' abstract nature, while modifying them to better match the themes, lyrics and rock-music format of Rush’s songs.
Geddy Lee explains why he chose Touch for the Vapor Trails tour:
"I'm in love with this concept, it's an innovative new technology that allows us to create a visual environment for our music that "evolves" as the tour progresses. Meaning, the more the operator understands the nuances of our music the more he can improvise and re-shape the visuals, both rhythmically and creatively, making the show different every night!"
A team of four animators developed the live 3D visual synthesizers, and Los Angeles-based artist James Ellis of Secret Sauce was chosen to head out on the road with Rush and perform the visuals live. The Vapor Trails tour is Rush’s first in five years and took the band through more than 40 cities including 50,000-fan shows in Mexico and Brazil.
Touch visuals are a potent accompaniment for music that is performed live. While Touch animations can be very structured to precisely match the cues and particular sections of the music, they are also open enough to allow for improvisation within that controlled structure, meaning that there are visual variations from night to night when a particular animation is performed alongside a song. And unlike edited video, the Rush visuals are continually being revised and improved between all shows.
The Touch synths were displayed on a 40 by 15 foot LED screen behind the band, an aspect ratio of 8x3 – twice as wide as a traditional TV screen. To accommodate the widescreen projections, we built the Rush synths with an image resolution of 1200 x 450 pixels on our Dell laptops. The laptop's 1280 x 1024 images were then fed to a Folsom ViewMax scan converter where they were cropped and squeezed into a conventional composite video signal. The video image was stretched back to 8x3 at the LED screen.
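As a rough sketch of the crop-and-squeeze math above, the snippet below computes the horizontal squeeze factor involved; the 640 x 480 composite frame size is an illustrative assumption, not a figure from the production.

```python
# Hypothetical sketch of the anamorphic "crop and squeeze" step.
# The 640 x 480 composite frame is an assumed stand-in resolution.

def squeeze_to_composite(src_w, src_h, dst_w, dst_h):
    """Horizontal squeeze factor needed to fit a widescreen image
    into a narrower frame: the ratio of the two aspect ratios."""
    return (src_w * dst_h) / (src_h * dst_w)

# 8:3 source into a 4:3 frame -> squeezed 2x horizontally,
# then stretched back out by the same factor at the LED screen.
print(squeeze_to_composite(1200, 450, 640, 480))  # 2.0
```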
James Ellis’ road kit included two high-performance 3D-enabled Dell/NVIDIA laptops, both loaded with the 11 synths and running a boosted TouchMixer version 009. Two laptops were required so that switching between synths/songs takes no more than a second. For backup and reference purposes we recorded the entire visual performance to DVD. In an emergency, we could switch over to a prior DVD recording, but this would only happen if a laptop were to completely break down during a show (which didn't occur), and even then it would be a temporary measure, since a third backup laptop could be swapped in within roughly five minutes.
Two separate MIDI control devices were attached to each Touch laptop: a slider/button box and a 25-key musical keyboard. For the slider/button box we chose the ‘MotorMix’ from CM Labs (approx. $800 US). What we like most about this controller is its tight integration with the Touch software. Its eight sturdy servo-driven sliders are kept perfectly in sync with the software’s on-screen sliders, meaning that you can use either set of faders and always know that the position of the real and virtual sliders will match, even when playing back pre-recorded gestures. The MotorMix switches seamlessly between up to 8 banks of 8 slider controls.
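The bank-switched fader routing described above can be sketched roughly as follows; the parameter names and the normalization to 0..1 are illustrative assumptions, not the actual Touch/MotorMix mapping.

```python
# Minimal sketch of bank-switched fader mapping, in the spirit of the
# MotorMix setup (8 banks of 8 sliders). Parameter names are made up.

class FaderBanks:
    def __init__(self, banks):
        self.banks = banks          # list of banks, each a list of parameter names
        self.active = 0             # currently selected bank

    def select_bank(self, index):
        self.active = index % len(self.banks)

    def on_fader(self, fader_index, value_0_127):
        """Route a hardware fader move (MIDI CC value 0-127) to the
        parameter in the active bank, normalized to 0..1."""
        param = self.banks[self.active][fader_index]
        return param, value_0_127 / 127.0

banks = FaderBanks([["cam_x", "cam_y", "zoom", "blur",
                     "strobe", "hue", "speed", "bright"]] * 8)
banks.select_bank(2)
print(banks.on_fader(3, 127))   # ('blur', 1.0)
```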
We often found it useful to divide a song into up to 16 parts, and jump to these parts using the MotorMix’s preset buttons. However, even with this number of possible triggers we found it necessary to add a second MIDI device for certain animation sequences, like the camera shakes and fluid character movement in the song ‘Leave That Thing Alone’. For this further control we added the MidiMan Oxygen 8 25-key musical keyboard – in our opinion, nothing beats the ergonomically-refined feel of striking real keys on an actual musical keyboard. Check the gear layout and gear pic below.
The number of controllable parameters available to a Touch artist can be mind-boggling. In Jim’s case, he’s controlling everything from 3D movement, shapes, characters and illumination to virtual cameras, light strobing and image blurs.
A key question for us has been how much control is too much? At what point does the performer get overwhelmed by the options before them, causing the performance to suffer?
We found that pre-recording certain moves and presets adds a margin of safety to a performance. Certain portions of a synth are triggered by a one-button push, leaving the artist free to manipulate key animation elements over top of pre-built sequences. The option for a manual override is always present: Should the musicians depart from their expected path, the performer is able to seize full control and fluidly fall into step with the new direction.
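One way to picture the manual-override behaviour is as a crossfade from the recorded gesture to the live fader once the performer grabs the control; this scheme is a guess at the idea for illustration, not Touch's actual internals.

```python
# Sketch of a "manual override": play back a pre-recorded gesture, but
# let a live touch of the fader take over via a short crossfade.
# The blending scheme here is an assumption.

def control_value(recorded, live, live_touched, blend):
    """recorded/live are 0..1 values; blend ramps 0 -> 1 after the
    performer grabs the control, crossfading playback into live."""
    if not live_touched:
        return recorded
    return (1.0 - blend) * recorded + blend * live

# Playback only:
print(control_value(0.4, 0.9, False, 0.0))   # 0.4
# Performer grabs the fader, halfway through the crossfade:
print(control_value(0.4, 0.9, True, 0.5))    # 0.65
```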
Touch has the broadest range of control of any 3D visual creation tool on the market today. We’re often asked if our visuals are driven by the actual audio signal itself. Although Touch can track audio signals, for our work we choose not to use the music's audio signal: the ears and mind of the Touch performer are far more responsive, interpreting and anticipating the music rather than simply reacting to audio levels the way a machine algorithm would.
The Touch architecture allows for four different approaches to animation performance. Synths can be animated via keyframing, where objects and characters are posed at certain times and the computer in-betweens them. A second approach is to perform the synth live in the studio and record the gestures, which are then played back during the show at different speeds and with different overrides applied.
Alternately, an artist may choose to jam in a live, free-form fashion with just a few automated parameters like beat-tracking. Lastly, there is the option to render a pre-performed synth out as a QuickTime movie and play it back as a linear video clip. All these approaches were combined in the Rush project.
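The keyframing approach can be illustrated with a minimal in-betweening routine; linear interpolation is used here for simplicity, though a production tool would offer smoother curves.

```python
# Minimal sketch of keyframe in-betweening: pose values at given times
# and let the computer interpolate the frames in between.

def inbetween(keys, t):
    """keys: sorted list of (time, value) pairs; returns the
    linearly interpolated value at time t (clamped at the ends)."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return v0 + f * (v1 - v0)
    return keys[-1][1]

poses = [(0.0, 0.0), (2.0, 1.0), (4.0, 0.0)]   # a 4-second rise and fall
print(inbetween(poses, 1.0))   # 0.5
print(inbetween(poses, 3.0))   # 0.5
```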
Below are screen shots from TouchMixer of the Rush synths for Leave that Thing Alone, Natural Science and Red Sector A.
RED SECTOR A - PRODUCTION NOTES
The shadow characters in Red Sector A were designed by Paul Simpson of Realise Studio and are based on a section of the movie Baraka featuring the Balinese monkey-chant ritual.
The movement of Red Sector A's characters was pre-keyframed in 2-4 second repeating loops. Referring to Baraka, we keyframed 15 distinct moves and assigned them to 15 MotorMix buttons. Pressing the buttons cross-dissolves between the 15 moves, allowing the characters to be choreographed to the music. One MIDI slider adjusts the overall speed of the characters to match the song's variable tempo. Another set of 16 buttons cut between 16 cameras, and each camera can be moved between two positions with a slider.
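The cross-dissolve between keyframed loops can be sketched as a weighted blend of two poses over the fade; the pose representation and fade length here are illustrative assumptions.

```python
# Sketch of cross-dissolving between two keyframed loop poses after a
# cue button press at t = 0. The easing (linear) is an assumption.

def crossfade_pose(pose_a, pose_b, t, fade_len):
    """Blend two poses (dicts of joint angles in degrees) over
    fade_len seconds, clamping the blend weight to 0..1."""
    w = min(max(t / fade_len, 0.0), 1.0)
    return {j: (1 - w) * pose_a[j] + w * pose_b[j] for j in pose_a}

a = {"arm": 0.0, "head": 10.0}
b = {"arm": 90.0, "head": 0.0}
print(crossfade_pose(a, b, 0.25, 0.5))   # {'arm': 45.0, 'head': 5.0}
```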
We composed 40-second sections of the song by recording the button presses and speed control. These 40-second sections were assigned to 8 other cue buttons on the MotorMix corresponding to 8 sections of the song. During the show, the VJ can press a cue button to start a section, then override the controls, adjust lighting or articulate further nuances on top.
The red backgrounds were QuickTime movies that were pre-performed and pre-rendered from another synth derived from Mordka's Vrek. Despite the complexity of what you see, it's being performed live and each show is different.
NATURAL SCIENCE - PRODUCTION NOTES
Because the Natural Science synth was built from the ground up, it was designed around each of the song’s cues: the VJ can simply hit each button in succession on the MIDI controller. There are 4 main sections in Natural Science with a total of 16 cue points. The first section is a cloudy ambient scene, for which the VJ has a few extra controls for color, brightness and speed.
The second section consists of some DNA strands and a bright sunburst. The third cue initiates a strobe effect on the sunburst scene, used to accentuate a particular element in the song, and lasts only a few seconds.
The fourth cue initiates the tunnel sequence. During this sequence the VJ has a few controls. One of particular interest is a blue warping effect that happens when Geddy Lee’s voice is also warping due to some sound effects. Other controls include intensity, speed and amplitude of the tunnel curves.
The fifth cue fades to a solid color customizable by the VJ. The sixth cue initiates the geometric kaleidoscope sequence, which consists of 10 different patterns that can be re-sequenced by the VJ live.
Overall, the Natural Science synth is a great example of a tight cue-based synth that gives the VJ a minimal amount of control for added expression while keeping the scene changes locked to a predetermined path. Playing this one is like riding a surfboard that you can't fall off.
GHOST RIDER - PRODUCTION NOTES
A song from Rush’s new album Vapor Trails, ‘Ghost Rider’ draws part of its inspiration from a year-long motorcycle trip that drummer Neil Peart recently took. The visuals for this song were a good example of how a Touch synth can be performed in the studio, rendered out to a QuickTime movie, and edited into a video clip which is played behind the band.
Norm Stangl of Spin Productions said,
"Though Touch is touted as a performance tool for live shows, I see it as a valuable tool for post-production. For Ghost Rider, Farah Yusuf took 20 still images and gave them a nice controllable soft motion in a TouchDesigner synth.
She performed it to sections of the band's rehearsal takes of Ghost Rider. But then we rendered the OpenGL-shaded synth to QuickTime, and edited it in with some live footage we shot in the desert. We divided the result into four sections that are triggered from a Catalyst by Whole Hog lighting cues and played during four different parts of the song."
LEAVE THAT THING ALONE - PRODUCTION NOTES
This song features two animated characters, Zoop and his girl Shoop. The two characters and their backdrop are manipulated by dozens of performed controls. The controls for each of the two characters include their position and rotation, their 5 facial expressions, and the intensity of their bobbing (Zoop and Shoop are squashed and stretched to the beat). The backgrounds are also controllable: two sets of 6 deforming textures are mapped onto 6 sides of a cube. There are also lighting controls (color and strobing), and camera controls like camera shakes and TV zaps. In total, this synth had 33 controls.
We quickly realized that 33 controls is beyond the limit of one performer, so we decided to pre-perform some of the controls in layers (1-6 controls per layer), and then leave some controls open to be performed during the show. The height of head bobbing, for instance, was pre-performed to follow the intensity of the music over the course of the song.
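A rough model of layering pre-performed controls under a handful of live ones might look like this; the control names and the rule that live input wins are assumptions for illustration.

```python
# Sketch of resolving layered pre-performed controls plus live overrides.
# Control names and the merge rule are illustrative assumptions.

def resolve_controls(prerecorded_layers, live_overrides, t):
    """prerecorded_layers: list of dicts mapping control name -> curve
    (a function of time). live_overrides: dict of control name ->
    current live value. Live input wins; otherwise playback is used."""
    values = {}
    for layer in prerecorded_layers:
        for name, curve in layer.items():
            values[name] = curve(t)
    values.update(live_overrides)       # performer's live moves take priority
    return values

layers = [{"bob_height": lambda t: 0.5 + 0.5 * (t % 1.0)},   # pre-performed
          {"bg_scroll": lambda t: t * 0.1}]
live = {"camera_shake": 0.8}                                  # played live
print(resolve_controls(layers, live, 0.5))
# {'bob_height': 0.75, 'bg_scroll': 0.05, 'camera_shake': 0.8}
```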
The last part of the song brings on a proliferation of baby Zoops and Shoops. Fifty characters with dozens of controls each would have been prohibitive, and besides, could not have been displayed at 30 frames per second.
To multiply the babies, our head of R&D, Rob Bairos, developed a novel technique where we render the baby characters off-screen on the NVIDIA graphics card at 256 x 256 pixels each, and keep the last 30 drawn images of each baby in graphics memory.
Then on-screen we draw the babies on 50+ rectangles, randomly choosing from this 1-second time-history of the two baby characters, giving the impression they are all different. This off-screen recursive-rendering technique is released with Touch 012, opening up a family of new realtime effects.
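The time-history trick can be sketched as a ring buffer of recent frames from which each on-screen clone samples independently; here plain frame numbers stand in for the textures that Touch keeps in graphics memory.

```python
# Sketch of the off-screen time-history technique: keep the last 30
# rendered frames of a character, then draw each on-screen clone from a
# randomly chosen recent frame so the clones appear to move independently.
import random

HISTORY = 30   # one second of frames at 30 fps

class FrameHistory:
    def __init__(self):
        self.frames = []

    def push(self, frame):
        self.frames.append(frame)
        if len(self.frames) > HISTORY:
            self.frames.pop(0)   # drop the oldest frame

    def random_frame(self):
        return random.choice(self.frames)

history = FrameHistory()
for frame_number in range(100):          # simulate rendering 100 frames
    history.push(frame_number)

# Draw 50 clones, each sampling its own point in the 1-second history:
clones = [history.random_frame() for _ in range(50)]
assert all(70 <= f <= 99 for f in clones)   # only the last 30 frames survive
```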
Derivative’s collaboration with Rush pushed animation technology to a new level. Howard Ungerleider, one of the world's top show lighting designers with 30 years of designing and programming heavily-sequenced lighting set-ups, gave us technical and artistic insight in lighting-animation synchronization.
Rush has always taken an innovative approach to their lighting and visuals. Derivative's founder, Greg Hermanovic points out:
"For the 2002 Vapor Trails tour, Rush tapped into Touch's state-of-the-art capabilities. While developing visuals, Touch gave us more creative choices - once the visuals were designed for a song, we could easily play the controls and experiment with more variations with the band, settling on what's best.
During the show, the live element makes the visuals more expressive and enhances the feeling of the band's music. And the visual performer keeps in step with the music each time a song is played live.
While Rush is touring, the visuals continue to evolve under their creative direction. Existing visuals can be cross-bred and taken in a new direction without starting all over. From an economic standpoint, this was an efficient way to get the most content under a tight production deadline.
Our goal with the Touch products is to make live visuals easy to produce, easy to perform, easy to evolve and easy to combine with other tools. Doing challenging projects like the Vapor Trails tour sets our sights on what tomorrow's visual producer will need before they start asking for it."
Learn more about RUSH at rush.com