
Rush 30th Anniversary Tour

RUSH PUTS DERIVATIVE'S SECRET TOUCH ON 30th ANNIVERSARY TOUR

Derivative produced custom live visuals for the Rush 30-year anniversary tour this summer. Rush played 55 shows in North America, the UK, Germany and other European cities.

Touch synths for 12 songs were created by Touch artists Ben Voigt, Farah Yusuf and Greg Hermanovic. Derivative brought on Markus Heckmann to travel with the band and VJ for the tour.

Heckmann performs the visuals live, alternating between two Alienware Area-51m Extreme laptops, each running TouchMixer with a MotorMix MIDI slider/button box and an Oxygen8 MIDI keyboard. The high-resolution image from TouchMixer is fed from an Area-51m to the triple-width LED screen behind the band.

DERIVATIVE VISUAL SOFTWARE

Derivative makes Touch Tools, the most powerful animation software for authoring and performing live visuals for a wide range of users, including installation artists, architects, stage designers, VJs, researchers, web 3D developers, educators and anyone who wishes to play and manipulate live interactive 3D visuals. Derivative's production division is available to companies who wish to tap Derivative's expertise for special projects.

RUSH 2004 SONGS AND INTERVIEW WITH VJ MARKUS HECKMANN

by Ted Miller

Between the Wheels

Ted: Markus, give me a little background about you and Derivative.

Markus: I’m a freelancer and I started as an intern at Derivative while I was a student in Germany. I was selected to be the VJ for the current 45-city Rush tour. Derivative makes Touch, which is an authoring tool for creating real-time art. Derivative gives artists who create and perform live visuals an open and flexible real-time animation tool without forcing them to do any programming. Touch is the main tool used in the tour.

Subdivisions

TM: What is a Touch synth and who is Touch made for?

MH: Touch is the name of Derivative's family of three products used to author and perform live visuals. They are used by installation artists, VJs, architects, stage designers, researchers, web 3D developers, educators and are designed for people who play and manipulate live, interactive 2D and 3D visuals.

A Touch “synth” is a 3D animation rendered in real-time (each image is generated in 1/30 second) that can be performed live or edited while it is running. Synths can include deforming 3D models and video clips as textures. With Touch the process of creation never ends since you are altering the look and feel of your animation while performing. Touch allows you to control every parameter of the animation, giving you limitless combinations.
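
As a rough illustration of that idea (not Touch's actual interface, just a minimal Python sketch with invented parameter names), the loop below regenerates its "image" every 1/30 second while its parameters stay live, so they can be changed at any moment:

import math
import time

# Hypothetical live parameters; the point is that they stay editable
# while the loop keeps rendering.
params = {"rotation_speed": 0.5, "scale": 1.0}

def render_frame(t, p):
    # Stand-in for drawing deforming 3D geometry; here we only compute values.
    angle = (t * p["rotation_speed"] * 360.0) % 360.0
    size = p["scale"] * (1.0 + 0.25 * math.sin(t * 2.0 * math.pi))
    return angle, size

start = time.time()
for frame in range(90):                              # about 3 seconds at 30 fps
    t = time.time() - start
    angle, size = render_frame(t, params)
    if frame == 45:
        params["scale"] = 2.0                        # "performing": edit while it runs
    if frame % 30 == 0:
        print(f"frame {frame}: angle={angle:.1f} size={size:.2f}")
    # Sleep just long enough to hold the 1/30-second frame budget.
    time.sleep(max(0.0, (frame + 1) / 30.0 - (time.time() - start)))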

For the Rush tour we use two of the three Touch products: the design team used TouchDesigner for authoring the real-time animations (synths), and during the shows I use TouchMixer for performing the synths with my input devices. TouchMixer is an enhanced version of TouchPlayer, the free synth player.

In the Rush show we have one synth (artwork) for each of the 12 songs we perform live. A song’s synth contains all the textures, 3D shapes, pre-keyframed triggerable motion, control panels, lighting setups and presets that I need for that song. That’s all prepared beforehand using TouchDesigner. We carefully set up the scenes for each section of a song so that I can press a preset button and jump to those settings. But we leave some parameters open that I will improvise live on the MIDI slider/button box, mouse or Oxygen8 music keyboard controller.
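
A hedged sketch, in Python, of how that split between preset buttons and "open" parameters might look. The preset values and the MIDI controller number are invented for illustration; the show's real mappings are not given in this article:

# Invented preset values and CC numbers, purely for illustration.
presets = {
    "verse":  {"camera_dist": 8.0, "fog": 0.2, "layer_mix": 0.1},
    "chorus": {"camera_dist": 3.0, "fog": 0.6, "layer_mix": 0.9},
}

live = {"camera_dist": 8.0, "fog": 0.2, "layer_mix": 0.1, "strobe_rate": 0.0}

def recall_preset(name):
    # A preset button jumps a whole group of parameters to stored settings.
    live.update(presets[name])

def on_midi_cc(cc, value):
    # One slider (hypothetical CC 7) stays "open" for live improvisation.
    if cc == 7:
        live["strobe_rate"] = value / 127.0 * 10.0   # 0 to 10 flashes per second

recall_preset("chorus")                              # cue point in the song
on_midi_cc(7, 96)                                    # improvised slider move
print(live)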

Bravado

TM: How do you prepare the visuals of the Rush Tour?

MH: We started with recent versions of songs that the band was rehearsing for the tour. The four Touch artists creating the synths met with the band and the lighting director, Howard Ungerleider, to hear some initial ideas from them, such as what the songs bring to mind visually. The nice thing here is that the band’s music and lyrics have abstract underpinnings, and their taste carries through into the style of the visuals. This gives the Touch synth designers a great degree of freedom in choosing how to complement the music. We also presented some new animation techniques and recent independent works-in-progress to see if they could be adapted into the show.

There is a bit of hands-on with Touch by the band and lighting director. In our frequent meetings they sometimes take control of TouchMixer and perform the synths themselves to share ideas of how it could be performed in the show.

To create each synth, the Touch artists start in TouchDesigner by creating basic 2D and 3D geometric objects and assigning materials and textures to them. Up to that point it is the regular workflow in any 3D environment; this is where the real-time part comes in.

Summertime Blues

There are several ways we animate in Touch. Keyframing is the most common in 3D, but in Touch we build libraries of keyframed animation and in the performance we trigger, blend and time-stretch them. More often we leave some parameters attached to sliders so we can completely perform them live. Some movement is mathematically produced with CHOPs (channel operators), using waveforms like sine waves, pulses, strobes and noise functions. All these motion types can be combined and further recorded in “segments”, which can be played back or triggered later, so you can have recordings of recordings. We do all of these things in the Rush show.
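
Outside Touch, the same channel-operator idea can be sketched in a few lines of Python: keyframed curves and waveforms like sine, pulse and noise are just streams of samples that can be blended, and the blended result can itself be recorded as a segment. The names and values below are illustrative only:

import math
import random

RATE = 30                                            # one sample per frame

def sine(t, freq=0.5, amp=1.0):
    return amp * math.sin(2.0 * math.pi * freq * t)

def pulse(t, period=1.0):
    # Another waveform type named above; unused in this tiny example.
    return 1.0 if (t % period) < (period / 2.0) else 0.0

def noise(_t, amp=0.3):
    return random.uniform(-amp, amp)

def keyframed(t, keys=((0.0, 0.0), (1.0, 1.0), (2.0, 0.0))):
    # Piecewise-linear interpolation between (time, value) keyframes.
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keys[-1][1]

def blend(a, b, mix):
    return a * (1.0 - mix) + b * mix

# Record two seconds of a blended channel into a "segment" that could later
# be triggered, time-stretched or re-recorded.
segment = [blend(keyframed(f / RATE), sine(f / RATE) + noise(f / RATE), 0.5)
           for f in range(2 * RATE)]
print(len(segment), "samples recorded")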

Next we optimize the Touch synths to meet our target of 30 frames per second. You can put any number of objects, motions, simulations, textures and movies in a Touch synth, so we spend time optimizing it to run as fast as possible. The Alienwares make this a lot easier as they are highly-tuned, high-performance machines – right out of the box.

Once we hit rehearsals, we have the stage lighting, the band playing, and the pyro and lasers to integrate with, so the whole stage team goes through each part of each song, tuning up the total look.

Mystic Rhythms

TM: What are you performing live and what is prepared?

MH: All twelve synths for the twelve songs are performed live, and each song has different things I am performing. While some synths have presets to instantly change the graphics at certain cue points, others are performed completely freely on the MIDI keyboard and slider boxes.

For some songs we are using pre-animated camera movements, which work like effect drop-ins. The nice thing about that is that although the move might be pre-programmed, everything else stays interactive. There is no point where I have to wait for a move to finish before I can adjust parameters - I can always tweak or change the look.

Some synths are more pre-scripted than others. In the song 2112, for example, the movement of the UFOs as they fly into the foreground is pre-animated. I just choose when to initiate that movement. When beaming up the heads of the three band members, the beams are tracking Geddy’s and Alex’s movement on the stage because I have one slider for Geddy’s horizontal stage location and one for Alex’s. I follow them around so the virtual beams land behind them. You can’t do that with pre-edited video clips.
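
A hedged illustration of that beam-tracking idea, in Python: one slider per band member, scaled from the controller's 0-127 range to a stage coordinate so the rendered beam can be steered to land behind the performer. The stage width and CC numbers below are assumptions, not the show's real values:

STAGE_WIDTH_M = 16.0                                 # assumed stage width in metres

def slider_to_stage_x(cc_value, width=STAGE_WIDTH_M):
    """Map a 0-127 slider value to a position across the stage, centred at 0."""
    return (cc_value / 127.0 - 0.5) * width

beam_targets = {"geddy": 0.0, "alex": 0.0}

def on_midi_cc(cc, value):
    if cc == 20:                                     # hypothetical slider for Geddy
        beam_targets["geddy"] = slider_to_stage_x(value)
    elif cc == 21:                                   # hypothetical slider for Alex
        beam_targets["alex"] = slider_to_stage_x(value)

on_midi_cc(20, 32)                                   # Geddy moves stage left
on_midi_cc(21, 110)                                  # Alex moves stage right
print(beam_targets)                                  # beam x-positions in metres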

Red Sector A

TM: What kind of capacity and speed do you need from your mobile computers? And in what way are you pushing the CPU and graphics cards?

MH: The show setup is a 3 GHz Alienware Area-51m with 1 GB RAM, a GeForce FX 5200 with 128 MB of memory and a fast 7200 RPM 80 GB hard drive. We use the second monitor output exclusively for the image that goes to the stage. All my on-screen control panels and a preview image stay on the main screen (the Area-51m's LCD).

What do I need? Whatever you can give me! Well, we don’t need excessive disk space or system RAM, as Touch generates most of its images on the fly.

But we depend heavily on CPU performance and graphics performance and the amount of RAM on the graphics card. Some synths are CPU-hungry since they do a lot of simulation before drawing graphics. Some are texture-heavy and have up to 18 transparent layers of images being rendered in 1/30 second, which was unthinkable 5 years ago, even on workstations. Other synths use 2+ streams of video coming from disk, which pushes hard on the disk I/O and pathway to the graphics RAM.
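
Some back-of-the-envelope arithmetic shows why 18 transparent layers per frame is demanding: every layer touches every pixel of the output once. The output resolution in this Python snippet is an assumed figure for illustration; the article does not give the actual resolution of the screen feed:

# Assumed output resolution, for illustration only.
width, height = 1024, 768
layers = 18
fps = 30

pixels_per_frame = width * height * layers
pixels_per_second = pixels_per_frame * fps
print(f"{pixels_per_frame / 1e6:.1f} Mpixels blended per frame")
print(f"{pixels_per_second / 1e6:.0f} Mpixels blended per second")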

Red Barchetta

TM: What are the Alienware systems being used for?

MH: I use three Alienware Area 51m’s for the twelve Touch songs in the show. From song to song, I alternate between two Area 51m’s. For back-to-back songs it is necessary to have two Area 51m’s running, so a Touch synth is ready to go while I am performing on the other Area 51m. Switching between songs becomes fast as I don’t have to worry about start-up times of synths that have a lot of texture maps.

The third Alienware serves as a backup system during the show. It contains video clips of our visuals for each song, recorded into QuickTime movies. In case our MIDI gear malfunctions or we lose one of the other Area 51m’s, we can trigger these video clips during a song, though of course it’s not interactive.

Secret Touch

TM: What are the challenges of each of the 12 songs you are performing?

MH: The biggest challenge is finding the balance between having too few live controls and having too many things to control in a song. You can get overwhelmed if you have to operate too many controls live. You want to isolate and use the controls that give the most expression in a song. You can make and use presets to help with this. As the tour progresses, I perform some songs more live as I get better at it. I do the song Xanadu completely live now. Other songs become less live to ease my workload while I’m performing.

Another challenge is catching the vibe of the songs and reflecting that in your performance with Touch. It's not just about beat-matching; it's a lot about the mood shifts of the song. You have to try to put that into the performance and make people feel like: “that's not just a screen saver, it actually fits the music and it’s actually interpreting it”. While doing so, it's still important to watch the rest of the stage lighting. The result should be a performance that feels like it comes from a single sensibility.

When you are developing visuals on a high-resolution color monitor, it is difficult to imagine what they will look like on stage on the multi-section LED screen, which has a lower resolution and highly visible pixels, and is partially occluded by smoke and lighting. Also, though LED screens are extremely bright, unlike film they give poor contrast in dark images, so we’re always fighting that limitation – you can’t be too subtle.
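
One common way to fight crushed shadow detail on an LED wall (a sketch of the general technique, not necessarily what the Rush team did) is to lift the black floor and gamma-correct pixel values before they go to the screen. The lift and gamma values below are illustrative:

def lift_for_led(value, lift=0.08, gamma=0.8):
    """Map a 0-1 pixel value: raise the black floor, then brighten midtones."""
    lifted = lift + (1.0 - lift) * value
    return lifted ** gamma

for v in (0.0, 0.05, 0.2, 0.5, 1.0):
    print(f"{v:.2f} -> {lift_for_led(v):.2f}")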

La Villa Strangiato / 2112

Photos provided by Andrew McNaughton, Kevin Hughes

Rush tour lighting design + lasers provided by Howard Ungerleider of Production Design International

///

Learn more about RUSH at rush.com
