MEETING OF THE MINDS - SCOTT PAGANO / ACHIM KERN / JÖRG UNTERBERG

 


When I heard Scott Pagano was using his custom-made TouchDesigner 3D / 2D mixer at TED Talks and then at the Berghain in Berlin with Speedy J, I summoned two of our most prolific and helpful TouchDesigner users, Achim Kern and Jörg Unterberg, to hook up with Scott and get some insight into Pagano’s approach, means and methods from the point of view of TouchDesigner users. Here’s what transpired.

Greg Hermanovic, Derivative



MEETING OF THE MINDS

by Achim Kern

July 30, 2009 marked the second big deployment of Derivative’s amazing TouchDesigner software in Germany. Initially featured at an audiovisual marathon at club Transmediale back in February, this time filmmaker, motion designer, and spatial reconstructionist Scott Pagano used TouchDesigner for his and Jochem Paap’s (Speedy J) amazing Umfeld project at the notorious Berghain club in Berlin. As the only audio-visual performance that night, it left the audience all the more mesmerized by the three-hour set featuring Scott’s abstract artworks which, together with the outstanding capabilities of TouchDesigner, again pushed the boundaries of audio-visual composition.

Their set was a variant of the highly recommended 5.1 surround sound Umfeld DVD, featuring a memorable mix of Jochem’s electrifying live audio set and Scott’s inspiring creation of high contrast environments - always pulsing between dysfunction, futurism and sheer beauty. His meticulously constructed abstract artworks blended in seamlessly with the unique architecture of the Berghain, which is located inside an old power plant in former East Berlin. It’s the perfect place for a project which was born in the industrial districts of Rotterdam.

Using TouchDesigner allowed Scott to switch from his previously large setup, consisting of a mix of computers and hardware video mixers, to a single laptop/controller combo.


On top of that, TouchDesigner’s raw power not only allowed him to rebuild his video mixer entirely in HD, but also left enough headroom for an additional 3D render engine running in parallel with the mixer.

Rendering at 720p, the 3D engine produces “a series of nano robotic sculptural objects that animate in some really cool ways”, and the visual quality was so high that they looked like they had been pre-produced and mastered just like the other video clips in his set.

The Berghain does not permit cameras inside the club, so there’s no video for you to enjoy his set in action. However Scott has kindly provided us with a little TouchDesigner synth which we will post shortly. In the meantime, here is an early screenshot of the performing interface:

[Screenshot: the performing interface]
[Future link to Pagano exe file]

A week earlier, on July 22, Scott had the opportunity to play with jazz pianist Eric Lewis at the TED Global late-night event in Oxford. The video link is below; meanwhile, Scott gives us some impressions of this rare pairing of live visuals and eccentric jazz in the following interview.

Which tools did you previously use and how was the transition to TouchDesigner?

Prior to TouchDesigner my live performance system was based around a (relatively primitive) Max/Jitter video playback sampler.  I ran two computers that for the most part did little more than play back pre-rendered video loops, which were mixed via an Edirol V-4 video mixer.  This style of system was chosen due to the limitations of the tools at my disposal, and I made the decision to focus on presenting the highest quality content I could.  The special thing about this system is that the entire thing - bank/clip selection on both computers and the hardware video mixer - could be controlled via one Wiimote.  This made for a fun and physical way to deploy my imagery, and I will be implementing this method of control in my live performance Touch system as well.

Transitioning to Touch was a massive leap for me.  I had been subscribing to the mantra of if-it-works-don't-change-it for a while and was apprehensive about deploying a completely new system in an important performance context.  So far my results have completely convinced me of the stability of TouchDesigner. 

If my system can handle the Berghain where temperatures are running hot and the insane Funktion One sound system is causing every component to vibrate like mad - I'm sold.

I had tinkered with Touch at a few points over the past few years, but had never made the commitment to really dive in.  When the 077 FTE version came out I decided to finally give Touch a real shot.  None of this would be possible for me without the support of Derivative and the forum community who have been tremendously generous with their time to mentor me and assist with tool building and optimization.  The small community is a passionate one and has been essential to helping me cross a lot of hurdles in my TouchDesigner development process.


Can you tell us a little bit about your current setup, especially how you integrated TouchDesigner into your workflow?

My live performance system has been converted to an all-TouchDesigner setup now.  I run my video mixer / real-time 3D system on a PC laptop (Sager NP8662) and use an NI Maschine and a Korg nanoKontrol for parameter manipulation.  I use an M-Audio Midisport 2x2 to receive external MIDI input from a collaborator's system.  My pre-rendered content library (720p H.264 files) all runs off of an external SSD connected via eSATA.

So after having used TouchDesigner now for a couple of months, is there any feature or workflow you enjoy most?

It is difficult to narrow in on a specific feature - but the overall workflow is something I am enjoying more and more the longer I use it.  I have a background working in both Houdini and Max/Jitter so I came into this with an understanding of the paradigm of procedural node based systems. 

Having such a seamless integration of 2D and 3D in a real-time environment is something I have wanted for a long time.  It is a good feeling to now be limited by my experience and knowledge rather than by the tools.

Are there any areas where the use of TouchDesigner opened up new possibilities?

Prior to using Touch, the real-time 3D toolkit at my disposal was incredibly limited: either I needed a much deeper low-level understanding of OpenGL, or I simply could not get results anywhere near what I deemed acceptable.  Being able to re-create a more powerful, higher-resolution version of my existing video mixing system while adding in an audio/MIDI-reactive 3D system was a significant step for me.  My goal is to transition the live work to be more and more real-time, and in the meantime it is fantastic to be able to blend seamlessly between my 2D and 3D systems as I learn and develop skills in this area.


Do you make use of audio analysis to procedurally animate your real-time work or is there "only" some basic tempo sync information coming from the musician's laptop?

My current system is designed to receive MIDI or OSC depending on who I am working with.  When I work with Speedy J we have a special timing ramp of MIDI notes in his Ableton Live setup that allow my system to sync with his.  For other shows I use Max/MSP patches that process MIDI and audio input and send data to Touch via OSC.
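For readers curious what that OSC leg looks like on the wire, here is a minimal sketch of encoding and sending a single-float OSC message per the OSC 1.0 binary format. The address "/sync", the port number, and the idea of forwarding one timing value are illustrative assumptions, not Scott's actual patch.

```python
import socket
import struct

def _osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def encode_osc_float(address: str, value: float) -> bytes:
    """Build a single-float OSC 1.0 message: padded address string,
    padded type-tag string ",f", then a big-endian float32."""
    return (_osc_pad(address.encode("ascii"))
            + _osc_pad(b",f")
            + struct.pack(">f", value))

# Hypothetical usage: fire a timing value at a listener over UDP.
# Port 10000 is an arbitrary example, not Scott's configuration.
def send_sync(value: float, host: str = "127.0.0.1", port: int = 10000) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(encode_osc_float("/sync", value), (host, port))
    finally:
        sock.close()
```

Libraries such as python-osc wrap this up, but the raw format is simple enough that a hand-rolled encoder fits in a dozen lines, which is part of why OSC became the lingua franca between Max/MSP, Ableton, and Touch setups like the ones described here.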

Many of my linear video pieces contain sections where animation data was generated in Houdini via audio analysis and I plan on both porting and building on the same techniques in Touch.  I have found that to really create strong and effective sound/image relationships a bit of data manipulation has to be done in CHOPs.  I need to go through the R&D of fine tuning these processes in Touch to generate relationships that are up to par with my overall aesthetic.
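As a rough illustration of the audio-to-animation pipeline described above (a generic sketch, not Scott's Houdini setup): compute an RMS amplitude envelope from raw samples, then smooth it with a one-pole filter before letting it drive a visual parameter. The smoothing stage stands in for the CHOP-style data massaging he mentions; the window size and filter factor are arbitrary example values.

```python
import math

def rms_envelope(samples, window=512):
    """Collapse raw audio samples into one RMS amplitude per window -
    a crude amplitude envelope suitable for driving animation."""
    env = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(x * x for x in chunk) / window))
    return env

def lag(values, factor=0.8):
    """One-pole smoothing filter (comparable in spirit to a Lag CHOP):
    higher factor means slower response, removing jitter before the
    envelope is mapped onto a visual parameter."""
    out, state = [], 0.0
    for v in values:
        state = factor * state + (1.0 - factor) * v
        out.append(state)
    return out
```

The interesting aesthetic work happens in how the smoothed envelope is remapped (thresholds, curves, delays) before it touches the imagery, which is exactly the fine-tuning R&D Scott refers to.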

Can you give us a little overview of the European Umfeld premiere at the Berghain club? Do the visuals you performed there differ from those on the Umfeld DVD?

Jochem Paap (Speedy J) and I have worked on a range of projects together.  The name "Umfeld" in its truest form refers to the main piece on our 2007 DVD release "Umfeld".  We have also taken it on as a moniker for our live performance work. Our live shows can take many forms - from a live remixed version of that hour long abstract piece to more free-form audiovisual improvisations that are usually more dance floor oriented. The DJ/VJ shows are comprised of a wider range of imagery, all still based in the same world of graphic exploration of both photographic elements derived from architecture and nature and highly synthetic organic-robotic abstract forms.

Currently real-time visuals are mostly featured in an electronic music context, but last week you had the chance to perform with an experimental jazz pianist at TED. Can you tell us a little bit about the performance?

[Video: Scott Pagano at TED Global]

The majority of my collaborators are indeed electronic musicians.  This spans the gamut from spatial soundscapes to hard-pounding techno - but all the while still inhabiting the realm of electronic music.  My collaboration at the TED Global event in Oxford with pianist Eric Lewis was a special opportunity to mix it up a bit.  I worked with another collaborator of mine, Dr. Barry Moon, who built a Max/MSP patch that processed the MIDI note data from Eric's playing and sent a data set over to Touch via OSC. It was quite an aesthetic contrast overlapping Eric’s passionate improvisational piano playing with my highly futuristic, Giger-esque mechanical structures, but it worked out well.

Doing real-time gfx is always about compromise. You're currently using a lot of high-quality pre-rendered movies, but would you sacrifice some of the visual quality for more control over the actual animation, i.e. render/animate everything as 3D objects in real-time...

The amazing thing is that when I was going through my archive of material and re-rendering video loops for the HD video mixer, I kept coming across clips where I thought "you know what - I should just re-create this as a real-time scene in Touch".  The small amount of visual compromise I may encounter due to anti-aliasing issues or the limits of my shading knowledge is far outweighed by what I will gain.  The ability to add such synchronized live articulations is something I am excited to explore.  With real-time ambient occlusion, reflection maps, and the ability to apply my traditional 2D image sweetening methods, I have run into several situations where I feel like I am not sacrificing visual quality at all in comparison to an offline software render of the same scene. Being able to apply my existing 3D and 2D experience directly in a real-time system has been quite exciting. This is just the beginning and I am looking forward to creating a whole new realm of work combining my aesthetic sensibilities with the power of Touch.

Thanks for the interview...

You are very welcome :)

After the interview, we (Scott Pagano, Jörg Unterberg and I) hung out at Scott’s place, discussing TouchDesigner, presenting our projects, exchanging ideas and recovering from the insanity at the Berghain. First up was Scott, explaining the inner workings of his video mixer and the 3D engine.




previzart - Jörg Unterberg

PreVisArt

Then Jörg showcased his combo of the ARToolKit and TouchDesigner, which enables anyone, even complete computer novices, to intuitively create animated 3D pre-visualizations. Instead of interfacing with the computer via mouse and keyboard, his augmented reality approach lets users interactively place and animate virtual objects using a set of markers and recording their movement, as can best be seen in this video.


previzart explanatory / demo video from previzart on Vimeo.

Needless to say, this is loads of fun, and if all goes well, he’ll have a new version of the application available quite soon. (Jörg uses a TouchDesigner Pro feature in his application, so only those of you with an FTE Commercial or a Pro license will be able to try it out.)




UI Builder for 3D Presentations - Achim Kern

Then I gave Scott and Jörg a quick introduction to the application and the underlying framework and UI libraries I’m currently developing. It’s a system designed for creating movie-like 3D-presentations, but as it’s not quite finished, I’ll show a screenshot of the UI here:


[Screenshot: the presentation system UI]

Finally I showcased the lyrics-file-reader and text-destructor featured in this video:

Small Talk at 55118 II from Achim Kern on Vimeo.

and then gave a quick overview of a file inspired by Markus Heckmann’s amazing use of TouchDesigner’s UI as an integral, if not the main, part of the whole visual performance for Raster-Noton’s Alva Noto. For this synth I only used TouchDesigner’s user interface elements, so everything you’ll see in this video is derived from two CHOP nodes being scaled up and down by a script. The “network view” (Network Component) is converted to TOPs with the help of an OP Viewer TOP and then processed in a simple TOP network.  In the following video you can see the “source images” on the left side.

UI Synth from Achim Kern on Vimeo.


After a long stretch of utilitarian “TouchDesigner programming” for my presentation system, these purely-for-pleasure side projects were a welcome change. Despite being unpolished and just the result of playing around, I was once again amazed at how Touch allows me to create content without the use of any external images or videos - even in 2D, without any procedural geometry. Here’s the file for those who want to take it apart.

[Download UIsynth.toe] (The download is one .zip containing UIsynth.toe which you can run in any TouchDesigner on a suitable PC: TouchDesigner Download page. Once UIsynth is running, press Esc to see inside.)

This concludes the first Meeting of the Minds. Big props go to Scott and Jörg - it was great fun hanging with you - and to Derivative for supporting this and for being just plain all-around awesome.

Achim Kern

Derivative http://www.derivative.ca
