
ARKAM'S LIVE VISUAL FEAT for GOOGLE I/O's AFTERPARTY

We recently wrote about TouchDesigner making a rather spectacular appearance at Google I/O 2012 in Bot & Dolly’s remarkable Kinetisphere, an interactive installation (with robots) designed to celebrate the launch of the Nexus Q.

We were doubly excited a few days later to get word - and video and images! - from Mark Wells and Adam Jenkins of Arkam who, as it turned out, used TouchDesigner to design, build and perform a spectacular, elaborate and hugely FUN 90-minute real-time interactive visual set for the Google I/O afterparty.

When Google came calling, Mark and Adam were excited by both the prospect and the context (Google I/O being all about awe-inspiring technology and hands-on experiences) and so challenged themselves to create something “beyond the realm of traditional animation.” What started off as a job producing a few minutes of traditional motion graphics work evolved into the delivery of real-time motion graphics that broke the 2-3 minute mold and could stay dynamic for hours if need be.

Naturally we were very impressed and had a few more questions - about the project, Mark and Adam's backgrounds, their practice, motivations, working with TouchDesigner and so on - which Mark and Adam kindly answered. The following is an account of our conversation and of the fantastic visual experience Arkam created for the Google I/O afterparty.

Mark Wells: Adam and I have been friends for over 26 years, and this is our first professional collaboration. Adam's background is in motion graphics for TV and electronic billboards. This was his first experience with TouchDesigner. He uses a lot of different programs but mainly After Effects, so he felt right at home with TOPs. My background is modeling, rigging, and animation in Houdini and other programs. I have used TouchDesigner on past projects such as Turtle Talk With Crush, Stitch Encounter, and Monsters, Inc. Laugh Floor. During those projects I sometimes worked elbow to elbow with Greg, Jarrett, Rob, and Ben. I learned a lot from those guys, and definitely have the utmost respect for the whole team at Derivative and the products you create.

Derivative: What has your experience been in adopting and working with TouchDesigner - Mark with your Houdini animation background and Adam coming from motion graphics?

Mark: Adapting from Houdini to TouchDesigner wasn’t too hard for me, but there isn’t a direct one-to-one relationship. Some problems are easier to solve in TouchDesigner than Houdini and the opposite is also true. Recently I find myself thinking in terms of TouchDesigner. There are times when I’ve looked for that CHOP I wanted, only to realize that it doesn’t exist in Houdini. I also design networks in Houdini based on how I’d build them in TouchDesigner just in case I want to port them over. Unfortunately, the most efficient way to create a network can be different between Houdini and TouchDesigner. To its credit, TouchDesigner is a unique program. It is very powerful for the artist, extremely rewarding, and worth the learning curve.

Adam: This was my first time working with TouchDesigner, and it was quite a learning experience. With Mark’s expertise and the great video tutorials on the Derivative Wiki page I was able to get up to speed rather quickly. Because of my motion graphics background I easily adapted to the TOPs and was able to create composite networks and trigger them in real-time. I developed many of the design elements using After Effects and Cinema 4D as my sketchbook. Mark and I then examined the design elements and figured out which ones made sense to rebuild in TouchDesigner, thereby giving us real-time control over all of their parameters.

Arkam: Google originally hired us to do a few minutes of traditional motion graphics work. After brainstorming about the job, Adam and I figured we could deliver real-time motion graphics that could break the 2-3 minute mold and stay dynamic for hours if needed.

It just felt natural to use TouchDesigner. We built a bank of elements, both 2D and 3D, and used a number of UI sliders and buttons to drive those elements. The project’s variables were pretty massive. We didn’t have a playlist, any idea what the DJ would be spinning, or know how long our content would be on screen. We were told our screen time could be anywhere from 1-4 hours. Building the project in TouchDesigner was the perfect solution because it gave us the ability to manipulate our graphics on the fly, sync to the music, and react to the atmosphere of the party.

From start to finish of the project we found ourselves asking, “Can we do that?” Here are some of the questions that we asked ourselves while working on the Google I/O project.

  • Can we complete this project in real-time up to our design standards?
  • Can we push five 2K backgrounds used in multiple elements in real-time?
  • Can we add two real-time 3D hero characters?
  • Can we create a 3D switching technique for the background?
  • Can we drive the visuals with the audio?
  • Can we manipulate the hero character and backgrounds remotely on Android tablets and let partygoers control the visuals?

In TouchDesigner, the answer proved to be, “Yes, we can do all that”. Again, the Derivative wiki, forum, and tutorials proved to be extremely helpful.

To get into some of the specifics of the project, we created a few different traditional motion graphic 2D elements (background and foreground movies), 3D assets (speaker station, the animated Android character inside the speaker station, Google Chrome ball, and a background we called CubeEQ), and logic systems (audio driven and tablet driven). It's a personal goal of mine to find ways to push characters in TouchDesigner. In this project the Google speaker station and the Google Android character are completely animated inside TouchDesigner with real-time translation data, offset sine waves, and lag. By building the Android's motion completely in TouchDesigner we were easily able to drive his enthusiasm during the party to help give him character. Given the time constraints of the project and the simplicity of the Android character, I think our approach worked out perfectly.
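To illustrate the kind of procedural motion Mark describes - offset sine waves smoothed by lag - here is a minimal Python sketch. It is not Arkam's actual network (in TouchDesigner this would be built from CHOPs such as LFO, Math, and Lag), and the "enthusiasm" slider, joint count, and filter constants are hypothetical stand-ins:

```python
import math

def sine_offsets(t, n_joints, freq=1.0, amplitude=1.0, phase_step=0.5):
    """Per-joint displacement: one sine wave, phase-offset per joint,
    so motion ripples through the character instead of moving in lockstep."""
    return [amplitude * math.sin(2 * math.pi * freq * t + i * phase_step)
            for i in range(n_joints)]

class Lag:
    """One-pole low-pass ('lag') filter: eases a value toward its target,
    turning an abrupt slider change into gradual, organic motion."""
    def __init__(self, value=0.0, smoothing=0.2):
        self.value = value
        self.smoothing = smoothing  # fraction of the remaining gap closed per step

    def step(self, target):
        self.value += (target - self.value) * self.smoothing
        return self.value

# A hypothetical "enthusiasm" slider jumps from 0 to 1;
# the lag eases the wave amplitude up over 30 frames.
enthusiasm = Lag(value=0.0, smoothing=0.2)
for frame in range(30):
    amp = enthusiasm.step(1.0)                       # smoothed amplitude
    pose = sine_offsets(frame / 30.0, n_joints=4, amplitude=amp)
```

The lag filter is what lets a performer yank a control during the show without the character snapping; the per-joint phase offset is what makes a single slider read as whole-body enthusiasm.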

It was unintentional, but while building out the Cube Equalizer (CubeEQ) element we quickly realized it was becoming a character of its own. It let the user make the background dance and sway to the music, and it gave us a way to transition between backgrounds, which made the CubeEQ one of the most fun elements to drive during the show.
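An audio-reactive background like the CubeEQ boils down to splitting the audio spectrum into bands and mapping each band's level to a cube's height. The sketch below is a hypothetical illustration, not Arkam's network - in TouchDesigner the spectrum would come from an Audio Spectrum CHOP, and the brute-force DFT, band count, and gain values here are assumptions for the example:

```python
import math

def band_levels(samples, n_bands=8):
    """Crude spectrum: DFT magnitudes of one audio frame, averaged into
    n_bands equal-width bands (a stand-in for an Audio Spectrum CHOP)."""
    n = len(samples)
    mags = []
    for k in range(1, n // 2):  # skip DC, keep positive frequencies
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im) / n)
    per_band = len(mags) // n_bands
    return [sum(mags[b * per_band:(b + 1) * per_band]) / per_band
            for b in range(n_bands)]

def cube_heights(levels, base=0.2, gain=4.0):
    """Map band levels to cube scale values, with a floor so quiet
    passages never make the cubes vanish entirely."""
    return [base + gain * lv for lv in levels]

# A pure test tone landing in frequency bin 5 of a 64-sample frame:
frame = [math.sin(2 * math.pi * 5 * i / 64) for i in range(64)]
heights = cube_heights(band_levels(frame))
```

Feeding successive audio frames through this per frame is what makes the wall of cubes appear to dance; crossfading between two background textures as the cubes rise and fall is one plausible way to get the transition effect they describe.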

Mark: The work we delivered for the After Hours Party was well received and we’re currently in talks with Google for upcoming projects. Since the event the response from the TouchDesigner community has been overwhelming. The versatility of the software is extremely inspiring, and we have a growing list of ideas for how we’d like to incorporate TouchDesigner into current and future projects. Thank you again for taking the time to interview us. As you mentioned, the TouchDesigner community is in an exciting place right now, and we are happy to be a part of the amazing body of work being created.

We are very happy too and look forward to seeing more from Arkam!
