Long story short, a few months and a sh*t ton of work later ("I put in the hours," he flatly states) emerged a 1.5-hour spectacle centred on a 21-foot-tall by 22-foot-wide mechanical feat of engineering, clad in pulsating, twisted visuals that originate singularly from him and turn EDM on its head (so to speak ;). Whichever way you want to look at it, Cube V3 is something that hasn't been done before, and it is remarkably good.
And as Zimmerman put it to the Miami New Times before the show's premiere at the Ultra Music Festival on March 30th: "I need everyone to know how much of a fucking insanely stupid technological feat this is… That's important, because we have all this tech and no one's fucking using it."
Before getting into that, one of the many takeaways from having seen this beast come together is that deadmau5 is a truly great role model, and from a kind of 'disenfranchised' standpoint that makes it even cooler and a lot more real. Look no further. Massive thanks to the head mau5 for so publicly supporting TouchDesigner and for talking to us here.
Derivative: In the recent Forbes "Who I Am" list of nine defining moments, #9 was using TouchDesigner, and you talk about how, as a huge video gamer, you were thinking, "If you can get graphics looking this good using real time in a video game, why can't you do that on an LED wall?" So now that you've done that on your Cube, completely reimagined and then redeveloped your show with real-time visual content in a real-time environment, with visuals that are "modular", has it been liberating? Is it more 'fun' :)? And has it sparked new ideas or avenues for future development? It seems very compatible with the way you play and the way you create the music and audio elements for your shows: construct blocks, strip away elements, and then reconstruct everything live during the show…
deadmau5: Making the show in TD in a modular space has been super liberating for me. As a touring performer, especially in the electronic dance music space, it's very easy and quick for a show production to get stagnant after a while, and more often than not, changes mean more work than you'd think... changing the music around is easy, but producing cohesive content and light programming to match the changed music... all that is hours of work.
Most 'EDM' shows, to me, just lack cohesiveness. I mean, generally speaking, the modern-day headliner DJ dude gets up on a stage, pops in a CD, and fist pumps while a "VJ" back at FOH furiously mashes away at video clips to make some kind of connection between the music and the visuals. Sure, it's a thing, but there's usually a pretty big disconnect between the DJ and the FOH VJ guy. Not to say that's terrible; it's just like watching and hearing two different shows half the time.
Of course everyone has their own system... and I've seen a few SMPTE tricks here and there... I just wanted to develop my own system. Something modular that we can change up on the fly.
Derivative: So how did TouchDesigner creep into your show at first?
deadmau5: As I've said in other interviews and conversations with a number of game studios... I was really looking for the best RT (real time) environment for doing what I wanted to do with connecting RT visuals to music. Of course there were mentions of using Unity or Unreal... but having to build an engine from scratch and be a "game programmer" as opposed to a visual artist seemed too daunting a task at the time... but then I recalled a few conversations I'd had with a TD user from way back and decided to pop on over to the website to see what's new / download the non-commercials.
Well, at the time of writing, I literally just did my first performance here in Aspen with a lightweight version of our new show system... everything went pretty smoothly! I used TDAbleton and a Microsoft Surface as a touch controller on my end, and a laptop with TD at FOH connecting to the same TDAbleton tox, all talking to each other nicely. A small little test step compared to what we have in store for the "big cube thing" show... but very stable and promising.
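For readers curious what's on the wire between those machines: the setup described above implies OSC messages flying between the booth and FOH. As a rough illustration only (this is not deadmau5's actual rig, and in practice TD's OSC In/Out operators or a library like python-osc do this packing for you), a float-carrying OSC message can be assembled by hand from the OSC 1.0 framing rules:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad a byte string to a 4-byte boundary, per OSC."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message carrying 32-bit float arguments."""
    msg = osc_pad(address.encode("ascii"))                  # padded address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # padded type tag string
    for f in floats:
        msg += struct.pack(">f", f)                         # floats are big-endian
    return msg

# e.g. a hypothetical cube-rotation message:
packet = osc_message("/cube/rot", 45.0)
```

The address `/cube/rot` is invented for the example; the point is just the padding and endianness rules that make two OSC endpoints understand each other.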
Derivative: Then you came to us with a practical problem where you wanted to get the best quality image on each face of the Cube. Can you explain what happened there and where we ended up? … It's now a general solution in TD's renderer that everyone can benefit from.
deadmau5: Well, at first, I wanted to get comfortable with TD before reaching out... so I took on as much as I could, learned the basics of TD, joined a few user communities and poked around... then I began to sort of "prototype" what I was after. The last thing I wanted to do was start making phone calls to the office and setting up meetings without knowing anything about the app. That'd be annoying.
So the challenge was... here's a three-sided cube; the way we do it now is to "reverse projection map" content onto this thing by rendering video through three quad reprojections from a render point at the audience's perspective, so that the content reads as flat across all the surfaces.
So I brought in a few solutions, mostly involving corner pinning, which was "okay" using anisotropic filtering... but I wasn't really getting what I knew GL is capable of: reprojecting cameras. I'm somewhat of an Octane nut, so I've had an OSL camera shader that did exactly that, but with raytracing. So I showed it to Malcolm [Bechard at Derivative] over there and said, well... can't we bend the cameras to take in a reprojection map the way OSL reprojects raycasts? And probably by the end of the week, we had a quad reproject option in the editor.
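The corner pinning he started from boils down to a classic piece of math: a projective map from the unit square onto an arbitrary quad. As a sketch of that underlying math only (this is the textbook square-to-quad construction, not Derivative's renderer code, and the cube-face corners are made up):

```python
def square_to_quad(p0, p1, p2, p3):
    """Return a function mapping unit-square (u, v) onto the quad p0..p3
    with a projective (perspective-correct) corner-pin warp.
    Corners correspond to (0,0), (1,0), (1,1), (0,1)."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = p0, p1, p2, p3
    sx = x0 - x1 + x2 - x3
    sy = y0 - y1 + y2 - y3
    if sx == 0 and sy == 0:            # quad is a parallelogram: plain affine map
        a, b, c = x1 - x0, x3 - x0, x0
        d, e, f = y1 - y0, y3 - y0, y0
        g = h = 0.0
    else:                              # true perspective case
        dx1, dx2 = x1 - x2, x3 - x2
        dy1, dy2 = y1 - y2, y3 - y2
        den = dx1 * dy2 - dx2 * dy1
        g = (sx * dy2 - dx2 * sy) / den
        h = (dx1 * sy - sx * dy1) / den
        a, b, c = x1 - x0 + g * x1, x3 - x0 + h * x3, x0
        d, e, f = y1 - y0 + g * y1, y3 - y0 + h * y3, y0

    def warp(u, v):
        w = g * u + h * v + 1.0        # projective divisor: this is what a plain
        return ((a * u + b * v + c) / w,  # bilinear corner pin lacks
                (d * u + e * v + f) / w)
    return warp
```

The divide by `w` is the whole point: it is what makes texture on a tilted cube face look correct from the audience's render point instead of smearing the way a bilinear pin does.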
Derivative: How has TouchDesigner spread to other parts of the show?
deadmau5: TD is infectious, really. As we run into problems, or have protocol debates, it almost always ends up being a TD op solution... just surfing through the operators, even though there aren't hundreds, I feel like I'm always discovering new ways to do things. From other applications' web APIs to... "Oooooohhhh, Art-Net in? Hey, that MA sends Art-Net, right?" So here we go with Art-Net control from the MA. We even set up a couple of Arduinos that feed copies of the Cube's position data (rotation and tilt) into the quad reprojection, so that as the Cube moves, the content moves along with it.
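To make that position feed concrete, here is a minimal sketch of the receiving side: parse a rotation/tilt reading and turn it into a rotation matrix the content transform could use. The `"rotation,tilt"` degrees message format and the axis conventions are assumptions for illustration, not the actual show protocol:

```python
import math

def parse_cube_pose(line: str):
    """Parse a hypothetical 'rotation,tilt' degrees line from an Arduino feed."""
    rot_deg, tilt_deg = (float(x) for x in line.strip().split(","))
    return math.radians(rot_deg), math.radians(tilt_deg)

def pose_matrix(rot, tilt):
    """3x3 rotation matrix: yaw about Y (cube rotation), then pitch about X (tilt)."""
    cy, sy = math.cos(rot), math.sin(rot)
    cx, sx = math.cos(tilt), math.sin(tilt)
    yaw   = [[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]]
    pitch = [[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]]
    # matrix product pitch @ yaw, applied right-to-left to content coordinates
    return [[sum(pitch[i][k] * yaw[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]
```

In a TD network the same numbers would more likely arrive as CHOP channels driving a Geometry COMP's transform parameters; the matrix form just shows the math being done per frame.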
Derivative: Can you take us through all the parts where TouchDesigner is being used?
Booth: Ableton running TDAbleton Python script
Booth: Backup of Ableton running TD Python and mirroring Surface Studio controls
Booth: Microsoft Surface Studio running TD (as a user interface for performance, coupled with some custom software I've built via OSC and Midi)
FOH: TD running the animations of pixel bars via DMX / Artnet
FOH: TD running Cube and LED wall visuals + receiving data from Cube Position
FOH: TD backup of Cube and LED wall elements + receiving data from Cube Position
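As a concrete illustration of the FOH DMX leg above: Art-Net carries DMX channel data in "ArtDmx" UDP packets. TD's DMX Out CHOP builds these internally, so this hand-rolled framing, following the published Art-Net spec, is purely illustrative:

```python
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Frame one universe of DMX levels as an ArtDmx (OpOutput) packet."""
    if len(channels) % 2:
        channels += b"\x00"                    # DMX payload length must be even
    return (b"Art-Net\x00"                     # 8-byte packet ID
            + struct.pack("<H", 0x5000)        # OpCode OpOutput, little-endian
            + struct.pack(">H", 14)            # protocol version 14, big-endian
            + bytes([sequence & 0xFF, 0])      # sequence, physical input port
            + struct.pack("<H", universe)      # 15-bit port-address, little-endian
            + struct.pack(">H", len(channels)) # data length, big-endian
            + channels)

# e.g. first two pixel-bar channels at full and half intensity on universe 0:
pkt = artdmx_packet(0, bytes([255, 128]))
```

Sent via plain UDP to port 6454, a packet like this is all it takes for a console or a TD network to drive the pixel bars.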
Derivative: How do you feel about using, developing, refining shaders in TD?
deadmau5: Working with shaders in TD isn't, I feel, any better or worse than working with shaders in Sublime Text... or Notepad. I've always wanted to see someone develop a really intuitive and functional GLSL shader IDE... but no one really touches on that because... well, people who really know GLSL don't need an IDE... and people who don't know GLSL won't know wtf the IDE is doing... so it's a kind of weird spot. But I'm still hopeful someone will find a way to make a node-based IDE for GLSL shaders that compiles it all into a handy-dandy standalone GLSL script for the sake of better instancing and cook times.
Derivative: Does TD feel familiar to you, like does it seem parallel to modular audio synths?
deadmau5: It behaves almost exactly like a modular synthesizer, which is why I feel so at home with it... at least with a few exceptions and a few enhancements... some being "selects" and containers etc. plus the instant feedback is a huge workflow advantage.
Derivative: What parts of the software do you think need the most work? (assuming we haven't heard anything from you yet) Limit it to under 100 things! hah. Just a few for now.
deadmau5: MovieFileIn: sometimes I just can't escape pulling in HAPq renders from Octane or recorded video footage to mix in with RT. While doing a 100% RT show is a really cool prospect, I would still need movie playback from time to time... and having said that, we work with a lot of different footage that I convert to HAPq... some 24fps, some 30, some 60fps... I wish there were a mode in the MovieFileIn that let you stretch the playback rate of the video to fit a certain number of beats or bars, without having to make some awkward kind of playback widget with a retrigger.
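The beat-fitting mode he is asking for comes down to simple arithmetic. A hypothetical helper (no such parameter exists on the MovieFileIn; this is just the math such a mode would have to do before driving the operator's speed) might look like:

```python
def fit_speed(clip_seconds: float, bars: int, bpm: float,
              beats_per_bar: int = 4) -> float:
    """Playback-speed multiplier that stretches a clip to last exactly
    `bars` bars at the given tempo. speed > 1.0 means play faster."""
    target_seconds = bars * beats_per_bar * 60.0 / bpm
    return clip_seconds / target_seconds

# e.g. a 12-second clip squeezed into 4 bars of 4/4 at 120 BPM (8 seconds)
# needs to play back at 1.5x.
speed = fit_speed(12.0, bars=4, bpm=120.0)
```

The awkward part he mentions is everything around this one line: retriggering the clip on the downbeat and picking up tempo changes live, which is why it would be nice as a built-in mode.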
I feel like the Animation COMP needs a solid redo. While TD is supposed to be this crazy free-flowing modular beast, there are just some moments in a show that really need to be sequenced, even if only for a short amount of time... and while you can certainly achieve this with the Animation COMP and a few external triggers, working inside the Animation COMP is a little cumbersome.
Instanced Geometry output: Would love to have an output of an instanced geometry node to alembic. We won't get into why, but if you're thinking what I'm thinking, you'll know ;)
Derivative: Talk about Nvidia Flex/Flow.
deadmau5: I've always had a great relationship with Nvidia; they've been super helpful in my creative endeavours with their products in exchange for exactly this: opening the hood on what I do and the technology that powers it. I've talked to a few pals in the Nvidia GameWorks division about Flex and Flow, and having seen Vincent Houze's Flex CHOP port, some pals and I began to dive in and make our own "superFlex CHOP", which is pretty scary, but somewhat functional and very cool looking... even at 300k particles at 60fps without breaking a sweat :D (now we know why you didn't publish every parameter in your CHOP, Vincent).
Anyhow, I went back to Nvidia with a few prototypes of Flex in action, and they were pretty excited, and eventually introduced me to Flow, another sort of high-quality RT smoke/fire simulation package... I've since introduced Malcolm to a few of my nerd friends at Nvidia, spoke with the powers that be, and now I'm happy to announce, if I can, that TD now officially has license to include these packages.
Derivative: Is there anything you've been able to do that was surprising to you? Or, positive outcomes "aha moments" that led to something new or innovative in your workflow or in the show?
deadmau5: Let's be real here: the fact that I've even gotten this far, from inside my head to where we're at now, is surprising to me. One of the exciting things I've found is the frequency of these aha moments... I'm finding so many new ways to do things so much more efficiently than I would have when I started out.
Derivative: What are you most pleased with in the new show?
deadmau5: The most gratifying part of the show to me, is watching the Cube move, and the RT content move with it... it's a really awesome trick! Unfortunately for me.... I'll never see it again, as I'm inside it.
Derivative: How do you feel the new show tech will change the audience's experience?
deadmau5: I think they'll get it. I think the better part of my demographic understands what I'm doing, maybe even just a little bit... they pop in on my streams and basically join me in my successes and struggles, where I occasionally explain what I'm doing and why. As for the person who has no idea about any of that, well... I think the difference will be there too... generally, video production at shows isn't 60fps... sometimes, maybe... but RT on that scale is very rare... so, to make it even clearer that the system is RT, we're going to have to come up with some clever tactics, some really neat visual things that can easily convey that it was generated in realtime... like some kind of audience interaction... I'll figure that out later.