Day For Night in Houston last December was, on multiple fronts, one of the most ambitious and flawless festivals we've attended. Take Aphex Twin's transformational live set for one, but the opportunity to experience such a formidable array of installation, light and kinetic art in that particular venue (a staggering 1.2-million-square-foot decommissioned postal sorting station) tops the list; and that almost half of these installations made such excellent use of TouchDesigner was icing on the proverbial cake. In many ways it was like being at a gigantic TouchDesigner meetup (we can dream): the running count of community members on site reached 30.
We've had a chance to let it all soak in and to review the experience with the brilliant artist teams who showed at DFN, and with the formidable Alex Czetwertynski, who was responsible for curating this massive-scale, mind-boggling visual program. There's a TON of very interesting reading below, so without further ado, here is our TouchDown Houston at Day For Night Festival report!
DAY FOR NIGHT CURATOR | Alexandre Czetwertynski
Alex Czetwertynski is a digital artist working in the fields of creative technology and media arts. Born in Belgium and based in Brooklyn, his practice consists of large-scale immersive experiences and media sculptures for which he designs motion and physical presence. He has collaborated with artists such as Doug Aitken, Orlan, threeASFOUR and Jessica Mitrani, and served as a digital creative lead on tours for musicians such as Stevie Wonder, The Strokes and Beyoncé. He currently curates the visual arts at the Day For Night festival in Houston, and is an artist in residence at Mana Contemporary in New Jersey.
Derivative: Congratulations on a brilliant festival Alex, it hasn't stopped resonating! Can you talk a bit about the beginnings - what sparked the concept of Day For Night?
Alexandre Czetwertynski: When we started Day For Night in 2015, we knew there was a void to fill in the music festival industry. We were looking to create a new kind of experience that combined a highly attractive musical lineup with a strong roster of digital artists. An important premise was the idea that they be billed equally. In our first year that meant having Kendrick Lamar and New Order on the same level as Casey Reas and Zach Lieberman, and in our second year, Aphex Twin and Bjork on the same level as UVA or Tundra.
The great advantage of doing this is that we have been able to expose people to visual artists they would probably never have heard of otherwise, because they bought tickets to see the aforementioned musicians. And that is something I find to be a truly extraordinary opportunity, seen from both sides of the equation. We are creating a true mix of crowds, and encouraging people from different worlds to discover things that might live far from their typical field of vision.
Derivative: With so much to choose from, what defined your curatorial principles?
Alexandre Czetwertynski: When it came to deciding what kind of art works we wanted to show, a key element was to avoid pure eye candy. There is a lot of that in the "new media"/digital art world, installations that exist just because they look "cool". But our interest was to create a deeper engagement than temporary aesthetic satisfaction. All the art we show has the ability to somehow encourage a thought process, maybe not a rational/explicit one, but to somehow engage with our time, touch something in our mental landscapes. Most importantly, we shy away from work that is done purely for the sake of accomplishing some kind of tech feat, and never promote artworks on the basis of the technology itself.
There are a few guiding principles to the curation: we look for site-specific, large-scale, ideally never-before-seen pieces that allow for interaction in some shape or form and have a sonic component. We are also interested in combining artists who have been in the field for a while (UVA, Golan Levin, ...) with people who are establishing themselves (Ezra Miller, Gabriel Pulecio, Mark Eats, ...).
One of the great pleasures of the curatorial process has been to collaborate with people whose work I admire, and put together a group that I know I would never see elsewhere. There is a certain magic to combining artists of this level of talent, seeing these artworks come to life individually, and then to observe the whole as this unique moment of time.
Derivative: We couldn't help but notice that there was a significant showing of TouchDesigner-driven installations in the program!
Alexandre Czetwertynski: It probably isn't a coincidence that there are so many TouchDesigner artists at DFN. In 2015 we had Markus Heckmann, AV&C + Houzé, and Gabriel Pulecio all using Touch. In 2016, it was VT Pro, Tundra, AV&C + Houzé, Ezra Miller and Daniel Schaeffer. Both years we had Touch artists create our stage LED screen control software (Elburz Sorkhabi from nVoid in 2015, Colin Hoy in 2016).
The community is full of extremely talented artists who create work that spans the fields of generative content, lighting or laser control, audio-reactive installations, and more. To me that is probably the main reason why Day For Night and the TouchDesigner community so naturally attract each other. TouchDesigner is one of the few pieces of software I know that allows for seamless translation between types of data, channels to pixels, pixels to audio, audio to 3d geometry, and more.
Working in this world where no data type is an absolute currency requires a certain type of mind, a way of thinking. I find that the artists who chose Touch as their tool of choice tend to have an insatiable curiosity, and an ability to naturally think across disciplines. That is the kind of spirit we celebrate and encourage at Day For Night, a fluidity of thought where architecture, space, sound, light, image and human body become a huge reservoir for potential, an endless playground.
PHASES | AV&C + Houzé
AV&C + Vincent Houzé rejoined forces in 2016 to create Phases, a new kinetic light and sound installation that premiered at the Day for Night festival December 17th & 18th, 2016 in Houston, TX.
AV&C is a New York based experience design and technology studio recognized for digital landmarks in the physical world. AV&C creates tightly integrated and responsive architectural interventions for brands, architects, and artists. Vincent Houzé uses modern computer graphics techniques to create interactive art, performances, and large-scale multimedia installations. His practice centers on dynamic simulations and systems in which simple rules give rise to complexity, richness, and realistic motion.
Exploring light as a sculptural medium, Phases is composed of robotic mirrors that scatter a kaleidoscope of beams, along with a generative, spatialized music score. The 21 motorized mirrors are arranged in a triangular assembly and mounted overhead in the center of a 30'-diameter circular space with scrim walls.
Derivative: Phases was a beautiful and also seemingly very complex installation with a lot of moving parts. Can you explain how it worked?
AV&C + Houzé: It was complex, but everything working together definitely helped create a coherent and seamless whole! In this piece, TouchDesigner controls the choreography of the motorized mirrors and the projected visuals via various generative algorithms, creating illusions based on what aspects of the piece are moving. In turn the visual and physical motions trigger part of the generative soundtrack, creating a tightly synchronized sensory experience. Additional layers in the system create random suspended time moments accompanied by strobing lights, adding breath to the flow of the piece.
The projected light patterns travel as visible beams through the hazed environment and are folded by the controlled mirror surfaces. Precise control over the mirrors allows us to focus all the light beams on a single point, or dramatically sweep patterns 360 degrees across the scrim walls of the room.
Derivative: The piece was very effective in the space itself. How did you go about designing the movement of the mirrors and lights before the actual set up at Day For Night?
AV&C + Houzé: Early on, TouchDesigner was used to build a previsualization of how the light bounced on the mirror surfaces. That proved to be very useful to start defining the rhythm and feel and iterate quickly on ideas, especially before the physical installation came to life.
GLSL shaders were used for simple raytracing*: calculating light-ray reflections off the mirrors and their intersections with the circular scrim surface. Since shaders execute in parallel, it was very fast to evaluate a large number of rays to good accuracy with real-time feedback. It was then straightforward to also use an Oculus Rift to get a better sense of speed and scale. Among the effects unaccounted for in the previsualization, one that came as a great surprise was the pleasure of looking directly into the mirrors!
[*Here's a simplified .toe file kindly shared by Vincent showing the technique]
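The heart of that raytracing step can be sketched in a few lines. This is a plain-Python illustration of the reflection math (not Vincent's shader): a ray direction d hitting a mirror with unit normal n reflects as r = d - 2(d.n)n.

```python
# Illustrative sketch of specular reflection, the core operation of the
# mirror-previz raytracer. Works in any dimension; shown here in 2D.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(d, n):
    """Reflect direction d off a surface with unit normal n."""
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

# A ray travelling down-right bounces off a horizontal mirror:
print(reflect((1.0, -1.0), (0.0, 1.0)))  # -> (1.0, 1.0)
```

In the GLSL version, each shader invocation evaluates one ray in parallel, which is what makes previsualizing hundreds of beams feasible in real time.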
CHOP networks in TouchDesigner controlled the mirror choreography, which was sent with the UDP DAT to Arduino boards equipped with Ethernet shields, which in turn addressed the motor controllers. In parallel, the CHOPs generated triggers that were sent over MIDI to a separate computer running Ableton Live.
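As a rough illustration of what such a UDP link might carry, here is a stdlib-only Python sketch that packs motor targets into a byte payload and addresses a board. The IP address, port, and one-byte-per-mirror format are assumptions made for illustration, not the actual Phases protocol.

```python
# Hypothetical sketch of sending mirror targets over UDP to an Arduino
# with an Ethernet shield. The payload format is an assumption.
import socket

def pack_mirror_frame(angles):
    """Clamp motor targets to 0-255 and pack them as one byte each."""
    return bytes(max(0, min(255, int(a))) for a in angles)

payload = pack_mirror_frame([0, 90.5, 300])
print(payload)  # b'\x00Z\xff'

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(payload, ("192.168.1.50", 9000))  # hypothetical board address
```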
Derivative: Both the visuals and the sound seemed impactful whether experiencing the piece up close within the 'scrim' or at a distance. Was that planned?
AV&C + Houzé: We really wanted the installation to engage viewers from several vantage points. At a distance, the beams travel well beyond the circular space, extending hundreds of feet into the surrounding volume. On the scrim walls, the projected patterns resolve out of the beams and are revealed as coherent imagery that evokes physics phenomena. Viewers inside the scrim perimeter are able to look up into the mirror array. We found that directly experiencing this light-folding object messes with the eye's expectations of where light is originating from and traveling to.
The live, evolving generative soundtrack is guided by the interplay of the mirror movements and projected animation from TouchDesigner. A cycling array of low frequency phrases, echo laden notes and stochastic percussion create a mechanical and precise soundtrack tightly tied to the choreography of the sculpture. The sounds are spatialized throughout the installation environment by means of a multi-channel audio system driven by equal panning software that allows precise placement of elements within the 360 degree sound field.
BARDO | Michael Fullman | VT PRO Design
Creative Director at VT Pro Design, Michael Fullman's applied interests lie in advanced media, projection design, interactive technologies and the kind of visual experiences that augment live performance. VT is a dynamic full service creative design company specializing in advanced design, execution, and system interaction solutions that range from projection mapping and media management to lighting design and advanced interactive technology.
Bardo is an interactive installation that uses sophisticated digital and mechanical design to explore the presence and absence of light by tracking objects through space. Bardo was developed for Day for Night.
Derivative: Michael, it seems that every time we walked by Bardo that it was full of people interacting with the light and with each other... for long periods of time! What inspired this installation?
Michael Fullman: Bardo was inspired by the desire to create a space that is both the absence and presence of geometric light in a location that was specific and architectural. What we really wanted to achieve with the installation was to create this space and invite an audience to truly interact with it and the physicality of light.
Derivative: Can you tell us how it was designed and how TouchDesigner was used in the process?
Michael Fullman: Using TouchDesigner we created a 3D representation of all the lights in the space and used infrared scanning to capture people as they moved around. Touch captured the XY coordinates of users on the ground plane and translated that information to the lighting instruments. TouchDesigner then sent Art-Net data into a Grand MA II, which merged the data and sent it out to all the lights. The Grand MA gave us a higher level of fine control and accuracy, and also ran the non-interactive, timecoded light shows that took place.
Derivative: The VT team must have been thrilled with the results?!
Michael Fullman: When we finally saw the piece live with an audience it was pretty amazing. We didn't expect the crowd that was always inside the piece to be nearly the size it was. Toward the beginning of the day people got more one-on-one with the piece itself, really pushing and testing the interactive audio and lighting; then as the day went on, more and more people would congregate and hang out inside it. We constantly saw people reaching out and trying to touch the light. It was really inspiring to watch how people interacted with it.
More BARDO at VT Pro Design
OUTLINES | TUNDRA | Large-scale laser-beam grid installation
St. Petersburg-based Tundra define themselves as a "collaborative artistic collective" whose members include musicians, sound engineers, programmers and visual artists. Their focus is to create "spaces and experiences by making sound, visuals and emotions work together" in audiovisual performances and interactive installations. Most of Tundra's projects have been premiered, reviewed and highly acclaimed by high-profile media. There's also a 2014 article about the collective on the Derivative Blog.
The installation was originally created for OUTLINE festival in Moscow (July, 2016) but unfortunately remained unseen by the public due to cancellation of the festival. In response the piece was named "Outlines" as a tribute. In Russian this word can mean "stepping outside of imaginary boundaries".
Derivative: It was exciting to finally see this piece at Day For Night after the disappointment of Outlines Festival being cancelled last summer in Moscow. How did you adapt Outlines for this venue?
Tundra: In Houston we premiered this installation to the public and the venue dictated how we positioned the lasers, so it turned into a tunnel iteration of the original Moscow setup. However, we had to split the tunnel into two sections (because of the fire marshal's request to have access to an emergency exit) which actually turned into a really nice feature because all the visitors could literally walk through and stand "inside" this tunnel and have a nice experience.
All visuals and sound were programmed and recorded from scratch at the site during the setting up of the installation. Of course we had lots of studio pre-recorded modular sounds and light-controlling algorithms from the previous original Outlines project but it simply didn't work at this particular venue. So we used the natural acoustics of the space as an instrument, mixed organic with industrial sound-design and avoided any musical scales like minor or major that might impact the visitors' perceptions and associations. Lasers were triggering the sound via custom programmed controlling algorithms.
Derivative: Can you explain a little bit the making of Outlines and how TouchDesigner was used?
Tundra: There were about 260 lasers and each was controlled individually via DMX. We used TouchDesigner in a two-way sync with Ableton Live via OSC and MIDI. TouchDesigner was randomly triggering scenes in Ableton and controlled effects depending on the visual pattern. We also built a type of audiovisual looper for making glitchy effects.
There were about 15 different patterns controlling light and sound, mixed and played in different order. On top of that we created one-shot algorithms controlling strobes triggered by randomly sliced "noisy-clicking" sounds which appeared chaotically. Altogether, a big sound system and a large scale tunnel grid of lasers created the constantly changing living mechanism of Outlines.
To control the lasers separately we used a pixel-mapping technique, where each laser's dimmer was linked to a corresponding pixel in a texture. We also built a special adjustment app in TouchDesigner to easily assign each laser's dimmer to the right pixel of the texture. All the visual content was generated in real time with standard TouchDesigner TOPs.
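The pixel-mapping idea can be sketched simply: each DMX dimmer channel samples the brightness of one pixel in the generated texture. The mapping table and 8-bit conversion below are illustrative assumptions, not Tundra's actual patch.

```python
# Sketch of pixel mapping: a DMX frame is filled by sampling one texture
# pixel per laser dimmer. Texture values are floats in 0-1, as a TOP
# would produce; DMX wants 8-bit integers.

def build_dmx_frame(texture, mapping, channels=512):
    """texture: 2D list of floats 0-1; mapping: dmx_channel -> (x, y)."""
    frame = [0] * channels
    for channel, (x, y) in mapping.items():
        frame[channel] = int(texture[y][x] * 255)
    return frame

texture = [[0.0, 0.5],
           [1.0, 0.25]]
mapping = {0: (0, 0), 1: (1, 0), 2: (0, 1), 3: (1, 1)}
print(build_dmx_frame(texture, mapping)[:4])  # [0, 127, 255, 63]
```

The adjustment app Tundra describes would, in effect, let operators edit this mapping table on site until every physical laser responded to the right pixel.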
We used our custom Max for Live device in Ableton to control visual parameters in TouchDesigner – each scene had its own automation for those parameters. TouchDesigner also switched scenes randomly (not truly randomly, but with controlled randomness, where we could set ranges for the duration and sequence order). We also linked some basic parameters of the visuals to sound effects in Ableton for the duration.
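That "controlled randomness" can be sketched as follows: scene order is shuffled rather than drawn uniformly, and each scene's duration comes from its own range. Scene names and ranges here are illustrative, not Tundra's actual values.

```python
# Sketch of controlled randomness for scene switching: shuffled order
# plus per-scene duration ranges, as opposed to pure uniform choice.
import random

def scene_schedule(scenes, duration_ranges, seed=None):
    """Return (scene, duration) pairs: shuffled order, bounded lengths."""
    rng = random.Random(seed)
    order = list(scenes)
    rng.shuffle(order)
    return [(s, rng.uniform(*duration_ranges[s])) for s in order]

schedule = scene_schedule(
    ["strobe", "sweep", "grid"],
    {"strobe": (2, 5), "sweep": (10, 20), "grid": (5, 15)},
    seed=1,
)
for scene, secs in schedule:
    print(f"{scene}: {secs:.1f}s")
```

Because every scene appears exactly once per shuffled pass, the piece keeps varying without ever getting stuck repeating one look.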
STREAM | Ezra Miller
Derivative: Ezra, this was by far the biggest screen we have ever experienced. How does one approach that amount of public pixels?!
Ezra Miller: The festival was an amazing experience, and to be able to utilize a canvas as large as the wall I had for my projection was both humbling and terrifying!
As it was really a site-specific installation, the inspiration came from the Houston site itself, and also from the way image textures were used. From the start I had to keep in mind that there would be a live stream and that my wall would be directly across from the stage. The form and silhouette of the performer became abstracted but still discernible, and the camera feed, picking up the lights synchronized to the music, ended up turning the piece into a pseudo-audio-reactive work.
Derivative: The live feed you were getting from the stage and your incredibly beautiful output looked nothing alike. Can you explain your process?
Ezra Miller: My piece was created using a network of GLSL shaders. The driving engine of the piece was a feed from a stationary camera at the Red stage trained on the performer, which allowed for the calculation of optical flow using the frame-differencing technique. From there, the optical flow is used to mix in different image textures that I've taken over the last few months living in New York.
The more motion there is in the incoming video, the more textures and colors get mixed into the network. This dynamic textural composition is fed into a feedback loop that performs a reaction-diffusion operation using the optical flow from the video feed to control the movement and direction of the feedback. It oscillates between mixing in the textural composition with the reaction diffusion system so that the image textures are legible. Showing just the reaction diffusion system results in moments of pure abstraction and color.
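The frame-differencing step Ezra describes can be sketched in plain Python: per-pixel motion is estimated as the absolute difference between consecutive frames, and the average motion becomes a mix factor for the textures. This is a simplified proxy for intuition, not his GLSL network.

```python
# Sketch of frame differencing as a crude optical-flow proxy: the
# per-pixel absolute difference between frames measures motion, and the
# mean drives how much image texture gets mixed in downstream.

def frame_difference(prev, curr):
    """Per-pixel absolute difference of two grayscale frames (0-1)."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def motion_amount(diff):
    """Average motion over the frame, usable as a 0-1 mix factor."""
    total = sum(sum(row) for row in diff)
    return total / (len(diff) * len(diff[0]))

prev = [[0.0, 0.0], [0.0, 0.0]]
curr = [[1.0, 0.0], [0.5, 0.0]]
print(motion_amount(frame_difference(prev, curr)))  # 0.375
```

In the actual piece this per-pixel signal also steers the direction of the feedback loop, so motion on stage literally pushes the reaction-diffusion patterns around.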
Derivative: What was your experience migrating from WebGL to TouchDesigner for the show?
Ezra Miller: Essentially I just used TOPs in an attempt to simulate the stuff I do in WebGL using mostly fragment shaders.
TouchDesigner saves a ton of time because you don't have to code out individual frame buffers and such, so it was really great to use for this project. I would really love to do more installations using TouchDesigner as it's a fantastic piece of software, and one that really lends itself to experimentation and expression. Without all the overhead I normally deal with in WebGL, I'm able to get really sophisticated networks of shaders built in a fraction of the time, which gives me more time to be creative.
OCTA | Daniel Schaeffer | James Templeton: music | Eric Todd: design
Daniel Schaeffer is a multimedia artist from Houston, Texas. He uses tracking/sensing technology and large format displays in tandem to allow people to interact naturally with their environment. "I see no boundary between lighting and video in my designs, using lights as graphic displays, and video screens as lights."
Derivative: Daniel, OCTA is a very interactive installation that has a performative mode as well – what was your inspiration and what came first?
Daniel Schaeffer: The central tenet of OCTA is that it is an audiovisual performance held inside an interactive installation.
Derivative: Can you describe the process? How the Installation and Performance aspects differed and how TouchDesigner was used?
Daniel Schaeffer: During the installation portion, people interact with a version of the lighting and musical effects from the performance using an overhead array of twelve Kinect sensors. Four computers running TouchDesigner run all the Kinects; three of them send their Kinect views to the master over Touch Out TOPs. On the master, the nine incoming Kinect views are composited along with the three native Kinect views into one large view of the whole room, with overlapping zones in x and y that are blended.
This is then cropped and analyzed into individual trigger zones for each light. The trigger channels, and mode change channel are sent over OSC to Ableton/Max.
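The trigger-zone analysis can be sketched like this: the stitched room view is divided into rectangles, one per light, and a zone fires when enough of its pixels show presence. The zone layout and threshold below are illustrative assumptions, not OCTA's actual values.

```python
# Sketch of cropping a composited occupancy view into per-light trigger
# zones: a zone fires when mean occupancy exceeds a threshold.

def zone_triggered(view, x0, y0, x1, y1, threshold=0.5):
    """True if mean occupancy inside the [x0,x1) x [y0,y1) zone exceeds threshold."""
    pixels = [view[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    return sum(pixels) / len(pixels) > threshold

view = [[1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 1]]
print(zone_triggered(view, 0, 0, 2, 2))  # True  (3/4 occupied)
print(zone_triggered(view, 2, 2, 4, 4))  # False (1/4 occupied)
```

Each zone's boolean then becomes a trigger channel, which is what gets sent over OSC to Ableton/Max.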
During the performance the data flow is reversed, and Ableton/Max triggers TouchDesigner to move through a series of cues, each of which contains programming for one or more instruments or automations. Spatial audio effects, such as the speed of movement or spread of the sources, can be controlled from the performer's pad, and those values are streamed to Touch for a lighting effect that represents the change in spatial audio. A 3D visualizer rendered in TouchDesigner allowed me to sit down with the musical artist and get a clearer idea of how the programming would feel in the space.
Derivative: You recently repurposed OCTA for an improvisational live performance system?
Daniel Schaeffer: Repurposing the animations from a choreographed set to a live system was easy with TouchDesigner. I simply swapped what was driving the index of the switch between the different animations: instead of being controlled by Ableton through OSC, it is now controlled by the radio parameter of a container of radio buttons.
DAY FOR NIGHT SCREENS | Colin Hoy
Colin Hoy is a software developer at AV&C in New York City where his focus is developing real-time control systems for architectural installations. In addition, he spends time freelancing on various interactive projects throughout the year. Apart from Day For Night, a few of his projects from the past year include data visualizations and control systems for Google I/O, New York Fashion Week, and A&E Networks. He uses TouchDesigner as a platform for most of his work due to its extensibility and wide range of applications - in addition to its excellent community.
Derivative: Colin, first and foremost, congratulations on being in so many places at once and making things look easy and beautiful! Tell us about the system you designed and what it did...
Colin Hoy: The system handled all the interstitial content and playback for the Red (main), Green, and Blue stages. In addition, it handled content playback for the Heineken LED booth at Red stage and the massive dual projector wall behind Red stage – where Ezra Miller's piece was displayed.
All the content was completely generative apart from branded movie clips provided by the festival organizers. Generative design modules included gradient patterns, concentric/line patterns, live typing, zooming type, and countdown timers. Several interactive elements, such as live Twitter feeds and an SMS messaging game, were also displayed on the massive wall at Red stage.
The system had a fully developed user interface that allowed operators to control visual elements on any preview or live output in real time. The visual modules were designed around a background/foreground concept, allowing the operator to layer elements together to create a large array of different themes and looks. Each generative module also has an array of parameters that can be adjusted in real time through the UI to modify the visuals.
Derivative: Can you explain what aspects of TouchDesigner were particularly useful here?
Colin Hoy: The logic of the network is based around a Python Extension that handles the control system: the A/B deck logic, source selection, parameter selection, video routing, color controls, etc.
Extensions are an extremely powerful feature of TouchDesigner. They not only help centralize code, but also expose Python classes and functions that drastically expand the functionality and scripting ability of a network. For the visual modules, I wrapped them all in custom components and used custom parameters to control operators in real time. This not only makes it easy to hook up UI controls, but also provides custom control of modules that you don't get by default.
For the user interface, I relied heavily on Clones paired with Table COMPs, Replicator COMPs, and Select COMPs. By using these operators in tandem, large and complex user interfaces can be made quickly and efficiently. Sweeping changes to the entire interface can be made by simply modifying a master Clone operator – one simply needs to be prudent with the use of Clone Immunes.
Lastly, for compositing my outputs and pixel mapping I used a GLSL shader. All outputs were fed into a GLSL Multi TOP dropped inside a custom component, with custom parameters for positioning, sizing, and opacity within the master raster. The shader optimized the entire network drastically by allowing all compositing to be done only once – and in a single location.
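The single-pass compositing idea can be illustrated with a plain-Python stand-in for that shader: every output is blended into one master raster at a given position and opacity, once, in one place, instead of through a chain of intermediate composites. The function and values below are illustrative, not Colin's shader code.

```python
# Sketch of compositing into a master raster: alpha-blend each output
# patch into the master buffer at (x, y) with a given opacity.
# One grayscale channel for clarity; the GLSL version does this per
# fragment across all outputs in a single pass.

def blit(master, patch, x, y, opacity):
    """Blend patch into master at (x, y) in place; values are 0-1 floats."""
    for j, row in enumerate(patch):
        for i, v in enumerate(row):
            m = master[y + j][x + i]
            master[y + j][x + i] = m * (1 - opacity) + v * opacity
    return master

master = [[0.0] * 4 for _ in range(2)]
blit(master, [[1.0, 1.0]], 1, 0, 0.5)
print(master[0])  # [0.0, 0.5, 0.5, 0.0]
```

Doing all placement in one shader means each output pixel is written exactly once per frame, which is where the optimization Colin mentions comes from.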
TouchDesigner is a critical part of my workflow on all projects and I drew inspiration from all the amazing support the community has provided over the years. I hope this project can inspire others to learn TouchDesigner and push existing users to explore the many yet unexplored features TouchDesigner has to offer.
Well that about wraps things up in Houston for this instalment of Day For Night! Once again we'd like to thank the artists, the festival and everyone involved for investing such consideration, ingenuity and energy into their work, for the electrified atmosphere and synaptic links invariably sparked… and of course for the warm hospitality we experienced throughout the festival. Needless to say, we anticipate year 3 with barely contained enthusiasm! See you back in Houston!