Aaron Alden's artistic practice has been rich and diverse. Starting with minimal abstract painting in the late ’90s and running a small gallery in Denver, he moved to NYC in 2000. Shifting gears, he ventured into advertising and later co-founded a bi-coastal music and sound design company. Aaron now focuses on interactive, generative, and code-based tools. We're mighty pleased that TouchDesigner has become a key part of his creative toolkit.
Derivative: Hi Aaron, could you start by telling us a bit about yourself and your artistic practice?
Aaron Alden: Hello and thanks! I’m a generative artist and composer in Brooklyn, making interactive light sculptures and exploring new ways of using data and code to create (and alter) algorithmic art works.
Derivative: When did you first discover TouchDesigner, and how has it become an essential (or if you prefer, useful) tool in your creative process?
Aaron Alden: I discovered TouchDesigner after many years of creating work through much less convenient methods (manually translating spreadsheets into musical notation, etc). I kick myself for not learning it sooner. Covid was my TouchDesigner deep dive. I now use it for all the things you’d expect (generative, interactive, VFX, LEDs, projections, visualizations) but also for prototyping processes that I eventually recreate in simple microprocessor-based installations. I’ve been making work in all forms of new media since Macromedia Flash and floppy disks, and there are few tools that can combine so many powerful workflows in one package. And TD is the only choice for some emerging tools like real-time, interactive Gaussian splats.
Derivative: We touched on this very briefly at TouchIn NYC, but could you share more about the transition you made from making abstract paintings to working in digital arts? What was your trajectory and what guided you down that path?
Aaron Alden: It was quite a journey! After forming an art collective/gallery in Denver and starting to sell paintings pretty regularly, I moved to NYC in 2000 to be an art star but instead got a big punch in the gut discovering NO ONE cared about paintings at the time. I quit painting, worked in advertising and created a bi-coastal music production company making music for car commercials. It was a great time for many years with a firehose of cash blasting in our direction. Our music was everywhere (10 to 15 commercials on air at any given time) and we had ultimate creative freedom and clients that loved our work. But the bubble eventually burst. The larger music industry is unfortunately permanently damaged. And while I won’t say AI will have only a negative effect on it, the ways musicians make money will once again be dramatically compromised. So I decided to look for a new path. I had been making occasional art projects in Max/MSP for many years but somehow didn’t notice this world was exploding into overlapping genres of magical tech art wizardry. As soon as I started playing simple shapes in TouchDesigner it felt as if several different creative careers were immediately glued together.
Derivative: Weatherbrush is a beautifully artistic project with practical applications as well. What inspired you to create it? Or, put another way, which came first—the weather or the brush?
Aaron Alden: Thanks! Actually, what came first was Lake Heckaman’s YouTube tutorial on using APIs in TouchDesigner which used Visual Crossing’s weather data. Project: Done! (Not quite, of course). Randomly, Lake emailed me about a week later. That’s the beauty of the TD family. Nodes connect, we build amazing things, we share our skills. Cheers to every lovely human I’ve met in this community.
But okay, why weather?
Weather is a common DataVis tutorial topic because the data has outcomes we understand. “Should I bring a jacket today?”
I wanted to learn how to fetch data from APIs to start working with other “more complicated” streams of data. As I was learning to parse and play with weather data I realized this could be a great case study to pitch other projects, showing how we can learn just a few simple rules that allow us to visualize data beyond number-filled charts and graphs.
Another example from my studio pitch deck is the hands of a clock. We completely forget that we’re looking at two sticks spinning on alternate overlapping timescales. We just see “1 o’clock”, and usually even have an innate understanding of whether that means AM or PM. I made this project specifically to show that complicated data sets can be communicated through art—that we can create beautiful works of “abstract” art that live on our walls and screens and still contain clear, immediately decipherable data.
Derivative: Could you walk us through how TouchDesigner was essential in transforming 8,400 data points from Visual Crossing’s API into Weatherbrush’s unique visual language?
Aaron Alden: The process starts with an API fetch script (Python) grabbing a full week of hourly forecast data for 10 weather stations along the Hudson River from NYC to Albany. The results are converted to JSON, parsed and processed. Next, each type of data (precipitation, temperature, etc) is adjusted by various math functions to set min/max ranges.
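For readers curious what that fetch step can look like, here is a minimal Python sketch in the spirit of Aaron's description. The endpoint shape follows Visual Crossing's public Timeline API; the station list, key handling, and field choices are illustrative assumptions, not his actual script:

```python
# Minimal sketch only: endpoint shape per Visual Crossing's Timeline API docs;
# stations, key handling, and field choices are illustrative assumptions.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_VISUAL_CROSSING_KEY"
STATIONS = ["New York,NY", "Poughkeepsie,NY", "Albany,NY"]  # hypothetical subset of the 10

def fetch_hourly(location):
    url = (
        "https://weather.visualcrossing.com/VisualCrossingWebServices"
        f"/rest/services/timeline/{urllib.parse.quote(location)}"
        f"?unitGroup=us&include=hours&contentType=json&key={API_KEY}"
    )
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Flatten each response (days -> hours) into one row per station-hour.
rows = [
    (loc, hour["datetime"], hour.get("temp"), hour.get("precip"),
     hour.get("windspeed"), hour.get("pressure"))
    for loc in STATIONS
    for day in fetch_hourly(loc)["days"]
    for hour in day["hours"]
]
```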
It's important to scale and set limits so normal days and extreme days can both be drawn by the same rules. I researched likely extremes to set the initial values. But as Fall shifted to Winter, the wind readings went nuts and the brushes were jumping off the page. I had to cushion the high end a bit.
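The scaling he describes boils down to a remap with a clamp, something like this small sketch (the wind ceiling below is a made-up value):

```python
# Remap a raw reading to 0-1 against researched bounds, clamping outliers so
# an extreme day still draws with the same rules. Bounds here are made up.
def remap_clamped(value, in_min, in_max):
    t = (value - in_min) / (in_max - in_min)
    return max(0.0, min(1.0, t))

wind_norm = remap_clamped(62.0, 0.0, 45.0)  # a winter gust past the ceiling clamps to 1.0
```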
Then this ranged data is merged back to one table (DAT) per location. Once drawing is activated, a timer reads through the columns of data and updates the condition of 10 virtual paint brushes. The paint brushes are very thin noise TOPs animating (absTime.seconds baby!) to mimic brush bristles swaying slightly as they move across the page.
So in plain TD terms, it’s 10 noise segments, increasing in density to represent barometric pressure, changing color based on temperature, vertical position based on wind, being displaced (and comped with layer functions) to represent precipitation, smudged and blown out with a layer mask to represent snow, transformed across the canvas, and held on screen with a feedback loop.
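As a rough illustration of that timer-driven update, here is a hedged sketch assuming one Table DAT per station and a noise-based brush COMP with custom parameters; every operator and parameter name is a placeholder, not Aaron's actual network:

```python
# Hypothetical names throughout: 'data_loc1' is a per-station Table DAT of
# ranged values, 'brush1' a noise-based brush COMP with custom parameters.
def update_brush(fraction):
    table = op('data_loc1')
    row = 1 + int(fraction * (table.numRows - 2))          # skip the header row
    brush = op('brush1')
    brush.par.Density = float(table[row, 'pressure'].val)  # pressure -> bristle density
    brush.par.Tempc = float(table[row, 'temp'].val)        # temperature -> color ramp lookup
    brush.par.Windy = float(table[row, 'wind'].val)        # wind -> vertical position
    brush.par.Wetness = float(table[row, 'precip'].val)    # precipitation -> displacement
```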
And then I have several MovieOut TOPs following the timer to automatically export a movie and still png with explanation overlays (for instagram) baked in.
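The export step could be wired up along these lines, assuming a Timer CHOP callbacks DAT and a Movie File Out TOP; the operator name, callback wiring, and filename scheme are assumptions for the example:

```python
# Assumes a Timer CHOP callbacks DAT and a Movie File Out TOP named
# 'moviefileout1'; the filename scheme is invented for the example.
import datetime

def onDone(timerOp, segment, interrupt):
    out = op('moviefileout1')
    out.par.record = 0  # stop recording once the drawing pass finishes
    out.par.file = f"export/weatherbrush_{datetime.date.today().isoformat()}.mov"
    return
```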
Derivative: One of Weatherbrush's innovations is the way different weather elements are represented through brushstroke changes. How did TouchDesigner’s animation capabilities help you mimic real-world paint behaviors, like spreading with rain or lifting with snow?
Aaron Alden: At first I had aimed to make this an entirely code-based project. I love p5.js for certain things but for this, the results looked like standard flat DataVis charts. I could only get the lush, expressive “wet paint” results in TouchDesigner. I needed precise control of layering, several noise layers comped together for each brush, evolving at different speeds, different noise seeds so they don’t look like exact copies, a subtle emboss effect to give the feeling of thicker paint, color swatches based on complex gradient ramps that emulate physical paint colors, etc. Anyone who has spent time painting knows that most paint colors aren’t just one hue throughout their tonal range. They can have different characteristics based on the thickness and amount of light that is bouncing through. For instance, orange paint often feels bright and vibrant when thin, but dark and burnt when thicker. I spent two full weeks adjusting color ramps to mimic real-world characteristics of my favorite paints.
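As a toy illustration of that ramp idea (with made-up stops, not Aaron's calibrated values), a paint color can be defined by keyed stops so its thin and thick ranges read differently:

```python
# Made-up stops for an 'orange' that reads bright when thin, burnt when thick.
ORANGE_STOPS = [
    (0.0, (1.00, 0.70, 0.30)),   # thin: bright, vibrant
    (0.5, (0.95, 0.45, 0.10)),
    (1.0, (0.45, 0.15, 0.05)),   # thick: dark, burnt
]

def ramp_lookup(stops, t):
    t = max(0.0, min(1.0, t))
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if p0 <= t <= p1:
            f = (t - p0) / (p1 - p0)
            return tuple(a + (b - a) * f for a, b in zip(c0, c1))
    return stops[-1][1]
```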
There are a hundred little things happening in TouchDesigner that would be impossible (for me at least) to accomplish in code. Much of it may be technically possible, but for me it was essential to see it happening live in TouchDesigner.
Derivative: With 10 stations over 365 days, the project involves massive data. What TouchDesigner nodes or systems did you rely on to manage and render such a high volume of data-driven animation?
Aaron Alden: It is indeed massive. I’m sure most people would agree that the way you start building a large project is rarely the way you would build it a second time. So I might have to focus on what I’m not doing right. When this is running on my Mac M1 Max I’m getting 7 to 12 fps. Not exactly a live experience. I’m doing a lot more DAT juggling than I should—filtering JSON content, parsing / re-ordering again, analyzing, then merging. It gets very heavy. I’d say doing it all in nodes helped me understand the steps but if I were to rebuild it I would handle a lot more of the processing with better code vs more DAT nodes.
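That "better code vs more DAT nodes" idea might look like this sketch: parse the JSON once in Python and write a single table, instead of chaining filter/sort/merge DATs. The target DAT name and column mapping are illustrative:

```python
# Runs inside TouchDesigner: parse once, write one table. The target DAT name
# and column mapping are illustrative.
import json

def rebuild_table(json_text, target='data_loc1'):
    data = json.loads(json_text)
    table = op(target)
    table.clear()
    table.appendRow(['hour', 'temp', 'precip', 'wind', 'pressure'])
    for day in data['days']:
        for h in day['hours']:
            table.appendRow([h['datetime'], h['temp'], h.get('precip', 0),
                             h['windspeed'], h['pressure']])
```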
Still, to answer the question, what helped a lot was to divide the project into active steps, focusing processing power on each task and making sure certain comps aren’t cooking until needed. The project has 4 main comps: 1) API fetching / converting to JSON, 2) JSON Data sorting, 3) math and range adjustments, 4) drawing (and that one is HUGE, definitely would make a room full of TD devs gasp).
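Gating the heavy stages can be as simple as toggling TouchDesigner's allowCooking flag per COMP; the names below follow the four stages he lists but are otherwise hypothetical:

```python
# COMP names follow the four stages above but are otherwise hypothetical.
STAGES = ['fetch', 'sort', 'ranges', 'drawing']

def activate(stage):
    for name in STAGES:
        op(name).allowCooking = (name == stage)

activate('drawing')  # e.g. once data prep has finished
```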
Derivative: Parallel Tones is another recent work of yours and very different from Weatherbrush. Can you please tell us a bit about this piece and how TouchDesigner was used in making it?
Aaron Alden: Parallel Tones is a sculptural light installation with 10 LED strips suspended in transparent fabric layers, resembling a minimal abstract painting from either end, revealing distinct individual layers from the side. It feels like a smoke-filled room contained within several thin volumetric slices. TouchDesigner (via TDAbleton) is reacting to a musical score I composed. It is a simple progression through evolving color ramps.
The main chords of the composition trigger envelopes sent to Speed CHOPs pushing the evolution along in steady breathing pulses. This piece is part of a growing set of tools I’m developing for audio-reactive works which create a continual generative loop—the music triggers changes in visual states, which are converted into data which feeds back to Ableton triggering algorithmic changes in the music, and so on. Eternal cascading shifts.
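One plausible wiring for that trigger path, sketched entirely with placeholder names: a CHOP Execute DAT watches a chord trigger channel coming from TDAbleton and pulses a Trigger CHOP whose envelope feeds the Speed CHOPs:

```python
# Placeholder wiring: a CHOP Execute DAT watches the chord trigger channel;
# 'chord_env' is an imagined Trigger CHOP feeding the Speed CHOPs.
def onValueChange(channel, sampleIndex, val, prev):
    if val > 0.5 and prev <= 0.5:              # rising edge = new chord
        op('chord_env').par.triggerpulse.pulse()
    return
```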
As a painter and now working with light, I believe there is great emotional power in subtle, elegant combinations of light and I’m really enjoying bringing my music into the work as well. The music I write, on its own, has a pretty small audience. But the music I write, combined with beautiful light art: much more universal love.
Interestingly, I stopped refining the colors for this project sooner than expected. I consider myself to have a refined sense of color blending, but as the first (of 6) prototypes was built, I found that mixing colors in transparent physical space allows certain wild color combinations to generate distinctly “feelable” synesthetic impact. Instead of dictating exactly what color combinations it would create, I decided to “become part of the experiment” as they say.
Derivative: Looking ahead, you mentioned using this approach for new projects. Could you tell us a bit about other projects in development and further afield, what's on your radar?
Aaron Alden: I’m working on some data-based projects that will be so obvious and far-reaching once they exist in the world that I probably shouldn’t mention them until they are a little further developed (and hopefully, funded).
But I can say, I’ve just been hired to design a generative lighting installation for the entire exterior of a classic (Andy Warhol hangout) building on Broadway in SoHo.
68 windows on 2 sides of a building on 4 floors, running generative patterns synced over Wi-Fi. No problem!
(Artist short-circuits, explodes.)