We've been enormously inspired by these quick visual explorations made by London-based contemporary classical composer Benjamin Heim. We also noticed that Ben's A/V performance “Essays in Interaction” was premiering soon at Splice Festival May 12 and wanted to find out what he was planning and how TouchDesigner might be involved. Ben was gracious enough to explain his transition from working with sound to creating audiovisual compositions, and how looking for ways to make the music and visuals more integrated and reactive led him to TouchDesigner.
Derivative: Ben please tell us a bit about yourself and your background.
Ben Heim: I am a contemporary classical composer with a focus on the intersection between technological innovation and expressive performance practice. So my background began in instrumental and electronic composition and has since branched out into immersive events, projects with machine learning, live visuals, and a few more exciting areas that I can't talk about yet! I'm basically a junkie for any kind of emerging technology that I can create music and art with. I also do a bit of film music here and there.
D: I'm assuming your practice started with sound and then visual? What was that progression and what inspired it?
BH: I always had a strong interest in visuals in relation to photography and videography but the first time that I really connected that with music was with an immersive concert series that I co-founded back in Australia called Argo. The idea was to create immersive concerts tailored to unique venues that combined contemporary classical and electronic music in surround sound.
We ended up booking a show at St John's Cathedral in Brisbane, which has this amazing vaulted ceiling and lovely acoustic, and we thought it would be amazing if we could project visuals over the roof and walls of the venue. So I ended up down at the beach filming waves and bubbles from under the water and combining that with a few basic geometric sequences made in Apple Motion to create visuals for the event.
From there we did two more events involving visuals, but I was always looking for ways to make the music and the visuals more integrated and reactive, which eventually brought me to TouchDesigner. Fast-forward to the present day and I am addicted to making the closest, most intimate connections between visuals and audio that I possibly can. It is an absolute pleasure to work with TouchDesigner in combination with Ableton Live and Max/MSP and explore the incredibly powerful ways audio and visuals can be seamlessly melded.
D: How did you discover TouchDesigner and why did you start using it?
BH: I discovered TouchDesigner when I was just gearing up to produce my first immersive event in London (Phantoms). The event was based around live-streaming technology, and the general concept was to have three musicians in three different rooms performing simultaneously and then combining the video feeds together in an abstract way to produce an event that could be experienced live, as well as streamed online. I also really wanted there to be a way for the online audience to interact with the live visuals somehow.
Originally I was planning to use Resolume for this project, but I saw a post on the CDM blog announcing the release of TouchDesigner 099 and it seemed like the ideal software for this event, so I went with it. It turned out to be perfect for handling the multiple streams of video and the online interaction. I also think it was really good for me to have an event that I was going to do with TouchDesigner, because it meant I had to learn my way around it quickly!
D: What is your motivation for the visual explorations you are doing with TouchDesigner? Like the diffusions and the oscilloscope Instagram posts for example?
BH: I think the core of what I am always trying to create with TouchDesigner is an interactive audiovisual system which I can perform with, and in which there is a deep two-way connection between visuals and sound (so sound creates visuals and visuals also create sound). The first project I built that acts like this was 'Playing Chaos': it involved a particle system where I could play a MIDI keyboard, particles would be released at different places in space, and strike velocity would determine the speed each particle was released at.
The particles would then bounce off the edges of the screen and split. I then grabbed the data from the particle system and sent it to Max/MSP so that new notes would be triggered every time a particle hit a wall and split. I also added turbulence effects to the particles and a system that changed the volume of backing drones depending on pixel values in the final image.
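The core loop Ben describes — MIDI strikes spawn particles, wall hits trigger new notes — can be sketched in plain Python. This is only an illustrative model of the idea, not Ben's actual TouchDesigner/Max patch: the spawn mapping and the note formula here are hypothetical choices.

```python
class Particle:
    """A 2D particle moving inside a unit square."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy

def spawn_from_midi(note, velocity):
    """Hypothetical mapping: pitch class picks a horizontal position,
    octave picks a vertical one, strike velocity sets the speed."""
    x = (note % 12) / 12.0
    y = (note // 12) / 11.0
    speed = velocity / 127.0
    return Particle(x, y, speed, speed * 0.5)

def step(particles, dt, events):
    """Advance all particles; on a wall hit, reflect, record a note
    event (this is what would be sent on to Max/MSP) and split."""
    children = []
    for p in particles:
        p.x += p.vx * dt
        p.y += p.vy * dt
        hit = False
        if not 0.0 <= p.x <= 1.0:
            p.vx = -p.vx
            p.x = min(max(p.x, 0.0), 1.0)
            hit = True
        if not 0.0 <= p.y <= 1.0:
            p.vy = -p.vy
            p.y = min(max(p.y, 0.0), 1.0)
            hit = True
        if hit:
            # impact position chooses the new pitch (hypothetical formula)
            events.append(int(36 + p.x * 48))
            # split: a child particle leaves at a perpendicular angle
            children.append(Particle(p.x, p.y, -p.vy, p.vx))
    particles.extend(children)
```

Running `step` repeatedly and feeding `events` to a synth is the whole feedback idea in miniature: the keyboard drives the visuals, and the visuals drive new notes back.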
The diffusion stuff is a new experimental area for me. I love how artists such as 404.Zero can create incredibly detailed visuals using these kinds of effects and would like to incorporate them into my own work, so I have been documenting my experimentation with it on Insta! It's a unique way of working: because it is such a naturally evolving process, it is quite a challenge to form a concrete connection between it and audio.
The oscilloscope stuff was born out of my love of analogue synths and wanting to turn the pure sound into visuals. There is no way as pure as the oscilloscope! The great thing about TouchDesigner is that it allows me to go far beyond what a normal Lissajous figure can be. A few examples that you will see at Splice Festival include three-dimensional Lissajous figures created by using three separate signals to drive the x, y and z displacements, as well as grabbing data from the visuals and sending it back to the synth as control voltage to create a very unique feedback system.
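The three-signal idea is easy to demonstrate outside TouchDesigner: sample three sinusoids and use them as x, y and z displacements. This is a minimal sketch of the geometry only — the frequency ratios and phases below are illustrative, not taken from Ben's performance patches.

```python
import math

def lissajous_3d(freqs, phases, n=1000, dur=1.0):
    """Sample a 3D Lissajous figure: three sinusoidal signals drive
    the x, y and z displacements respectively."""
    fx, fy, fz = freqs
    px, py, pz = phases
    points = []
    for i in range(n):
        t = dur * i / n
        points.append((
            math.sin(2 * math.pi * fx * t + px),
            math.sin(2 * math.pi * fy * t + py),
            math.sin(2 * math.pi * fz * t + pz),
        ))
    return points

# A small-integer frequency ratio (here 2:3:4) closes the curve
# into a stable, knot-like figure; irrational ratios never close.
points = lissajous_3d((2.0, 3.0, 4.0), (0.0, math.pi / 2, 0.0))
```

In an audio context the three "signals" would be live oscillator outputs rather than computed sinusoids, which is what makes the figure a direct picture of the sound.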
D: Can you "leak" a little bit about your upcoming performance at Splice Festival and future creative plans...
BH: I have titled my Splice performance "Essays in Interaction" as I see it as a big exploration of how I can interact with audio and visual elements as a performer. I have built a number of these two-way audiovisual systems (previously discussed) that I will perform on over the course of the evening. These range from the oscilloscope and particle systems; to a keyboard-based system in which notes create swirling colour clouds which are then sonified using an array of 300 bandpass filters; to kinetic MIDI drum performances and a minimal line-based feedback system controlled by musical elements.
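The "colour clouds" sonification amounts to mapping image data onto the gains of a bank of bandpass filters. Here is a deliberately simplified sketch of one way that mapping could work — the logarithmic spacing, frequency range, and pixel sampling are all hypothetical choices, not Ben's actual implementation.

```python
def filter_gains(pixels, n_filters=300, f_lo=80.0, f_hi=8000.0):
    """Map a row of pixel brightness values (0..1) onto gains for a bank
    of bandpass filters spaced logarithmically between f_lo and f_hi.
    Returns a list of (centre_frequency, gain) pairs."""
    bank = []
    for i in range(n_filters):
        frac = i / (n_filters - 1)
        # logarithmic spacing so the bank covers the spectrum musically,
        # with equal density per octave rather than per Hz
        freq = f_lo * (f_hi / f_lo) ** frac
        # sample the pixel row at this filter's relative position
        px = pixels[min(int(frac * len(pixels)), len(pixels) - 1)]
        bank.append((freq, px))
    return bank

# A bright region on the right of the image boosts the high filters.
gains = filter_gains([0.0, 0.5, 1.0], n_filters=300)
```

In practice the brightness values would be read from the live image every frame and sent to the filter bank as control data, so the swirling clouds are heard directly as shifting spectra.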
My future creative plans are extremely extensive. I will be continuing to perform as an audiovisual artist, and continuing to work on material for a visual-album-style release. I also have a violin and electronics duo that I am currently retrofitting with live visuals, so you should also be hearing more from that project soon. Further down the track I have a few projects in the pipeline in the areas of machine learning, immersive events, and virtual reality.
Many thanks to Ben for letting us distract him from his preparations for his upcoming Splice performance, and a reminder to our UK-based readers (and anyone attending the festival) to go and see Ben's A/V performance “Essays in Interaction” premiering at Splice Festival May 12!