
Exploring Humanized Algorithmic Art with INFRATONAL

INFRATONAL is the audiovisual project of Parisian multidisciplinary artist Louk Amidou, who uses hand gestures to perform both music and visuals in mesmerizing compositions while exploring the links between humans and algorithms. After cutting his teeth as a teenager in the 1990s Atari Demoscene, followed by an education in art and computer graphics, a frustrated Amidou left the field completely with a promise to return once computers and techniques had progressed enough to allow for more fluidity and autonomy in the creation process. Now, 20 years later, having discovered TouchDesigner as the "perfect tool to access the full potential of real-time digital creation with or without coding", INFRATONAL is also Amidou's gloriously jubilant comeback to graphics. Promises made and kept. We were fascinated to learn more about Louk, his creative process and future plans, and are very pleased to announce that he is releasing Instant Probe here to the community: a small utility, named in honor of the great tool in the palette, that he tells us helps "optimize the network by design". Enjoy!

 

Derivative: Can you tell us a bit about your background?

Louk Amidou: I studied art and computer graphics (in addition to digital communication studies), but my first creative and coding school, so to speak, was the Atari Demoscene in the '90s, when I was a teenager. I was dedicated and passionate about coding, designing graphics, and composing music for what were then called demos. These were technical and creative experiments in which we pushed personal computers to their limits and beyond traditional usage. After my studies, I was involved in several musical projects before building a professional career leading an agency in the digital communication field. For the past few years, I've returned to more creative projects.

A post shared by Infratonal (@infratonal_av)

 

Derivative: What attracted you to TouchDesigner and how does it fit into your workflow?

Louk Amidou: The anecdote is that after one year of programming a film in C, I completely gave up graphic design at the end of my computer graphics studies. At that time, computer graphics as an industry was just beginning: the era of Softimage and Silicon Graphics workstations. Expressing spontaneous creativity was a challenge; rendering times were long and working with 3D-generated images was tricky. I struggled to get the results I was looking for. I made a promise to myself that I would come back to 3D image design the day real-time techniques allowed me to control the results more intuitively.

20 years later, enter TouchDesigner. I began to use the software in 2020, which was a bad year for many reasons but a perfect year to deep-dive into creative coding. TouchDesigner gave me the means to realize the promise I had made to myself all those years ago.

TouchDesigner is, for me, the perfect tool to access the full potential of real-time digital creation with or without coding. It's also very playful to use, and that's a fundamental characteristic.

Derivative: Can you explain more about how you are using TouchDesigner to realize your work?

Louk Amidou: TouchDesigner is the primary tool I'm using for my creative projects. It fits perfectly with my approach by allowing me to manipulate media and data in real-time, and I want to leverage that. Coming from a musical practice, I'm using TouchDesigner to play the algorithmic visuals and sounds like an instrument. My pieces are almost never recorded with keyframing or parametric algorithms, but they are performed and recorded in real-time to inject a human touch.

I'm using TouchDesigner for two specific functions: to produce images and to organize the digital flux into a central system. TouchDesigner is therefore the link between algorithm and spontaneous human intention.

For this reason, there are three specific steps in my process: 

  • The building process, where I work on the system, which is like creating the instrument.   
  • The conception or composition, where I compose the visual and musical creative forms.   
  • The performance, where I use the system like an instrument to produce the final result.

On the musical side, the composition and sound design are done with Ableton and Max.

I performed and composed music in a "traditional" way for years, but now I'm more and more interested in alternative ways to compose, beyond chord progressions, rhythm and tempo. I'm very influenced by the concepts behind musique concrète from 1950s France. I try to use sound itself as the instrument, where movement is the performative gesture for a "sonic travel inside prepared sound masses". With the evolution of sensor technologies and computer vision, it's fascinating to experiment with the future of musical performance without a physical instrument as such.

TouchDesigner enables building hundreds of new invisible instruments to experiment with alternative ways in which music and visuals can be performed.

Derivative: Perhaps you can explain your process in greater detail. I understand it's TouchDesigner for the visuals and Ableton + Max for the music, but can you break it down for us?

Louk Amidou: On the technical side, I'm working on a unique TouchDesigner modular network that I'm continuously improving. It is composed of:   

  • Input modules that receive all kinds of controllers, sensors and incoming data, which I then normalize for use in the other parts of the system (see the sketch after this list).
  • A matching module, which maps the data I choose to the music or the visuals. That's where I concentrate most of my effort, finding new ways to match them smoothly and logically.
  • Different scene modules where I build the visuals and the link with the music.
  • A switch module where I can navigate between tracks, scenes etc.
  • Several post-rendering and compositing modules.
  • A recording module.
  • ...
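
To make the normalization step concrete, here is a minimal Python sketch of what such an input module might do, rescaling raw controller and sensor ranges to a common 0..1 scale before routing. The source names and ranges are our illustrative assumptions, not Louk's actual module:

```python
# A minimal sketch of input normalization, with illustrative
# source ranges; a real module would read live CHOP data instead.

RANGES = {
    'midi_cc': (0, 127),     # MIDI controller values
    'leap_x':  (-200, 200),  # Leap Motion palm position, in mm
    'touch_y': (0, 1080),    # touchscreen pixel coordinate
}

def normalize(source, value):
    """Rescale a raw input value to 0..1 so every downstream
    matching module can treat all sources uniformly."""
    lo, hi = RANGES[source]
    v = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, v))  # clamp to keep mappings stable

print(normalize('midi_cc', 64))   # ~0.504
print(normalize('leap_x', -350))  # clamped to 0.0
```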

In my studio, I use a Mac with Ableton and Max for the sound design and music, and a PC to produce images with TouchDesigner. The two computers communicate via OSC, with audio carried over Dante. On the PC side, I use a variety of controllers and sensors (MIDI, Leap Motion, RealSense, a touchscreen...). A camera continuously films the scene, which allows me to obtain images and data from the human interaction. These images are sent back to TouchDesigner for live recording or real-time compositing. This integrated system enables smoother experimentation on the dialogue between human intention and digital forms (both visual and musical), and it allows me to play visuals like I play music, with an "algorithmically intelligent instrument."
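
For readers new to OSC, the Mac-to-PC leg of such a setup can be illustrated with the python-osc package, which is just our stand-in here; in practice Max's udpsend object and TouchDesigner's OSC In CHOP handle this natively. The IP, port and OSC address below are placeholders:

```python
# Illustrative only: sending a normalized control value from the
# music machine to the PC running TouchDesigner over OSC/UDP.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('192.168.1.20', 7000)  # hypothetical PC address/port
client.send_message('/control/cutoff', 0.42)    # hypothetical OSC address
```

On the TouchDesigner side, an OSC In CHOP listening on the same port exposes the incoming value as a channel, ready for the matching module.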

Derivative: How does being able to work in real-time influence your practice?

Louk Amidou: Real-time digital image production is fundamental in my practice. It could be seen as a limitation, but actually it is a source of creativity.

As real-time work imposes close monitoring of FPS, I developed a small module called "Instant Probe" (in honor of the great tool in the palette) to help with the building process and to "optimize the network by design".

It allows instant measurement of the CPU time and GPU time consumed by a selection of operators. In addition, you can compare two selections. It's a straightforward tool, and it is very handy in my process for making the right choices when I'm building the system. I've just rebuilt a more user-friendly version that I'm sharing here as a .tox, and I hope it can be helpful to others as well.
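
The core measurement behind such a utility can be sketched in a few lines of TouchDesigner Python. This is our simplified reading, not the actual Instant Probe internals, and it assumes the cpuCookTime and gpuCookTime operator members (last-cook times in milliseconds) available in recent builds:

```python
# Sketch: compare the cook cost of two candidate sub-networks.
# Assumes OP.cpuCookTime / OP.gpuCookTime exist in your build.

def probe(ops):
    """Sum the last-cook CPU and GPU times (ms) over a selection."""
    cpu = sum(o.cpuCookTime for o in ops)
    gpu = sum(o.gpuCookTime for o in ops)
    return cpu, gpu

# 'variantA' and 'variantB' are hypothetical container COMPs.
a_cpu, a_gpu = probe(op('/project1/variantA').children)
b_cpu, b_gpu = probe(op('/project1/variantB').children)
print(f'A: {a_cpu:.2f} ms CPU / {a_gpu:.2f} ms GPU')
print(f'B: {b_cpu:.2f} ms CPU / {b_gpu:.2f} ms GPU')
```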

Derivative: Can you talk about how the elements of sound, image and gesture come together in your art?

Louk Amidou: I'm looking for a poetic and creative dialogue between image, sound and gesture. I try to use the algorithmic representation of sound and image as an instrument where the interface disappears, and where we can directly play visuals and sounds by manipulating them with our hands (touchscreen or air gestures). For this, I work on the hybridization of forms and creative practices.

The usage and inclusion of the human element are essential in my work. I experiment with how we, as artists, can still be part of the rising algorithmic art that is becoming more and more autonomous. Working on the integration of a meaningful human gesture or intention remains an interesting way to preserve the human mark, and human imperfection, in algorithmic work.

Creators and artists have no choice but to embrace the future of creation technology. More than ever, they need software like TouchDesigner, which allows coders and artists spontaneous, agile and fertile experimentation with human integration in digital works of art.

A post shared by Infratonal (@infratonal_av)

 

Derivative: How are you getting the above 3D effect?

Louk Amidou: I work mainly with 3D elements, and almost all of my pieces use real elements such as backgrounds or hand gestures. I always use an AR-like composition of those real elements in a 3D environment, or the other way around. For the 3D forms, I like to use modelling-by-accumulation techniques (instances, particles, point clouds...), which allow TOP processing in TouchDesigner. To me, compared to the vertex-modelling approach, this method is extremely interesting for creative outcomes.
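
For readers who haven't tried TOP-driven instancing, the pattern looks roughly like the sketch below: a TOP holds per-instance positions in its pixels, and a Geometry COMP reads them. The operator names are hypothetical, and the parameter names ('instancing', 'instanceop', 'instancetx'...) and the r/g/b channel mapping are our assumptions about the standard Instance page, so check them against your build:

```python
# Sketch: point a Geometry COMP's Instance page at a 'positions' TOP
# whose RGB pixels encode per-instance XYZ (one instance per pixel).
geo = op('geo1')                   # hypothetical Geometry COMP
geo.par.instancing = True          # enable instancing
geo.par.instanceop = 'positions'   # hypothetical TOP with position data
geo.par.instancetx = 'r'           # red   channel -> translate X
geo.par.instancety = 'g'           # green channel -> translate Y
geo.par.instancetz = 'b'           # blue  channel -> translate Z
```

The appeal of the approach Louk describes is that the positions TOP can itself be animated with feedback, noise or other image processing, so the whole 3D form is sculpted through TOPs.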

Derivative: And how do you get the nice shading?

Louk Amidou: I am always very interested in working on visual rendering optimization, as well as sound mastering, which is a pretty similar process. My visual rendering pipeline is pretty common: TOP instancing + PBR + SSAO + compositing.

But in my creative process, I think a lot about what we could call "UI Coding Abstraction", which is basically how we can manipulate parameters. I think it drastically changes what you create in the end.

I try to code only when I build the system. Conception and composition tasks need creative freedom, and it's necessary to let the ideas flow faster. That stage is much more of a node-based practice, but it's always complicated to navigate the huge space of parameters to find the right combination for rendering. So I often use MIDI controllers for conception: I map the rendering parameters of my structure so I can iterate on them with the 'parametric controls' that MIDI controllers provide. When I find interesting combinations, I store them. I can then use them in performance with another level of abstraction: 'sensitive control' with free gesture (Leap Motion, camera CV, touchscreen...), which can bring a more expressive approach.
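
A minimal version of that map-and-store loop might look like the sketch below, using a CHOP Execute DAT watching a MIDI In CHOP. The material name, parameter names and preset key are hypothetical, and OP.store()/fetch() is our assumed persistence mechanism, not Louk's actual setup:

```python
# Sketch: map a normalized MIDI CC channel onto a render parameter,
# and snapshot interesting parameter combinations as named presets.

WATCHED = ('metallic', 'roughness')  # hypothetical parameter names

def onValueChange(channel, sampleIndex, val, prev):
    # Called by a CHOP Execute DAT when the MIDI channel changes.
    op('pbrmat1').par.metallic = val  # 'pbrmat1' is hypothetical
    return

def savePreset(name):
    # Store the current combination for recall during performance.
    pars = {p: getattr(op('pbrmat1').par, p).eval() for p in WATCHED}
    parent().store(name, pars)

def loadPreset(name):
    # Recall a stored combination by name.
    for pname, value in parent().fetch(name).items():
        setattr(op('pbrmat1').par, pname, value)
```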

A post shared by Infratonal (@infratonal_av)

 

Derivative: Are there any new technologies or creative solutions you are seeing that excite you right now?

Louk Amidou: It's a problem, because the list of exciting topics grows day by day. I am always interested in exploring innovative techniques to create pieces, and recently I've been particularly interested in different AI technologies for their recognition and generation capabilities. Once again TouchDesigner, with its integration of Python and libraries like OpenCV, is a particularly relevant tool for approaching some of these technologies in a very concrete way.
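
As one concrete illustration of that Python + OpenCV integration, a Script TOP can process camera frames with cv2 inside the network. This is a generic edge-detection sketch of ours, not one of Louk's effects; it assumes the standard numpyArray()/copyNumpyArray() Script TOP methods and an input TOP wired to the camera:

```python
# Minimal Script TOP sketch: run OpenCV edge detection on the input TOP.
import cv2
import numpy as np

def onCook(scriptOp):
    # Input arrives as float32 RGBA in 0..1; OpenCV wants uint8.
    frame = scriptOp.inputs[0].numpyArray()
    gray = cv2.cvtColor((frame * 255).astype(np.uint8), cv2.COLOR_RGBA2GRAY)
    edges = cv2.Canny(gray, 100, 200)                # binary edge mask
    rgba = cv2.cvtColor(edges, cv2.COLOR_GRAY2RGBA)  # back to 4 channels
    scriptOp.copyNumpyArray(rgba)                    # write the TOP output
```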

AI is interesting from a technological point of view, but the broader questions around autonomous algorithms are critical for all digital creators, and we can't ignore this subject.

Derivative: What is in the future for you this year in terms of work and focus?

Louk Amidou: Mainly, I'm preparing an A/V performance of my work, where I'm interested in sharing the performative gesture we have lost in digital and electronic A/V performance. This year I am also launching my interaction design and artistic production studio.

Follow Infratonal on Instagram | Vimeo | Soundcloud | Twitter


And a big thank you to Louk for taking the time to talk to us and for sharing INSTANT PROBE, his monitoring tool inspired by the Probe tool in the TouchDesigner palette. It can be helpful when building a network to choose an architecture, or during optimization to visually understand the CPU and GPU usage of a specific part of the network.
