Using OpenAI’s DALL-E 2 API and TouchDesigner, I’ve managed to build a, let’s say, MIDI sequencer that captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live.
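The core of that idea is just a range conversion: each 8-bit color channel (0-255) has to be rescaled to the 7-bit MIDI range (0-127) before it can drive a note or controller. Here is a minimal sketch of that mapping in plain Python; the function names and the choice of which channel drives note, velocity, and CC are illustrative, not the exact patch from the post.

```python
def channel_to_midi(value):
    """Scale an 8-bit color channel (0-255) down to the 7-bit MIDI range (0-127)."""
    clamped = max(0, min(255, value))  # guard against out-of-range pixel samples
    return clamped * 127 // 255

def rgb_to_midi_event(r, g, b):
    """Illustrative mapping: red picks the note, green the velocity, blue a CC value."""
    return {
        "note": channel_to_midi(r),
        "velocity": channel_to_midi(g),
        "cc_value": channel_to_midi(b),
    }
```

In TouchDesigner this kind of function would typically sit in a Script CHOP or Execute DAT and be fed pixel values sampled from the generated image each frame.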
PANORAMA is an audiovisual performance in which a succession of digital fragments develops within complex environments. A hybrid material is formed and distorted by sound, generating immersive and hypnotic digital landscapes. With the help of data (time, sound, visual, algorithmic), TS/CN structur
TDAbleton is a tool for linking TouchDesigner tightly with Ableton Live. It offers access to nearly everything going on in an Ableton set, both for viewing and for setting values.
In the following video, you’ll learn how to control both analog and digital instruments through movement, with the help of TouchDesigner, Ableton Live, and a Kinect camera.
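The movement-to-instrument link usually comes down to normalizing a tracked joint coordinate and rescaling it to a MIDI controller value. A minimal sketch of that step, assuming a Kinect hand position already available as a float in a known range (the range bounds here are placeholders, not values from the video):

```python
def position_to_cc(x, lo=-1.0, hi=1.0):
    """Map a tracked coordinate in [lo, hi] to a MIDI CC value in 0-127.

    Values outside the range are clamped so a hand leaving the sensor's
    field of view doesn't produce out-of-range controller data.
    """
    t = (x - lo) / (hi - lo)        # normalize to 0.0-1.0
    t = max(0.0, min(1.0, t))       # clamp
    return round(t * 127)
```

In a TouchDesigner network the same remapping is often done with a Math CHOP before the values reach a MIDI Out CHOP, but having it as a function makes the scaling explicit.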
INFRATONAL is the audiovisual project of Parisian multidisciplinary artist Louk Amidou, who uses hand gestures to perform both music and visuals in mesmerizing compositions while exploring the links between humans and algorithms.
Tokyo Developers Study Weekend, TouchDesigner Vol. 052: Hyper Audio Reactive with TDAbleton - Music and Visuals Made by Sampling. Date: 16th January 2022, 15:00-18:00 (UTC+9).
I’ve been working for quite a while on a way to transform images into sound, and came up with a hopefully interesting idea. In this tutorial you’ll learn how to transform RGB data into MIDI in TouchDesigner and feed it to Ableton Live.
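Once the color values have been rescaled to 0-127, feeding Ableton Live means emitting standard MIDI messages. As a sketch of what goes over the wire, here is how a raw note-on message is packed into its three bytes (this is the MIDI 1.0 message layout, not a TouchDesigner-specific API):

```python
def note_on_bytes(note, velocity, channel=0):
    """Build a raw 3-byte MIDI note-on message.

    Byte 1: status (0x90 = note-on) OR'd with the channel (0-15).
    Byte 2: note number, masked to 7 bits.
    Byte 3: velocity, masked to 7 bits.
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
```

In practice TouchDesigner’s MIDI Out CHOP (or a virtual MIDI port into Ableton) handles this packing for you, but seeing the bytes makes it clear why everything upstream has to stay within 0-127.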
https://vimeo.com/581830541 Hey guys! Some friends and I recently made an LED light installation for our graduation project at Gobelins, a French art school in Paris. ORCHA-103 is an interactive musical and visual installation that invites its visitors to play the role of a conductor.