
https://www.youtube.com/watch?v=HsU94xsnKqE A complex AI live-style performance introducing Camille. In her performance, gestures control the harmony, and AI lip/hand transfer aligns the avatar to the music.

Hi everyone! Here is my first share with the community; I hope it's useful! Last month I got the AKAI APC mini mk2 to use in my TD projects and was quite disappointed to see that none of the LEDs gave light feedback when not used with Ableton.
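The fix boils down to echoing MIDI note-on messages back to the controller, since the APC mini lights a pad when it receives the pad's note with a velocity that selects a color. Here is a minimal sketch of building those messages; the grid layout (note = row * 8 + col, bottom-left origin) and the velocity-as-color-index convention are assumptions based on the mk2's published behavior, so check AKAI's protocol notes for your firmware.

```python
# Sketch: lighting an APC mini mk2 pad by sending a MIDI note-on back to it.
# Layout and color conventions below are assumptions, not verified protocol.

NOTE_ON = 0x90  # status byte for note-on on MIDI channel 1

def pad_note(row: int, col: int) -> int:
    """Map an 8x8 grid position (0-based, bottom-left origin) to a pad note."""
    if not (0 <= row < 8 and 0 <= col < 8):
        raise ValueError("row and col must be in 0..7")
    return row * 8 + col

def light_pad_message(row: int, col: int, color: int) -> bytes:
    """Build the 3-byte note-on message that lights one pad.

    On the mk2 the note-on velocity indexes a color palette (0 turns the
    LED off); the exact palette values are device-specific.
    """
    if not (0 <= color < 128):
        raise ValueError("color must be a 7-bit value")
    return bytes([NOTE_ON, pad_note(row, col), color])

# In TouchDesigner you would route these bytes out a MIDI Out CHOP pointed
# at the APC mini; here we only show the raw message construction.
msg = light_pad_message(0, 0, 5)
```

Sending the same note with velocity 0 (or a note-off) turns the pad back off.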

Hi all! Here is an easy component for connecting the nanoKONTROL Studio to TouchDesigner.

I have created a component with MIDI learn functionality and other features for MIDI controllers. Check my GitHub: https://github.com/duarte-amorim/Touchdesigner/tree/main/Midi_Learn_Controller Perform Mode Example Network Example Global Parameters Example Button Parameters Example
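For readers new to the idea, MIDI learn means arming a parameter so that the next control-change message that arrives gets bound to it; after that, matching CCs drive the parameter. This is a generic illustration of that pattern, not the actual code from the linked component:

```python
# Sketch of the MIDI-learn pattern: while "learning", the next CC message
# that arrives is bound to the chosen parameter; afterwards, matching CCs
# update that parameter's normalized value.

class MidiLearn:
    def __init__(self):
        self.bindings = {}      # (channel, cc_number) -> parameter name
        self.values = {}        # parameter name -> last value in 0..1
        self.learning = None    # parameter currently waiting for a CC

    def learn(self, param: str):
        """Arm learn mode: the next incoming CC will drive `param`."""
        self.learning = param

    def on_cc(self, channel: int, cc: int, value: int):
        """Feed incoming control-change messages (value is 0..127)."""
        key = (channel, cc)
        if self.learning is not None:
            self.bindings[key] = self.learning
            self.learning = None
        if key in self.bindings:
            self.values[self.bindings[key]] = value / 127.0

ml = MidiLearn()
ml.learn("brightness")
ml.on_cc(channel=0, cc=21, value=127)  # first CC after arming binds cc 21
ml.on_cc(channel=0, cc=21, value=64)   # later cc 21 messages just update it
```

In TouchDesigner the `on_cc` calls would come from a MIDI In CHOP or DAT callback.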

Using OpenAI’s DALL-E 2 API and TouchDesigner, I’ve managed to create a, let’s say, MIDI sequencer: it captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live.
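The core idea can be sketched as sampling pixels from the generated image and turning each sample's color into a note event. The specific mapping below (red picks pitch, green sets velocity, blue acts as a gate) is an assumption chosen for illustration; the original patch's exact mapping isn't documented here.

```python
# Sketch of the RGB-to-MIDI idea: one pixel sample per sequencer step,
# mapped to a (note, velocity) event or silence. The channel-to-parameter
# mapping is a hypothetical choice, not the author's actual scheme.

def rgb_to_midi_event(r: int, g: int, b: int):
    """Map one 8-bit RGB sample to (note, velocity), or None if gated off."""
    if b < 128:                               # blue as a gate: dark = rest
        return None
    note = 36 + round(r / 255 * 48)           # red picks a pitch in C2..C6
    velocity = max(1, round(g / 255 * 127))   # green sets loudness
    return (note, velocity)

def sequence_from_pixels(pixels):
    """Turn a list of RGB tuples (one per step) into a step sequence."""
    return [rgb_to_midi_event(r, g, b) for (r, g, b) in pixels]

# Each non-None event would then go to Ableton Live via a MIDI Out CHOP.
steps = sequence_from_pixels([(255, 255, 255), (0, 0, 0), (128, 200, 255)])
```

In TouchDesigner, the pixel sampling itself would typically be a TOP-to-CHOP conversion on the incoming image.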

Do you want a MIDI controller but don’t know what to get or where to start? Look no further! This MIDI guide is just what you’re looking for. The MIDI devices covered are by no means the best controllers out there, but they’re what I use in my toolkit and what I find handy without breaking the bank.

One of the most exciting parts of VST support in TouchDesigner is being able to generate real-time sound environments using VST synth instruments. The tricky part is that these features are so new, you probably don't know how to get it all set up.

TouchDesigner => ChucK => TouchDesigner; Download it: https://github.com/DBraun/ChucKDesigner/ ChucKDesigner is an integration of the ChucK music/audio programming language with TouchDesigner.