
[Walkthrough] Transforming DALL-E 2 generated images into sound [MIDI events]

Using OpenAI’s DALL-E 2 API and TouchDesigner, I’ve built a, let’s say, MIDI sequencer that captures RGB data from AI-generated images in real time and uses it to trigger MIDI events in Ableton Live. Those signals then feed, among other things, two wavetables in the first two examples, and my lovely new Prophet 6 in the other ones.
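To give a feel for the TouchDesigner side, here is a minimal sketch of how pixel sampling can drive a MIDI Out CHOP from a script. The operator names (`moviefilein1`, `midiout1`) and the RGB-to-note mapping are placeholders I made up, not the patch from this project; `TOP.sample()` and `sendNoteOn()` are standard TouchDesigner Python calls.

```python
# Runs inside TouchDesigner, e.g. from a CHOP Execute or Timer callback.
# Operator names below are placeholders; point them at your own network.
img  = op('moviefilein1')   # TOP holding the current DALL-E 2 image
midi = op('midiout1')       # MIDI Out CHOP routed to Ableton Live

def fire_notes(num_samples=4):
    """Sample a few pixels across the image and turn each into a note-on."""
    for i in range(num_samples):
        u = (i + 0.5) / num_samples          # normalized x position
        r, g, b, a = img.sample(u=u, v=0.5)  # RGBA as 0..1 floats
        note     = 36 + int(r * 48)          # red   -> pitch, from C2 upward
        velocity = 1 + int(g * 126)          # green -> velocity
        midi.sendNoteOn(1, note, velocity)
        # a real patch would also schedule matching sendNoteOff() calls
```

One practical note: `TOP.sample()` pulls pixel data back to the CPU, so keep the sample count low or downscale the image with a Resolution TOP first.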

In short: the idea was to build a system that, given certain concepts (a.k.a. prompts), generates chord progressions from the incoming RGB data of the DALL-E generated images.

Concept [prompt] ➨ AI generated image [DALL-E 2] ➨ capturing RGB data in real-time [TouchDesigner] ➨ using that data to trigger MIDI events [Ableton Live]
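The original project runs the capture stage in TouchDesigner, but if you want to prototype the whole chain in plain Python, here is a minimal sketch of the same idea. Everything in it is my own stand-in rather than the author’s patch: it assumes the current `openai` SDK (which may be newer than what the post used), `Pillow` for pixel access, and `mido` with `python-rtmidi` for MIDI output, and the RGB-to-chord mapping is an arbitrary example.

```python
import io
import time
import urllib.request

import mido                 # pip install mido python-rtmidi
from openai import OpenAI   # pip install openai
from PIL import Image       # pip install Pillow

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# 1. Concept [prompt] -> AI generated image [DALL-E 2]
result = client.images.generate(model="dall-e-2",
                                prompt="a calm teal ocean at dawn",
                                n=1, size="256x256")
image_bytes = urllib.request.urlopen(result.data[0].url).read()
img = Image.open(io.BytesIO(image_bytes)).convert("RGB")

# 2. Capture RGB data: approximate average color of a few vertical strips
def strip_colors(image, strips=4):
    w, h = image.size
    for i in range(strips):
        box = image.crop((i * w // strips, 0, (i + 1) * w // strips, h))
        yield box.resize((1, 1)).getpixel((0, 0))

# 3. Map each strip's RGB to a triad and send it as MIDI events
port = mido.open_output()  # pass a port name to target a virtual MIDI port
for r, g, b in strip_colors(img):
    root = 48 + (r % 24)                 # red   -> root note
    chord = [root, root + 4, root + 7]   # simple major triad
    velocity = 40 + (g * 87) // 255      # green -> velocity
    for note in chord:
        port.send(mido.Message('note_on', note=note, velocity=velocity))
    time.sleep(0.5 + b / 255)            # blue  -> chord duration
    for note in chord:
        port.send(mido.Message('note_off', note=note))
port.close()
```

To get the notes into Ableton Live, point `mido` at a virtual MIDI port (e.g. the IAC Driver on macOS or loopMIDI on Windows) and select that port as a track input in Live.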

For more experiments, tutorials, and project files, you can head over to: https://linktr.ee/uisato

 

Experience level: Intermediate
