When: 30th May – 7th June 2026 (arrive 29th May for Meet & Greet in the evening)
Location: Royal Ballet and Opera, Bow St, London WC2E 9DD
Application Deadline: 31/03/2026
Application Form: https://forms.gle/9BuNrkcZS8Vmf2nh7
Application Guideline: Download from here
AI

Automatic BPM detection and synchronization module, powered by a Temporal Convolutional Network trained on 7,000+ tracks spanning jazz, techno, footwork, and more.
Setup:
1. Open the project and enter the AutoBpm container.
2. Locate the tdPyEnvManager palette component.
3.
Auto BPM Sync with AI
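The component's trained TCN is not shown here, but the task it solves can be illustrated with the classic non-ML baseline: estimating tempo from the spacing of detected onsets. This is a hedged sketch of that heuristic, not the AutoBpm implementation.

```python
from statistics import median

def estimate_bpm(onset_times):
    """Estimate tempo from a list of onset timestamps in seconds.

    Naive baseline: BPM = 60 / median inter-onset interval.
    The AutoBpm component uses a trained Temporal Convolutional
    Network instead of this heuristic.
    """
    if len(onset_times) < 2:
        raise ValueError("need at least two onsets")
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return 60.0 / median(intervals)

# Onsets spaced 0.5 s apart correspond to 120 BPM
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # → 120.0
```

The median (rather than the mean) makes the estimate robust to a missed or spurious onset, which is why it is the usual choice for this baseline.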

Montreal media artist Matthew Biederman has spent the past two decades wielding data, light, and open-source code as instruments of cultural resistance.
Commons, Cameras, and Code: Art as Resistance with Matthew Biederman
https://www.youtube.com/shorts/wP0p7cCs6yM This TouchDesigner component lets you generate high-quality sound effects incredibly fast, directly inside TD, using the ElevenLabs Sound Generation API. It then automatically downloads and plays them in your network with a clean, one-button workflow.
Demo
ElevenLabs SFX - [TouchDesigner Component Release]
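A one-button workflow like this boils down to building an authenticated request, POSTing the prompt, and saving the returned audio. The sketch below assembles the request pieces; the endpoint URL and field names are assumptions based on ElevenLabs' public sound-generation API and should be checked against the current API reference, and this is not the component's own code.

```python
import json

# Assumed endpoint; verify against the ElevenLabs API reference.
API_URL = "https://api.elevenlabs.io/v1/sound-generation"

def build_sfx_request(api_key, text, duration_seconds=None):
    """Assemble URL, headers, and JSON body for an SFX generation call.

    Field names ("text", "duration_seconds") are assumptions from the
    public docs, not taken from the component itself.
    """
    headers = {"xi-api-key": api_key, "Content-Type": "application/json"}
    payload = {"text": text}
    if duration_seconds is not None:
        payload["duration_seconds"] = duration_seconds
    return API_URL, headers, json.dumps(payload)

url, headers, body = build_sfx_request("YOUR_KEY", "glass shattering", 2.0)
# POST with e.g. requests.post(url, headers=headers, data=body), then
# write the returned audio bytes to a file an Audio File In CHOP can play.
```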

Link to video demonstration: https://www.youtube.com/shorts/O7Mg2poqXdE
Featuring:
- Text-to-video & Image-to-video
- 16:9 & 9:16 (720p/1080p)
- Negative prompt + optional seed
- Async auto-download + auto-playback
- PDF setup guide included
Three one-prompt commercials, one heroic banana.
[Release] SORA-2 Video Generator

AIMods_Sora2 + PROTOTYPING_UI - V1. Following the integration of Nano Banana, here is a new AIMod: AIMods_Sora2, a TouchDesigner module that brings SORA2 API calls into your TD workflow. It supports both TXT2VIDEO and IMG2VIDEO, with a lightweight prototyping UI for rapid experimentation.
SORA2 INSIDE TOUCHDESIGNER
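The TXT2VIDEO/IMG2VIDEO split usually comes down to whether a source image accompanies the prompt. This is a hedged sketch of how such a job payload might be assembled; the parameter names are illustrative assumptions, not the real SORA2 API schema or the module's internals.

```python
def build_sora_job(prompt, aspect="16:9", resolution="720p",
                   negative_prompt=None, seed=None, image_path=None):
    """Assemble a request body for a SORA2-style video job.

    Field names here are hypothetical. The mode switches between
    TXT2VIDEO and IMG2VIDEO based on whether a source image is given,
    mirroring the two workflows the module supports.
    """
    job = {
        "mode": "IMG2VIDEO" if image_path else "TXT2VIDEO",
        "prompt": prompt,
        "aspect_ratio": aspect,      # 16:9 or 9:16
        "resolution": resolution,    # 720p or 1080p
    }
    if image_path:
        job["image"] = image_path
    if negative_prompt:
        job["negative_prompt"] = negative_prompt
    if seed is not None:
        job["seed"] = seed           # optional, for reproducibility
    return job

job = build_sora_job("a heroic banana", aspect="9:16", seed=7)
```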
In this tutorial, we'll dive into Torin's new DepthAnything plugin for TouchDesigner—a drop-in .tox that turns any webcam or TOP into a real-time depth map on Mac and PC, with no extra installs or special hardware.
Depth Anything TouchDesigner Plugin Tutorial
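A depth map from a model like this typically needs min-max normalization to [0, 1] before it can drive displacement or compositing. This is an illustrative post-processing sketch only, assuming raw depth values as nested lists; the plugin itself handles the inference.

```python
def normalize_depth(depth_rows):
    """Min-max normalize a 2-D depth map to the [0, 1] range.

    Hypothetical post-step (e.g. before using the depth as a
    displacement source); not part of the plugin's own code.
    """
    flat = [v for row in depth_rows for v in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:
        # Flat depth map: return all zeros to avoid division by zero.
        return [[0.0 for _ in row] for row in depth_rows]
    scale = hi - lo
    return [[(v - lo) / scale for v in row] for row in depth_rows]

print(normalize_depth([[0, 2], [4, 4]]))  # → [[0.0, 0.5], [1.0, 1.0]]
```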

https://www.youtube.com/watch?v=HsU94xsnKqE A complex AI-driven live performance introducing Camille. In her performance, gestures control harmony, while AI lip and hand transfer aligns the avatar to the music.
[Experiment] On-AI-R #1: Camille - Complex AI-Driven Musical Performance

Learn how to control ComfyUI workflows directly from within TouchDesigner using the ComfyTD component by DotSimulate. Marco covers how to link parameters and inputs between the two environments, enabling real-time control over AI image generation right inside your TouchDesigner network.
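ComfyUI's API-format workflows are JSON dicts keyed by node id, each with an "inputs" mapping, so "linking parameters" amounts to writing TD values into those inputs before submitting the workflow. This is a hedged sketch of that idea with hypothetical node ids and parameter names; it is not ComfyTD's internal scheme.

```python
def apply_params(workflow, bindings, values):
    """Write parameter values into a ComfyUI API-format workflow dict.

    `workflow` maps node ids to {"class_type": ..., "inputs": {...}};
    `bindings` maps your parameter names to (node_id, input_name).
    All ids and names below are illustrative examples.
    """
    for name, value in values.items():
        node_id, input_name = bindings[name]
        workflow[node_id]["inputs"][input_name] = value
    return workflow

# Hypothetical minimal workflow with a single sampler node.
wf = {"3": {"class_type": "KSampler", "inputs": {"seed": 0, "cfg": 7.0}}}
bind = {"Seed": ("3", "seed"), "Cfg": ("3", "cfg")}
apply_params(wf, bind, {"Seed": 42, "Cfg": 5.5})
print(wf["3"]["inputs"])  # → {'seed': 42, 'cfg': 5.5}
```

In TouchDesigner the `values` dict would be filled from custom parameters on the component before each generation request.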




