
In this beginner tutorial, we'll dive into hand tracking with MediaPipe and build an interactive watercolor effect in TouchDesigner!
In this tutorial, we'll create a Brutalist-inspired Generative Architecture piece using TouchDesigner. We’ll explore instancing, physically based rendering (PBR), and hand tracking with MediaPipe for interaction.
In this video, Scott Mann shows how to integrate a ComfyUI workflow into TouchDesigner and build a fun AI-powered photobooth that captures four photos from a webcam, processes them with Stable Diffusion, and saves the enhanced images to your computer.
I'm so excited to launch this three-part Master Class on Hand Tracking in TouchDesigner with MediaPipe. I’ve taken insights from my Genuary artworks (a 30-day generative art challenge in January) and distilled them into a series of tutorials.
Using Stable Diffusion with TouchDesigner opens up innovative and creative opportunities with just a few lines of code. In this post we look at three Stable Diffusion models we can use in TouchDesigner: Structure, Style and Stable Fast 3D. To continue reading this post, click here.
An introduction to LOPs (Language Operators), a new operator family for TouchDesigner that enables LLM integration directly in networks.
In this tutorial, we'll integrate Stable Diffusion into TouchDesigner in two ways: first by creating a text-to-image generation system, then by using it to generate a video for further manipulation. To continue reading this post, click here.
In this video, creative technologist Scott Mann takes you through his setup using TouchDesigner and ComfyUI to create audio-reactive generative artworks.
https://thenodeinstitute.org/courses/ws24-td-01-integrating-ai-with-touchdesigner/