Hey there, here is my first contribution to the community! I recently saw a really cool installation by Ryoichi Kurokawa on his Instagram.
Community
TDSW will be hosting a TouchDesigner workshop
TouchDesigner Vol.038 Real-time animations with Tsumikiroom
Join us Friday, February 26th for InSession 10.0 when we will be joined by Dr Betty Sargeant and Justin Dwyer of the award-winning Melbourne-based art duo PluginHUMAN.
InSession 10.0
In this tutorial we look at how to create a basic interactive particle system using the Kinect v2 (works with Azure too), Optical Flow and particlesGPU. We take a closer look at particlesGPU in particular, so you can easily customize the look and behaviour of the particles to your needs.
Interactive Particles with Kinect
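For a rough feel of how such a setup hangs together, here is a minimal Python wiring sketch, not the tutorial's actual network: the operator names (kinect1, opticalflow1, particlesgpu1) and the assumption that the palette component takes the flow field on its first TOP input are illustrative only.

```python
# Illustrative wiring only: operator names and the force-input index are assumptions.
kinect = op('kinect1')            # Kinect TOP delivering the colour or depth image
flow = op('opticalflow1')         # Optical Flow TOP computing per-pixel motion vectors
particles = op('particlesgpu1')   # particlesGPU component dropped in from the Palette

flow.inputConnectors[0].connect(kinect)     # optical flow reads the Kinect image
particles.inputConnectors[0].connect(flow)  # motion vectors act as a force texture on the particles
```
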
In this tutorial we create a component which can detect when objects are on top of each other (colliding). This works for any 2D input with alpha. You could also render a 3D scene and use it there; it only works on the UV axes, though.
2D Collision Detection
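As a rough sketch of the idea (not the downloadable component itself), the test boils down to "is there any pixel where both alphas are non-zero?". The snippet below assumes a Multiply TOP named 'multiply1' combining the two inputs and an Analyze TOP named 'analyze1' set to Maximum.

```python
# Sketch of the alpha-overlap idea; operator names are assumptions.
import numpy as np

def overlapping():
    # 'analyze1' (Analyze TOP, Maximum) reduces the multiplied image to one pixel;
    # sample() returns its RGBA tuple, index 3 is alpha.
    return op('analyze1').sample(x=0, y=0)[3] > 0.0

def overlapping_numpy():
    # Alternative without an Analyze TOP: test the alpha plane of the multiplied image directly.
    rgba = op('multiply1').numpyArray()      # float32 array, shape (height, width, 4)
    return bool(np.any(rgba[:, :, 3] > 0.0))
```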

In this tutorial we break apart 3D objects like boxes and spheres by using the point data for instancing and combining this data with noise.
3D Shape Subdivisions
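A minimal Script CHOP sketch of the general technique, assuming a SOP named 'box1' and a simple sine-based pseudo-noise rather than the tutorial's actual network: read the shape's points as instance positions and push them apart with noise.

```python
# Script CHOP sketch: one sample per point of 'box1', offset by cheap per-point noise.
# Names and the noise formula are assumptions; a Noise CHOP gives nicer motion.
import math

def onCook(scriptOp):
    scriptOp.clear()
    points = op('box1').points          # points of the SOP being broken apart
    amp = 0.3                           # how far the pieces drift
    t = absTime.seconds

    tx = scriptOp.appendChan('tx')
    ty = scriptOp.appendChan('ty')
    tz = scriptOp.appendChan('tz')
    scriptOp.numSamples = len(points)

    for i, p in enumerate(points):
        tx[i] = p.x + math.sin(i * 12.9898 + t) * amp
        ty[i] = p.y + math.sin(i * 78.2330 + t) * amp
        tz[i] = p.z + math.cos(i * 37.7190 + t) * amp
    return
```

The resulting tx/ty/tz channels can then feed the Instance page of a Geometry COMP.
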

In this tutorial we create a virtual space with walls and 3D objects to simulate projections and installation setups.
Virtual Projections (Prototyping)
![[ i miss your touch ] by PluginHUMAN, a virtual touch artwork (2020)](https://derivative.ca/sites/default/files/styles/project_teaser_small/public/field/image/PluginHUMAN_Your%20Touch_10S.jpg)
https://youtu.be/Y8GAWNALfGs [ i miss your touch ] by PluginHUMAN transports two people who are in separate locations into a shared virtual environment. This unique online platform launched in March 2020 as a rapid response to pandemic conditions.
[ i miss your touch ] by PluginHUMAN. A virtual touch artwork

Control Room 42 (CR42), a project from RTBF, the public broadcaster for the French-speaking part of Belgium, gives broadcasting's traditionally hardware-based control room a radical makeover, enabled by TouchDesigner in ways its designer Hugo Ortiz thought impossible a few years ago.
Control Room 42 Ushers in the Future of Broadcasting

I have always seen the Render Pick function used mainly with the mouse. However, I always wanted to control it with a Kinect, and in the last months I finally came up with a personal solution that lets me do that!
Kinect Render Pick
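A minimal sketch of the idea, not the author's solution: copy a Kinect hand position into the table that a Render Pick DAT reads its pick coordinates from. The channel names ('p1/hand_r:u', 'p1/hand_r:v'), the table layout and the operator names are assumptions.

```python
# Execute DAT callback sketch; channel, table and operator names are assumptions.
def onFrameStart(frame):
    k = op('kinect1')
    u = k.chan('p1/hand_r:u')    # normalized image-space hand position from the Kinect CHOP
    v = k.chan('p1/hand_r:v')
    if u is None or v is None:
        return
    table = op('picktable')      # Table DAT wired into the Render Pick DAT
    table.clear()
    table.appendRow(['u', 'v'])
    table.appendRow([u.eval(), v.eval()])
    return
```
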

A simple CHOP built out of easing functions. You can also set the number of samples.
Easing Functions CHOP
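As a sketch of what such a component does internally, here is a Script CHOP that evaluates one easing curve (ease-in-out quadratic) over a configurable number of samples; the real component offers a whole list of functions, and the names here are illustrative.

```python
# Script CHOP sketch: sample an easing curve into a channel. Illustrative only.
def ease_in_out_quad(t):
    return 2 * t * t if t < 0.5 else 1 - ((-2 * t + 2) ** 2) / 2

def onCook(scriptOp):
    scriptOp.clear()
    n = 100                                    # number of samples; could be a custom parameter
    chan = scriptOp.appendChan('easeInOutQuad')
    scriptOp.numSamples = n
    for i in range(n):
        chan[i] = ease_in_out_quad(i / (n - 1))
    return
```
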

We are very pleased to announce TouchDesigner's 2021 Official Update and tell you about all the exciting features of this release. This year's update touches (pun intended) so many parts of TouchDesigner.
2021 Official Update

Version 0.11 of RayTK is now available on GitHub! Along with that, there is now a documentation website with concept guides and an operator reference. Version 0.11 supports TD 2020.44350 (experimental) and the newly released 2021.10330 (official).
RayTK Version 0.11 and documentation website

Sorry for the messy .toe. Would love to see someone create a little game or anything out of this. Feel free to post your walk with the spider and tag https://www.instagram.com/josefluispelz/. Walk with WASD | reset with "shift+r" | home with "h"
Generative spider character

Here is a simple but powerful colorpicker widget for TouchDesigner which you can use in your own user interfaces.
Colorpicker UI widget: v1.3


Here are templates for three different approaches to visualizing Kinect joint data: points, lines and metaballs, with some custom parameters. In the video tutorial I explain it in more detail and share some ideas on how you can use it later to create very different-looking outputs.
3 templates for Kinect skeleton visualization
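The common first step behind all three templates is gathering joint positions into instance data. Below is a minimal Script CHOP sketch of that step; the joint list and the Kinect CHOP channel names are assumptions.

```python
# Script CHOP sketch: one sample per joint, tx/ty/tz pulled from the Kinect CHOP.
# Joint and channel names ('p1/<joint>:tx' etc.) are assumptions.
JOINTS = ['head', 'shoulder_l', 'shoulder_r', 'elbow_l', 'elbow_r',
          'hand_l', 'hand_r', 'hip_l', 'hip_r', 'knee_l', 'knee_r',
          'foot_l', 'foot_r']

def onCook(scriptOp):
    scriptOp.clear()
    k = op('kinect1')
    tx = scriptOp.appendChan('tx')
    ty = scriptOp.appendChan('ty')
    tz = scriptOp.appendChan('tz')
    scriptOp.numSamples = len(JOINTS)

    for i, joint in enumerate(JOINTS):
        cx = k.chan('p1/' + joint + ':tx')
        cy = k.chan('p1/' + joint + ':ty')
        cz = k.chan('p1/' + joint + ':tz')
        tx[i] = cx.eval() if cx else 0.0
        ty[i] = cy.eval() if cy else 0.0
        tz[i] = cz.eval() if cz else 0.0
    return
```

From there the samples can be instanced as points, connected as lines, or used to place metaballs.
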

https://vimeo.com/508440260 The the.collapse live performance was created in early 2021, based on the eponymous album released on ETER Lab during the pandemic crisis, on 20/03/2020. The whole visual aesthetic is based upon old failed Polaroids, which only show the frozen chemical process.
THE.COLLAPSE A/V live performance

UberGui is a lightweight, multi-threaded WebRender UI module for TouchDesigner projects. The aim is to solve the trifecta of challenges that building UIs in TouchDesigner often poses: being fast, feature-rich, and visually appealing.
UberGui V4 - A lightweight multi-threaded, WebRender UI module for Custom Parameters

Here's a tip for calculating blur depth consistently, even when geometries are in motion. Thank you to everyone who came to the channel and subscribed. If you have not already done so, it would be a great motivation to continue working on more tutorials.
TOUCHDESIGNER - REFLECTIONS - DEPTH BLUR // PART 2/3
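One way to keep the focus locked while things move (not necessarily the method used in the video) is to recompute the camera-to-subject distance every frame and feed it into the depth-blur focal distance; a sketch assuming a camera named 'cam1' and a target named 'geo1':

```python
# Expression sketch: live focal distance between the camera and the focus target.
# 'cam1' and 'geo1' are assumed operator names.
cam_pos = op('cam1').worldTransform * tdu.Position(0, 0, 0)
target_pos = op('geo1').worldTransform * tdu.Position(0, 0, 0)
focal_distance = (cam_pos - target_pos).length()
```
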

A few months ago I started a hobby project to build a custom C++ operator with the C++ SOP that runs the Houdini Engine inside of TouchDesigner, loading HDA files at runtime and leveraging the powerful Houdini SOP context.
Houdini Engine for TouchDesigner proof of concept
