TouchDesigner 099 on Windows and macOS is now official!
With it comes a profusion of new capabilities, both planned and user-driven, all powered by TouchDesigner's high-performance, reliable realtime engine. The core of TouchDesigner remains its family of 400+ unique node operators, boosted with Python 3.5 scripting.
Already a leader in high-throughput video playback, processing and display, TouchDesigner 099 adds support for the latest SDI video cards up to 4K and 12G, with low latency, high frame rates and deep pixel depths. This complements TouchDesigner's support for multi-computer, multi-GPU, multi-output configurations over DisplayPort and HDMI.
TouchDesigner supports a wide range of devices, protocols and external tools that inter-operate via their respective Operators and TouchDesigner Python methods.
Image destinations include screens, projections, LEDs in any configuration and lasers aimed at anything. Getting video between TouchDesigner and other systems now includes Syphon/Spout, along with H.264, NewTek NDI and HAP video streamed over IP.
PBR (physically based rendering) in TouchDesigner combines more realistic surface materials with environment illumination. TouchDesigner now fully imports and embeds Substance Designer materials by generating multi-layer textures for PBR materials.
With new VR tracking devices and Oculus Audio 3D spatialization, we have had a glimpse of the diverse uses of VR and its hardware with TouchDesigner. To that end, we built an efficient pre-made VR Environment for users to adapt and extend.
Our most popular component, Kantan Mapper, for mapping video onto shapes, has been fully re-engineered for easier, more complex and richer mappings.
And in the drive for greater connectivity to the media and data around us, a fully interactive web browser component based on the Chrome/Chromium engine enables you to embed web interactions into your TouchDesigner projects.
Below are some highlights, and for more insight into TouchDesigner 099 see the What's New in 099 page.
01.PHYSICALLY BASED RENDERING & SUBSTANCE DESIGNER
Things look sleeker and indeed, more 'real' with improvements by way of physically based rendering (PBR). This is enabled through a number of new features, starting with the PBR MAT, a new Operator that creates physically based materials from texture maps assigned to it. It works with any content pipeline, whether you use Maya, Houdini, Unreal, Photoshop and so on, and also lets you use PBR texture libraries such as Quixel and Poliigon.
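At the heart of most physically based shading models, including the metallic/roughness workflow these texture maps feed, is a microfacet distribution such as GGX. As a rough illustration of the underlying math (a standard textbook formula, not TouchDesigner's actual shader code):

```python
import math

def ggx_distribution(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz microfacet normal distribution.
    n_dot_h: cosine between surface normal and half vector.
    roughness: the value a PBR roughness map would supply, in [0, 1]."""
    a = roughness * roughness          # common roughness-squared remap
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A smoother surface concentrates far more energy at the highlight peak:
peak_smooth = ggx_distribution(1.0, 0.1)
peak_rough = ggx_distribution(1.0, 0.5)
```

Lower roughness yields a tall, narrow specular peak; higher roughness spreads that energy out, which is exactly what a roughness texture modulates per pixel.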
02.DIVE INTO VIRTUAL REALITY WITH HTC VIVE
Dive into VR with the newly released HTC Vive Development Environment which provides a quick-start .toe file, tips and full documentation of the system. This environment also lets you author your project while inside the VR world through a virtual TouchDesigner workstation. Edit and design without having to take off the HMD for every creative impulse.
"The PBR material and Substance Designer integration are very exciting to me, providing a new level of creative control and allowing for easy blending of game-style texture techniques with the more customizable TouchDesigner techniques. TouchDesigner's VR integration is making it so easy to create VR experiences that utilize quick-to-construct A/V environments that are typically more time-intensive to make in a game engine." - Peter Sistrom
The above demo shows the HTC tracking system used for realtime projection mapping onto moving surfaces. Using some rubber bands and a calibrated projector, the controller allows physical interaction with projected surfaces. This experiment was made in a weekend by two artists. - Harvey Moon
In this experiment David Braun holds two Vive controllers that serve as "fixed" or pinned edges in a GPU cloth simulation. Two edges of the cloth are attached to the controllers, and all of the simulation points in between are free to drape and sway. As shown in the video, the index finger triggers on the controllers can be pulled to release thousands of particles. When the particles are in the air, a second trigger disperses them with four dimensional Curl noise. Both triggers can be automated through audio analysis, but David prefers the experience of "playing" the song as opposed to passively listening and watching.
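The curl-noise dispersal described above relies on a classic property: the curl of any smooth vector potential is divergence-free, so particles advected by it swirl without bunching up or draining away. A minimal CPU sketch of the idea (a toy sine-based potential stands in for a real noise field, and finite differences stand in for the GPU shader):

```python
import numpy as np

def potential(p):
    """Toy smooth 3D vector potential (a real implementation would
    sample 3D/4D noise here)."""
    x, y, z = p
    return np.array([
        np.sin(y * 1.7 + z * 0.5),
        np.sin(z * 1.3 + x * 0.9),
        np.sin(x * 1.1 + y * 0.7),
    ])

def curl(p, eps=1e-4):
    """Curl of the potential via central differences: a divergence-free
    velocity field to advect particles through."""
    d = np.eye(3) * eps
    # dF[i][j] = d(F_j)/d(x_i)
    dF = [(potential(p + d[i]) - potential(p - d[i])) / (2 * eps)
          for i in range(3)]
    return np.array([
        dF[1][2] - dF[2][1],  # dFz/dy - dFy/dz
        dF[2][0] - dF[0][2],  # dFx/dz - dFz/dx
        dF[0][1] - dF[1][0],  # dFy/dx - dFx/dy
    ])
```

In practice this runs per particle in a GLSL shader, with the noise animated over a fourth dimension (time) to keep the flow evolving.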
03.COMPOSITING POWER RAMPS UP WITH NEW TOPS
TOPs are the graphical powerhouse of any TouchDesigner system: a GPU-based, high-resolution, realtime compositing system that is a joy to experiment and explore ideas with. One of the favorite TOPs in most people's toolbox is the Noise TOP. It has long had 3D noise functions, but a new second input now lets you look up any position in the 3D noise space: the RGB values of the second input are used as UVW coordinates into the 3D noise space, saving you from writing a GLSL shader to do this kind of lookup.
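The RGB-as-UVW lookup can be sketched in NumPy as follows; a toy analytic function stands in for the Noise TOP's actual noise, and a random image stands in for the second-input TOP:

```python
import numpy as np

def noise3(u, v, w):
    """Toy smooth 3D field standing in for the Noise TOP's 3D noise."""
    return 0.5 + 0.5 * np.sin(12.9898 * u + 78.233 * v + 37.719 * w)

def lookup_noise(rgb):
    """Treat an RGB image's channels as UVW coordinates and sample
    3D noise at each pixel's position, like feeding a TOP into
    the Noise TOP's second input."""
    u, v, w = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return noise3(u, v, w)  # one noise value per pixel

rgb = np.random.rand(4, 4, 3)  # stand-in for the second-input TOP
out = lookup_noise(rgb)        # same resolution, single channel
```

Each pixel of the output is the noise field sampled at the 3D position its input color encodes, which is what makes effects like noise-warped feedback possible without custom shaders.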
Understanding GLSL materials in TouchDesigner is all about learning to work with the bits and pieces that go into rendering. These networks show several different experiments arranged for fast exploration, fostering a deeper understanding of how to take better advantage of the power of working in TouchDesigner.
"The new Compute Shaders rule! The ability to write to arbitrary coordinates of the output image is taking GPU programming in TouchDesigner to the next level. Also with the new ability to privatize components I can now finally release Luminosity for others to use." - Keith Lostracco
David Braun's images of dots and lines being pushed in 3D space can be seen with a VR headset also. The forces moving the particles are a combination of Curl noise, polygonal forces, and forces derived from images. Lots of TouchDesigner 099 containers and savable presets control the uniforms of the shaders that move the particles and the shaders that render them.
"TouchDesigner 099's GLSL shader system is the most convenient fragment shader playground I've ever encountered. The network editor makes it easy to design complex feedback systems because every image in the pipeline can be seen. It's easy to inspect the RGBA values of pixels, rows, and columns of images. Additionally, these values can be trailed over time, which is a great debugging technique not as easily available in WebGL or C++ environments. TouchDesigner is where pros should spend their time crafting shaders and where beginners should get started." - David Braun, Leviathan
04.KANTAN MAPPER 2
KantanMapper 2 is a full re-design of the most popular TouchDesigner component, KantanMapper. It has been re-engineered with newly-developed shape editing, unlimited layers of simple to free-form shapes, a user interface to map and transform any video onto any shape, and methodologies to better manage complexity and extensibility. The UI is independent of the engine, allowing external control of everything.
05.REACH OUT INTO THE WORLD WIDE WEB
The Web DAT and WebSocket DAT have been a part of TouchDesigner for some time, letting you reach out to access and interact with the vastness of the net. To speed things up, we have introduced a pre-built Threaded Web Component to access APIs like Twitter, Instagram and similar services.
Another new addition to this toolset is the Web Render TOP, which lets you render web pages for use inside TouchDesigner projects!
A new basic Web COMP example shows how to make a web browser, with full webpage interaction, inside a TouchDesigner control panel, opening a myriad of possibilities, from HTML5-built control panels to custom web server interfaces that can be embedded into TouchDesigner projects.
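The point of running web requests on a separate thread is that a blocking fetch must never stall a realtime frame loop. Outside TouchDesigner, the same pattern looks roughly like this (the URL and `fetch` stub are illustrative, not part of any TouchDesigner API):

```python
import queue
import threading

results = queue.Queue()  # thread-safe handoff back to the main loop

def fetch(url):
    """Stand-in for a blocking HTTP request
    (e.g. urllib.request.urlopen(url).read())."""
    return f"response from {url}"

def fetch_async(url):
    """Run the blocking request on a worker thread so the
    frame loop never waits on the network."""
    def worker():
        results.put((url, fetch(url)))
    threading.Thread(target=worker, daemon=True).start()

fetch_async("https://api.example.com/feed")

# Back in the frame loop, poll without blocking:
try:
    url, body = results.get(timeout=1.0)
except queue.Empty:
    url, body = None, None
```

The queue is the key design choice: the worker thread never touches the scene directly; it only posts results for the main thread to consume when it is ready.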
06.DEVICES | CAMERAS
Those using DMX or working with LEDs will be thrilled that the DMX Out CHOP now supports sACN devices and multicast! This allows many more devices to be used, with much higher numbers of channels/samples.
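Multicast is what lets sACN scale: per the ANSI E1.31 standard, each DMX universe is sent to a well-known multicast group derived from the universe number, so fixtures subscribe only to the universes they need. A small sketch of that mapping:

```python
def sacn_multicast_address(universe):
    """Multicast group for an sACN (ANSI E1.31) universe:
    239.255.<high byte>.<low byte> of the 16-bit universe number."""
    if not 1 <= universe <= 63999:
        raise ValueError("sACN universes run from 1 to 63999")
    hi, lo = universe >> 8, universe & 0xFF
    return f"239.255.{hi}.{lo}"
```

For example, universe 1 maps to 239.255.0.1 and universe 256 to 239.255.1.0; a sender simply transmits each universe's packets to its group.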
Mattia Diomedi's reactive dance performance V used BlackTrax to track and record dancers' performance into the Record CHOP (or gestureCapture palette component) to create the visual interaction 'offline'.
Cameras, Cameras, Cameras!!! Native SDK support for Point Grey, Bluefish444, and AJA has been added. Bluefish and AJA can support 4K input capture and output. RealSense support has been improved to work with the SR300 camera and adds segmented color images and person tracking.
07.SCRIPTING AND PROGRAMMING
Fans of openFrameworks can check out the openFrameworks page for examples and a walk-through.
"On Fluid Structure I used custom GLSL shaders, which are always instrumental to my work. They were first used to efficiently control the individual positions, scaling and translation of 100,000+ geometry instances, coming both from the fluid simulation and from the Kinect data. The GPU particles sample in the palette, updated for 099, shows a similar technique. Shaders were also used to compute the various forces driving the fluid simulation, which were derived from the Kinect depth map. The optical flow palette sample, now in 099, is a good example of that. Last but not least, shaders were used to quickly tweak the look of the animated reflections on the fluid as well as the final look, compositing the various passes that make up the final render in a custom manner." - Vincent Houzé
GLSL support has been updated to the latest versions and the GLSL TOP now supports Compute Shaders.
"With most software packages I have reached its limits within a few months. After years of using TouchDesigner fulltime - and on very large shows - I'm still discovering new possibilities to create magic. It's like a crate of Lego which is always deeper than you think." - Idzard Kwadijk