
Building a Rhythm Game System in TouchDesigner


This post is a technical overview of how we built the rhythm game system for the XEEKIN project, which was introduced in a previous post. A video of the project can be found at the link below.

The motivation behind developing a rhythm game system entirely inside TouchDesigner was the need for strong, tightly coupled visual interaction. Rather than using an external game engine, we wanted all timing, interaction, and visual logic to exist within a single real-time environment.

 

Tech Overview

The system we developed in TouchDesigner is divided into four main components:

  • Rhythm Game System

  • Arduino-Based Input

  • Visuals & Simulation

  • Media Server & Projection Mapping

 

Rhythm Game

To implement a rhythm game system in TouchDesigner, it is essential to synchronize BPM, beat, and TouchDesigner’s internal engine timing.

The core principle is to derive the current beat every frame from the BPM and frame rate, using absTime as the time source.
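Outside TouchDesigner, the same calculation can be sketched as plain Python. Here an elapsed-seconds value stands in for TouchDesigner's absTime, and the BPM value is illustrative:

```python
# Sketch of the beat calculation, assuming an elapsed-seconds clock
# in place of TouchDesigner's absTime.seconds.
def current_beat(elapsed_seconds: float, bpm: float) -> float:
    """Convert elapsed time into a fractional beat index."""
    beats_per_second = bpm / 60.0
    return elapsed_seconds * beats_per_second

def beat_increment_per_frame(bpm: float, fps: float) -> float:
    """How much the beat counter advances on each rendered frame."""
    return bpm / (60.0 * fps)

# At 128 BPM, 30 seconds into the song is beat 64.0.
print(current_beat(30.0, 128.0))
```

Because the beat is recomputed from absolute time rather than accumulated per frame, a dropped frame shifts the beat forward correctly instead of letting error build up.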

Initially, we experimented with subtracting the time difference between frames to compensate for frame delay when generating notes. However, this approach caused synchronization drift in practice and was ultimately removed from the final system.

Based on the calculated currentBeat, note generation data—such as timing and position—is stored in Table DATs and CSV files, allowing notes to be queried and triggered at any time.
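A minimal sketch of that lookup, using a plain list of rows in place of a Table DAT (the column names `beat` and `lane` are assumptions for illustration):

```python
# Stand-in for note rows stored in a Table DAT / CSV file.
NOTE_TABLE = [
    {"beat": 4.0, "lane": 0},
    {"beat": 4.5, "lane": 1},
    {"beat": 8.0, "lane": 2},
]

def due_notes(current_beat: float, last_beat: float, table=NOTE_TABLE):
    """Return notes whose beat falls in (last_beat, current_beat].

    Called once per frame with the previous frame's beat, so each
    note is triggered exactly once even if frames are uneven.
    """
    return [n for n in table if last_beat < n["beat"] <= current_beat]
```

In TouchDesigner itself the rows would be read from the Table DAT each frame; the half-open interval guarantees no note is triggered twice or skipped when the beat counter jumps.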

Manually inserting note data into Table DATs proved inefficient. To address this, we developed an algorithm that extracts timing data from audio and sheet music and converts it into structured Table DAT data usable directly inside TouchDesigner.
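The conversion step can be sketched as follows. The extracted events and the `(beat, lane)` column layout are hypothetical; the point is turning timed events into CSV rows that a Table DAT can load directly:

```python
import csv
import io

def events_to_table(events, bpm: float) -> str:
    """Convert (time_in_seconds, lane) events into beat-indexed CSV rows.

    The output is a CSV string that a Table DAT could load directly.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["beat", "lane"])
    for t, lane in events:
        writer.writerow([round(t * bpm / 60.0, 3), lane])
    return buf.getvalue()

# Hypothetical events extracted from audio / sheet music.
events = [(1.875, 0), (2.109, 1), (3.750, 0)]
print(events_to_table(events, 128.0))
```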

The image above shows a debugging view for desktop testing. A separate visual output mapped directly onto the drum pad was generated using the same logic.


 

Additionally, we developed a song-selection screen that can be navigated by hitting the drum pad, built with a TouchDesigner-internal 3D camera and a UI system using PBR materials and the Camera Blend COMP.

 

Visuals & Simulation

We created a visual environment that reflects judgment results in real time. The thumbnails above show examples of the corresponding visual outputs.
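For context, hit judgment in a rhythm game typically classifies the timing offset between the hit and the note into windows. The window sizes below are illustrative assumptions, not the values used in the installation:

```python
# Judgment windows in seconds, tightest first. Illustrative values only.
JUDGMENT_WINDOWS = [(0.03, "perfect"), (0.08, "great"), (0.15, "good")]

def judge(hit_time: float, note_time: float, windows=JUDGMENT_WINDOWS) -> str:
    """Classify a hit by its absolute timing offset from the note."""
    offset = abs(hit_time - note_time)
    for width, name in windows:
        if offset <= width:
            return name
    return "miss"
```

The resulting judgment string is what the visual environment reacts to each frame.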

 

In addition, we developed a simulation system that allows us to preview how the visuals would appear in the actual mapped environment before installation.

 

Arduino-Based Input

Piezo sensors detect vibration from the drum hits. When a hit is detected, LEDs inside the drum pad are triggered, and the sensor data is transmitted to the computer via USB serial communication.
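On the receiving side, the serial message simply needs to be parsed into a pad index and hit strength. The `HIT,<pad>,<strength>` message format below is an assumption for illustration; the actual protocol depends on the Arduino sketch, and in TouchDesigner the line would arrive through a Serial DAT callback:

```python
# Sketch of parsing one serial line from the Arduino.
# Message format "HIT,<pad>,<strength>" is a hypothetical protocol.
def parse_hit(line: str):
    """Return (pad_index, strength) for a valid hit message, else None."""
    parts = line.strip().split(",")
    if len(parts) != 3 or parts[0] != "HIT":
        return None
    try:
        return int(parts[1]), int(parts[2])
    except ValueError:
        return None
```

Rejecting malformed lines matters in practice, since serial streams can drop or garble bytes mid-message.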

The drum frame was custom-built, with internal LEDs installed to visually respond whenever the drum pad is struck.

 

Projection Mapping

The visuals were mapped onto the floor and wall using two Epson EB-L1505UH projectors. Overlapping areas were blended seamlessly using edge blending techniques to create a natural projection result.
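Conceptually, edge blending fades each projector's opacity across the overlap so the combined light stays even. A common approach is a smooth ramp with gamma correction; the curve below is a generic sketch with an assumed gamma, not the projectors' actual calibration:

```python
# Sketch of a gamma-corrected edge-blend ramp for a projector overlap.
# Gamma 2.2 is an illustrative assumption; real blends are tuned per projector.
def blend_weight(x: float, gamma: float = 2.2) -> float:
    """Opacity for one projector across the overlap, x in [0, 1].

    A smoothstep ramp shaped by 1/gamma, so that after the projector's
    gamma response the two overlapping ramps sum to even brightness.
    """
    s = x * x * (3.0 - 2.0 * x)  # smoothstep: 0 at x=0, 1 at x=1
    return s ** (1.0 / gamma)
```

The second projector uses the mirrored ramp `blend_weight(1 - x)`, so the two gamma-decoded intensities sum to 1 everywhere in the overlap.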

 

Conclusion

This post is not intended to present a definitive or “best” way to build a rhythm game system. Instead, it documents one possible approach to exploring TouchDesigner’s potential beyond visual generation.

By sharing this process, we hope to encourage further experimentation and contribute to expanding the possibilities of interactive, audience-participatory works.

My practice continues to focus on projects with a strong emphasis on direct audience interaction, and future posts will explore other works developed with similar principles.

 

For more works, click here.
