TouchDesigner technologists Simone Murtas and Graziano De Vecchis designed and built the real-time control system in TouchDesigner, and Derivative supplied a custom Pan Tilt CHOP to keep every fixture locked to the robot’s path in real time. We had the opportunity to speak to the Solar team about the art, the engineering, and the collaboration that brought Solar to life.
Derivative: How did this artwork come about? What were the inspirations and early intentions and how did the evolution play out?
Team Solar: Solar was born from our encounter with Quayola and his research on the luminous conditions of twilight and the optical phenomena that occur between sunrise and sunset. Quayola presented us with a powerful idea: transforming crepuscular rays—those beams of light that appear to radiate from a single point due to perspective—into an artificial luminous structure, sculpted in space through the movement of an industrial robot and hundreds of synchronized beams.
From the very beginning, it was clear that this wasn’t simply about “moving lights” but about creating a true digital luminous matter that echoes the cyclical nature of the sun, its trajectory, its shadows, and its penumbra.
The diurnal arc—the apparent parabola of the sun across the sky—was translated into the movements of the KUKA: imaginary orbits and lines drawn in space. From this poetic vision, the technical work began: building a system that allowed the robot and the beams of light to speak the same language. Solar evolved as a joint exploration between art and engineering, where every gesture of the robot becomes a luminous vibration, a chiaroscuro, a dynamic sculpture made entirely of light and shadow.
Derivative: A KUKA robot and a large array of moving heads perform as technological “actors,” and orchestrating their choreography at this scale was no small feat. What led you to cast these specific tools to achieve the artwork—transforming the sun’s journey from day into night—and what were their qualities that made them right for the role?
Team Solar: The KUKA was chosen for its ability to generate precise, ritualistic, almost liturgical movements. It’s a mechanical arm born for industry, but in its gestures it takes on an almost human physicality: it can trace curves, orbits, and inclinations that recall the sun’s diurnal arc. It becomes the central point from which everything radiates. The hundreds of moving heads represent the essence of crepuscular rays: parallel beams that, due to perspective alone, appear to diverge. Their modifiable nature—intensity, color, aperture, speed—allows us to create the simulated atmospheric conditions Quayola was seeking: mists, chiaroscuro, gradients, shadows, and subtle transitional zones typical of natural light phenomena. The combination of robot and moving heads makes it possible to translate an atmospheric phenomenon into a technological ritual. The robot doesn’t just guide the beams: it is guided by them. The whole system becomes a luminous polyphony, where each light is a note and the KUKA is the conductor.
Derivative: Could you tell us about your team, outline your prior experience with TouchDesigner, and describe the specific roles each of you held on the Solar project?
Team Solar: We are Simone Murtas and Graziano De Vecchis, two programmers and visual designers with different backgrounds but a shared vision: bringing visual language into the realm of real-time. We’ve been using TouchDesigner for years—for us, it has become more than software; it’s a way of thinking. In Solar, our role was to develop the lighting-control system and its integration with the robot. We built the entire framework in TouchDesigner, from pan/tilt calculations to on-site physical calibration. We worked closely with Alessandro Petrone from Quayola Studio, who developed the Ableton Live component, and with Quayola, who conceived the project, curated the artistic concept, and directed it overall.
Derivative: Can you walk us through the ten-minute performance from a story and technical standpoint?
Team Solar: The performance is a poetic compression of an entire day: a continuous sequence of digital sunrises and sunsets. The narrative unfolds through cycles in which light emerges from darkness, expands, multiplies, fragments, and then retreats—just like real twilight conditions where light and shadow alternate with heightened drama. The lights don’t simply follow the robot: together they generate geometric architectures of light, perspective structures, and dynamic patterns reminiscent of crepuscular rays.
Chromatic variations, flickers, penumbra, and optical effects are all driven by algorithmic simulations that modulate the “luminous matter” of the piece.
The result is a luminous symphony: the robot traces invisible orbits, the lights respond by creating structures that seem to expand in space, and the entire environment becomes a continuously shifting atmospheric landscape.
Derivative: You approached the Derivative development team because you needed to drive a large number of pan/tilt spotlights in real time—200 in Rome—a problem that wasn’t easily solved. What techniques had you originally tried, and what did the final solution entail?
Team Solar: At the beginning, we tried everything: TouchDesigner’s inverse-kinematics system and Bones components, and then a Python script that calculated pan and tilt mathematically. The problem was that both methods produced unexpected rotations and numerical instability, especially with fast movements or dynamic targets. When we spoke with Markus Heckmann from Derivative, we discovered they were internally developing a new Pan Tilt CHOP. This led to a direct collaboration: we tested the system under real-world conditions, and they refined the calculations.
The result was an extremely precise system capable of handling hundreds of fixtures in real time while maintaining perfect sync with the robot and audio.
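The instability the team describes is a common pitfall: a naive pan computed with `atan2` jumps by 360° whenever a moving target crosses the ±180° seam, which a physical yoke reads as a full-speed spin. As a minimal illustrative sketch (not the Solar code or the Pan Tilt CHOP internals; a Y-up coordinate system and the function names are assumptions), here is the naive calculation plus a shortest-path unwrap that keeps pan continuous frame to frame:

```python
import math

def pan_tilt(target, fixture_pos):
    """Naive pan/tilt (degrees) aiming a fixture at a world-space target.
    Assumes Y-up: pan rotates about Y, tilt is measured from the horizon."""
    dx = target[0] - fixture_pos[0]
    dy = target[1] - fixture_pos[1]
    dz = target[2] - fixture_pos[2]
    pan = math.degrees(math.atan2(dx, dz))                 # jumps at the ±180° seam
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

def unwrap(prev_pan, new_pan):
    """Shortest-path continuation: carry pan across the seam instead of flipping.
    e.g. 179° followed by a raw -179° becomes 181°, a 2° move, not a 358° spin."""
    delta = (new_pan - prev_pan + 180.0) % 360.0 - 180.0
    return prev_pan + delta
```

Unwrapping alone still leaves edge cases (targets passing near the fixture's zenith, per-fixture range limits), which is why a dedicated, well-tested operator like the Pan Tilt CHOP pays off at this scale.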
Derivative: From the calibration side of things, while the installation is developed with a virtual model, how do you prepare when coming from an ideal virtual world to the real physical space where offsets in how lights are mounted will impact the tracking of a moving object? Did you develop a calibration technique beforehand that allows for fairly quick calibration during setup?
Team Solar: This was perhaps the most complex and underestimated part. When you mount hundreds of fixtures in a real environment like the Gazometro, even a small error of a few degrees becomes enormous in the final result. So we developed an internal calibration tool: using known points on the stage, the system automatically corrects each fixture’s mounting offset. In practice, just a few minutes are enough to realign the entire real-world system with the 3D model, even under tight setup conditions.
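The core idea of such a calibration pass can be sketched simply: aim each fixture at a surveyed point, note the pan/tilt it actually needed, and subtract the model's prediction. The difference is that fixture's mounting offset, applied to every subsequent command. This is a hedged sketch of the principle (the team's actual tool is not published; the coordinate convention and names are assumptions):

```python
import math

def aim(target, fixture_pos):
    """Ideal pan/tilt (degrees) from the 3D model, Y-up convention assumed."""
    dx = target[0] - fixture_pos[0]
    dy = target[1] - fixture_pos[1]
    dz = target[2] - fixture_pos[2]
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

def mounting_offset(fixture_pos, known_point, measured_pan, measured_tilt):
    """Offset = angles the real fixture needed minus what the model predicts.
    Subtracting this pair from future commands realigns rig and simulation."""
    ideal_pan, ideal_tilt = aim(known_point, fixture_pos)
    return measured_pan - ideal_pan, measured_tilt - ideal_tilt
```

With two or more known points per fixture, the same residuals can instead feed a least-squares fit that also recovers rotational mounting error, not just a constant bias.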
Derivative: How do you convert the KUKA arm’s 3D coordinates into TouchDesigner space, calibrate beam convergence, and output precise DMX pan/tilt values on-site?
Team Solar: The robot and TouchDesigner share the same 3D reference system, derived from Alembic files exported from Maya. The target positions are converted into absolute coordinates and then transformed into each fixture’s local space.
Our system doesn’t just point to single targets: it can generate complex geometric targets—circles, spheres, grids—and dynamically assign groups of lights to compose these geometries in real time.
Fixtures can be dynamically assigned to multiple simultaneous targets, distributing motion and creating ever-changing luminous configurations. The engine can also move along the Z axis, following 3D trajectories even for a single target, creating effects of depth and spatial rotation. Finally, pan and tilt values are calculated with millimetric precision and sent via sACN/ArtNet, ensuring perfect alignment between simulation and reality.
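The pipeline described above—shared world space, per-fixture local transform, then DMX output—can be illustrated in miniature. This sketch assumes a fixture whose mounting rotation is a single yaw about Y (a real rig would use the full mounting matrix) and a typical 540° pan range; none of these names come from the Solar codebase:

```python
import math

def world_to_local(target, fixture_pos, fixture_yaw_deg):
    """Express a world-space target in the fixture's local frame.
    Simplification: mounting rotation is yaw about Y only."""
    dx = target[0] - fixture_pos[0]
    dy = target[1] - fixture_pos[1]
    dz = target[2] - fixture_pos[2]
    a = math.radians(-fixture_yaw_deg)          # inverse of the mounting rotation
    lx = dx * math.cos(a) + dz * math.sin(a)
    lz = -dx * math.sin(a) + dz * math.cos(a)
    return lx, dy, lz

def to_dmx16(angle_deg, range_deg):
    """Map an angle centred on 0 into a 16-bit coarse/fine DMX channel pair,
    as used by moving heads for high-resolution pan/tilt."""
    v = int((angle_deg / range_deg + 0.5) * 65535)
    v = max(0, min(65535, v))
    return v >> 8, v & 0xFF
```

The 16-bit split matters at this scale: with an 8-bit channel alone, a 540° pan range quantizes to steps of more than 2°, which is visible as stepping when hundreds of beams track a slow robot arc.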
Derivative: What methods keep latency ultra-low—and safely contained—between Ableton OSC triggers, TouchDesigner processing, and DMX, ensuring perfectly timed, audience-safe beams?
Team Solar: The key was an optimized OSC pipeline and a modular process structure in TouchDesigner. Each macro receives OSC signals directly from Ableton, avoiding intermediate layers. The system operates entirely in real-time floating-point, with minimal buffering and a shared clock.
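Part of why a direct Ableton-to-TouchDesigner OSC link stays fast is how lightweight the protocol itself is: each trigger is a few dozen bytes over UDP. As a stdlib-only sketch of the wire format (illustrative only; the address `/solar/go` is made up and this is not the project's code), a minimal OSC message encoder looks like this:

```python
import struct

def _osc_string(s):
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    """Encode a minimal OSC message supporting int and float arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)      # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)      # 32-bit big-endian int
        else:
            raise TypeError("only int/float shown in this sketch")
    return _osc_string(address) + _osc_string(tags) + payload
```

A cue trigger like `osc_message("/solar/go", 1)` is 20 bytes on the wire, so with no intermediate layers the dominant latency is the frame time of the receiving process, not the transport.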
Derivative: How scalable and reusable is your TouchDesigner framework for other venues or larger fixture counts, and what native features do you still wish TouchDesigner offered for light-robot integrations?
Team Solar: The framework is fully scalable: we can control just a few lights or entire stages with hundreds of fixtures. Everything is parametric and can be reconfigured automatically by reading scene data (MVR files or 3D coordinates). TouchDesigner allowed us to create something alive and adaptable. If we could wish for one more thing, it would be native support for 3D fixtures and industrial robotics, to further reduce the gap between simulation and the physical world.
Derivative: Do you have other Solar performances scheduled?
Team Solar: We’d absolutely love to. We’re already discussing new international destinations, but for now we can’t reveal much. Every new installation is an opportunity to push Solar’s technical and poetic language even further.
Derivative: Simone and Graziano, what is next for you?
Team Solar: Let’s just say the next project will be a large-scale interactive experience—a big stage production where the audience becomes an active part of the choreography. It’s still top secret, but we can say it will be something never seen before. After Solar, our goal is to continue building bridges between light, movement, and perception. That’s what we do best.
Follow Quayola Website | Instagram
Follow Simone Murtas Website | Instagram
Follow Graziano De Vecchis Instagram