"Automation is an indispensable brush. A robot can be both a sculptor and a sculpture. It might not only serve as a creative tool but also emerge as a solo performer — an integral part of the artwork itself."
Derivative: What launched this series of works with robots?
Klim Sukhanov: Our initial interaction with an industrial robot began with the idea of attaching an animation laser projector to the "head" of a massive robotic arm manipulator. This allowed the robot to interact with this light source in a remarkably performative manner, spinning, moving, and directing the laser-projection animation as we wished. Afterwards, we began to delve deeper into this technology, exploring its potential applications and how we could use it in a "different" way.
Derivative: You make it look easy but undoubtedly controlling robots for artworks must have taken a lot of trial and hopefully not too many errors! Can you tell us a bit about this process and your discoveries?
Alexandr Sinitsa: To control the robot, we used a combination of TouchDesigner and VVVV software.
This setup allowed us to send data directly to the robot in real time and to control it using the same system we utilize for creating light and sound movements. In this way, it literally becomes one of our technological brushes.
However, it was crucial to first determine and set the robot's physical limitations to prevent any damage from a single incorrect move. We had to make all adjustments by experimenting with various combinations of robotic movements to fully comprehend its capabilities and limitations. Jumping ahead a little: we did break a camera and lose a number of wires during these experiments.
Therefore, it's worth noting separately: all manipulations with robots should be conducted by licensed professionals only. Safety requirements must always be the top priority.
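The safety step described above can be sketched in code. This is a minimal illustration, not Tundra's actual control software: the envelope values and axis names are invented, and a real setup would enforce limits on the robot controller itself. The idea is simply that every target pose coming from the generative side is clamped to a tested safe volume before it is sent to the arm.

```python
# Hypothetical safe working volume in millimeters (all values are assumptions,
# not real limits for any particular robot).
SAFE_ENVELOPE = {
    'x': (-800.0, 800.0),
    'y': (-600.0, 600.0),
    'z': (200.0, 1400.0),
}

def clamp_target(pose):
    """Return a pose guaranteed to stay inside the tested envelope,
    so one bad value from the generative side cannot crash the arm."""
    return {axis: min(max(pose[axis], lo), hi)
            for axis, (lo, hi) in SAFE_ENVELOPE.items()}

# A runaway z value gets pulled back inside the envelope:
safe_pose = clamp_target({'x': 1000.0, 'y': 0.0, 'z': 50.0})
```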
Derivative: What are your “freeze-light” experiments?
Semyon Perevoshchikov: For our second experiment, we connected "holographic" displays to the industrial robot by mounting LED fans on the "head" of the robotic arm manipulator. The interplay of moving images and robotic movements produced a freeze-light effect. As a result, the robot was able to "paint" with light in space.
Derivative: With the ROW series, can you explain a bit more how you are creating these layers of floating “holographic” images using fans with spinning LED strips?
Semyon Perevoshchikov: Multiple layers of generative visuals are synthesized in TouchDesigner, then individually sent to each holographic display. As a result, we get “sliced” floating 3D images. The generative visuals in TouchDesigner are synced up with sound-synthesis parameters, so when you see and hear it together, it feels more lifelike. A continuously morphing audio-visual object.
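The "slicing" idea can be illustrated with a small sketch. This is an assumption-laden toy, not the actual TouchDesigner network: the volume, resolution, and display count are made up. It shows the core move of cutting one generative 3D field into equal depth bands, each flattened into the 2D image that one LED fan would display, so the stack of fans reconstructs the volume in space.

```python
import numpy as np

NUM_DISPLAYS = 5   # assumed: one LED fan per depth slice
RES = 64           # assumed per-axis resolution of the toy volume

def make_field(res):
    """Toy stand-in for a generative 3D visual: a soft blob in the center."""
    z, y, x = np.mgrid[-1:1:res * 1j, -1:1:res * 1j, -1:1:res * 1j]
    return np.exp(-(x**2 + y**2 + z**2) * 4)

def slice_for_displays(field, num_displays):
    """Cut the volume into equal depth bands and flatten each band into
    one 2D image, ready to send to one holographic display."""
    bands = np.array_split(field, num_displays, axis=0)
    # Collapse each band along depth so every fan shows its own "slab".
    return [band.max(axis=0) for band in bands]

field = make_field(RES)
layers = slice_for_displays(field, NUM_DISPLAYS)
```

The middle layers carry the brightest part of the blob, which is exactly the parallax cue that makes the stacked fans read as a single floating object.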
As for the sound, we connected a couple of desktop synthesizers to Ableton Live and routed them to TouchDesigner via OSC. Ableton sent the starting trigger to TouchDesigner along with all the subsequent automation data that created narrative shifts. This automation data was mapped to visual generative parameters in TouchDesigner and corresponding sound generative parameters on the desktop synthesizers.
Essentially, we created an instrument that simultaneously controls sound and light.
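The routing described above can be sketched as follows. This is a simplified stand-in, not their actual patch: the OSC addresses and parameter ranges are invented, and the hand-rolled encoder exists only to show what travels over the wire (in practice an OSC library or TouchDesigner's built-in OSC operators would do this). The key idea is one automation lane fanning out to both a visual parameter and a sound parameter.

```python
import struct

def osc_message(address, value):
    """Minimal OSC 1.0 encoder (stdlib only): null-padded address,
    ',f' type tag, then one big-endian float32 argument."""
    def pad(b):
        return b + b'\x00' * (4 - len(b) % 4)  # OSC strings pad to 4 bytes
    return pad(address.encode()) + pad(b',f') + struct.pack('>f', value)

def map_automation(value, lo, hi):
    """Scale a 0..1 Ableton automation value into a target parameter range."""
    return lo + (hi - lo) * max(0.0, min(1.0, value))

# One automation lane drives both realms at once (names/ranges are made up):
auto = 0.75
visual_noise = map_automation(auto, 0.0, 2.0)       # e.g. particle turbulence
synth_cutoff = map_automation(auto, 200.0, 8000.0)  # e.g. a filter cutoff in Hz
packet = osc_message('/visual/noise', visual_noise)
```

Sending `packet` over UDP to TouchDesigner's OSC In port would complete the loop sketched here.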
Derivative: ROW: Signals for Space celebrated the 60th anniversary of cosmonaut Yuri Gagarin's pioneering human spaceflight with a special documentation of the installation, shot in one take using an industrial robot moving in sync with real-time generative content and emphasizing the parallax effect of five individually controlled LED fans. Can you explain how you achieved this please?
Klim Sukhanov: For this next artistic experiment, we hired an industrial robot as a cameraman, with its movements synchronized in real time using TouchDesigner and VVVV. The whole piece was shot in one take, with the robot moving in sync with real-time generative content.
The robot sends real-time data (the coordinates of its movements) to TouchDesigner, where visuals, such as particles, are generated in three-dimensional space. So, when the robot changes its position, the same shift is applied to the generative content, allowing it to move in sync with the camera movements.
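The sync principle is simple enough to sketch. This is a hypothetical reduction, not the real pipeline: the class, the millimeters-to-meters scale factor, and the sample coordinate are all assumptions. It shows the one-to-one copy of the robot's streamed tool position onto the virtual render camera, which is what keeps the generative particles anchored in physical space as the camera moves.

```python
# Assumed: the robot reports its tool position in millimeters, while the
# virtual scene works in meters, so a single scale factor links the two.
ROBOT_TO_WORLD = 0.001

class VirtualCamera:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)

    def follow_robot(self, robot_xyz_mm):
        """Apply the robot's real-world motion to the render camera 1:1,
        so generative content shifts exactly with the physical camera."""
        self.position = tuple(c * ROBOT_TO_WORLD for c in robot_xyz_mm)
        return self.position

cam = VirtualCamera()
cam.follow_robot((1500.0, -250.0, 900.0))  # one streamed robot coordinate
```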
Derivative: For the latest evolution of ROW, INEXISTENCE, you combined fans, a robot arm, and a large LED screen to create a constantly morphing video sculpture that was performed in both physical and virtual realms. Can you explain the making of this piece and how you are achieving the effect?
Tundra: For INEXISTENCE we aimed to merge "holographic" displays, an LED screen and an industrial robot into a single video sculpture to summarize our previous experience with these technologies.
With this setup, the screen functioned as a gateway to an enhanced dimension, so the artwork extends itself into the digital realm, enabling the entire performance to take place simultaneously in both real and virtual spaces.
Sound and visuals were generated in real time using a combination of TouchDesigner and Ableton Live. A specially programmed industrial robot captured it on video with a camera mounted on its “head”. Robotic movements were choreographed and synchronized with the generative software, creating a mixed-reality experience using real-time camera tracking.
By incorporating a camera tracking algorithm, we could shift the visual perspective on the LED screen by changing the POV. Thus, the viewer could perceive the enhanced dimensions of the video sculpture in both the physical and virtual spaces.
But the final effect can only be observed on video due to the interconnection between generative algorithms and robotic camera movements.
Therefore, we decided to store the results on blockchain and create a limited collection of NFTs, where each token is a unique collectible video fragment of INEXISTENCE. The complete gallery can be found here.
Derivative: Can you tell us about the visualizer you created in TouchDesigner to test in advance how the setup and perspective changes on the LED screen would appear?
Alexandr Sinitsa: The possibilities of TouchDesigner are endless.
I was able to create a visualizer to test and fine-tune in advance how settings and perspective changes would look on a combination of LED screen and "holographic" displays. This saved a lot of time.
TouchDesigner sent a trigger to the robot through its interconnection with the VVVV software, and the robot began to move, sending its trajectory data back to TouchDesigner. In response, TouchDesigner adjusted the parameters of the visual perspective (POV) according to the robot's current position in real time, effectively creating a custom "camera-tracking" algorithm.
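The POV-adjustment step of that feedback loop can be sketched as a toy function. Everything here is assumed, not taken from their project: the screen position, the angle-based formulation, and the sample robot coordinate are illustrative only (a production version would typically build a full off-axis projection matrix instead). It shows the essential computation: from the robot-reported camera position, derive the viewing angles that skew the rendered perspective on the LED screen so the parallax lines up with the physical camera.

```python
import math

# Assumed position of the LED screen's center in meters.
SCREEN_CENTER = (0.0, 1.2, 0.0)

def pov_angles(camera_pos):
    """Yaw/pitch (degrees) from the LED screen toward the robot's camera,
    used to skew the rendered perspective for viewpoint-correct parallax."""
    dx = camera_pos[0] - SCREEN_CENTER[0]
    dy = camera_pos[1] - SCREEN_CENTER[1]
    dz = camera_pos[2] - SCREEN_CENTER[2]
    yaw = math.degrees(math.atan2(dx, dz))
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return yaw, pitch

# Each streamed robot position yields a fresh POV for the rendered scene:
yaw, pitch = pov_angles((1.0, 1.2, 2.0))
```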
Finally, the camera mounted on the robot's head captured these continuously transforming audiovisual objects that existed simultaneously in both real and virtual realms.
Derivative: What is next on your horizons and are there robots involved?
Tundra: Definitely, we're looking forward to exploring robotics more deeply. Automation is an indispensable brush. A robot can be both a sculptor and a sculpture. It might not only serve as a creative tool but also emerge as a solo performer — an integral part of the artwork itself.