Company Post

How SETUP Did a World Tour For the Red Hot Chili Peppers

With the release of their twelfth studio album Unlimited Love, iconic American rock band the Red Hot Chili Peppers launched their Unlimited Love World Tour 2022 with 46 sold-out stadium-capacity shows (thus far). For the two-hour duration of each show, Chili Peppers fans were rocked and rolled through a transportive journey of legendary hits and new material, and treated to an expansive, immersive light and video show so precisely executed that we did not see a single frame drop. Central to the show's visual excellence was the idea of lighting and production designer Scott Holthaus that it be driven by generative video. Holthaus contracted Studio SETUP, designers of stages, installations and multimedia works of expansive architectural proportions in their own right, to produce this content. We caught up with the SETUP team to talk about their experience working on a show this massive, and learned how TouchDesigner was used for everything from generating the content itself to assembling the system in which it was generated and acting as an autonomous machine for running the entire show.

Derivative: How did this project come about?

SETUP: In April 2022, the American rock band Red Hot Chili Peppers released their twelfth studio album Unlimited Love, which topped the Billboard chart on release. Along with the album, the band launched their massive Unlimited Love World Tour 2022.

Derivative: What is the World Stadium Tour?

SETUP: When SETUP joined the project, two legs of the tour were planned: the European leg and the American leg. A total of 38 concerts from the beginning of June to the end of September, so approximately two to three concerts a week, with a short 10-day break for moving from Europe to the USA. The band recently added six more shows, in January 2023, in Australia and New Zealand.

Derivative: How is the tour organized?

SETUP: It's simple enough. First, the design and technical documentation of the show are created, equipment and structures are bought or rented, and the script for the show and the video and light content are developed. Then come the first test runs on the full setup, where the creative and technical teams work out all the processes of installation, connection, debugging and dismantling, so that no time is wasted during the tour; time is one of the most expensive resources. Then everything is packed up, together with all the necessary specialists, and the tour begins. At each venue the setup is assembled on the day of the concert in 6-10 hours (the time decreases with each concert); immediately after the show it is dismantled and moves on to the next city.

Derivative: How did SETUP get involved with the show? 

SETUP: RHCP approached our team. We were contacted by their lighting and production designer Scott Holthaus, who has been with the band for over 20 years. He liked our work, so we met, talked it through and made a presentation. The members of the band liked it and in the end we were brought onboard. RHCP considered proposals from the best studios in the world, but for some reason they chose us. It seems we were better suited to the punk level; we simply have no other explanation.

Derivative: What are you building for the Red Hot Chili Peppers tour?

SETUP: We created the main idea of the stage together with Scott, and developed the content, the show control system, the synchronization of video content with the light, and so on.

All the content for this tour was generative, so that no track and no show was ever exactly repeated; much of it was created in realtime.

This created a lot of challenges, for programming and beyond. We made more than 150 scene algorithms, each of which could carry a whole show. This approach was largely dictated by the live performance of the musicians themselves, who constantly improvise during the event. There is an optional set list but no script; the band can spontaneously jam, and they can play tracks shorter or faster. It's pretty crazy!

Derivative: What is the design concept? Who are you designing it with?

SETUP: The first concept was originally suggested to us by Scott. It was a curved screen, similar to the one we did for the NECHTO event: a huge screen on the ceiling, curving down towards the wall. Initially we were asked to make content for this screen, but we wanted to do more, to come up with a show and improve the concept. A special floor and various bends at the bottom and top of the screen were added to the original concept. We also offered a content management system, generative content and more, and the stage design was optimized accordingly.

To clarify, for the tour we came up with a central curved LED screen. This is the main part of the setup, along with two side screens on which the artists are broadcast. All of this consists of pre-assembled modules which are then put together. Around the perimeter of the central screen are pixel fixtures with very bright diodes and a narrow zoom. These fixtures sync with the content and work great together. On the sides of the main screen there are four large matrices: large light squares, two on each side. Each matrix is suspended on computer-controlled winches, which have a whole repertoire of choreography and can sit at different heights. They are also synchronized with the content. There are many front and side spotlights, sidelight that also works with the artists, and a very powerful set of operator-controlled followspots.

Derivative: What devices and technology are you using for this project?

SETUP: To be specific:

  • MAIN Screen: 13.2 x 36.6 m (RIG A)
  • SIDES Screen: 2 x 9.6 x 12 m
  • About 220 linear diode fixtures frame the main and side screens and work in sync with the content
  • 100 Magic Panel fixtures are located in four matrices on the sides of the main screen. They move on computer winches and are synchronized with the content
  • 40 Proteus Maximus instruments
  • About 90 diode stroboscopes, which are also synchronized with the content
  • 12 Followspots

We especially want to highlight our control system. Our setup is a cross between the grandMA2 control system and TouchDesigner. The entire show, including TouchDesigner, is controlled from grandMA. There are two grandMA consoles in the control room. The first, main console controls the overall structure of the show: cue lists, lights and so on. The second console serves as a video controller, from which the operator works live and manages the content. Without such lively and flexible control, this would not have turned out to be nearly as interesting a show.

Derivative: Can you explain how you are using TouchDesigner?

SETUP: We used TouchDesigner for everything. With TD we assembled the entire system in which the content is generated. Some pieces were pre-rendered for very specific tasks, for example content generated by a neural network; we made it in advance and played it back.

TouchDesigner works as an assembled media server and an autonomous machine for working with the whole show.

The content is generated in TD; in the same system it is controlled and receives signals from outside. It is immediately mixed and processed, and all sorts of manipulations are performed there. There is also processing of incoming images from the video cameras: applying effects to them, adding them to the overall mix and displaying them on the screens. Technically speaking, the system runs as four instances. Two instances are mirrored; each has the same set of content, and scenes are selected there. The third instance is responsible for receiving and processing the video signal coming from six cameras; all camera selection and effect application happens there. Finally there is the main, master instance, where all content is mixed.
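As a rough illustration of that four-instance layout, here is a minimal Python sketch. The class names, scene set and frame representation are our own inventions, not SETUP's actual network: two mirrored scene instances share a scene set, a list of floats stands in for the camera instance's processed feed, and a master mix step crossfades between them.

```python
# Illustrative sketch of the four-instance layout (all names hypothetical):
# two mirrored scene instances, a camera instance, and a master mixer.
# "Frames" are flat lists of pixel brightness values standing in for video.

class SceneInstance:
    """Holds a set of scene generators and renders the selected one."""
    def __init__(self, scenes):
        self.scenes = scenes              # scene name -> frame generator
        self.current = next(iter(scenes))

    def select(self, name):
        self.current = name

    def render(self, t):
        return self.scenes[self.current](t)

def mix(scene_frame, camera_frame, fade):
    """Master instance: crossfade scene content with the camera feed."""
    return [(1 - fade) * s + fade * c
            for s, c in zip(scene_frame, camera_frame)]

# Two mirrored instances share the same scene set, so either can take over.
scenes = {
    "solid": lambda t: [0.5] * 4,
    "ramp":  lambda t: [i / 3 for i in range(4)],
}
instance_a = SceneInstance(scenes)
instance_b = SceneInstance(scenes)     # mirror of instance_a
camera_frame = [1.0] * 4               # stand-in for the camera instance

instance_a.select("ramp")
backup = instance_b.render(0)          # the mirror renders the same scenes
output = mix(instance_a.render(0), camera_frame, fade=0.25)
```

The point of the mirrored pair is redundancy: because both hold the same content, either can drive the screens if the other fails.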

Derivative: What is live and what is pre-rendered? Is there any live control during the show, and by whom? Is the team present at the shows?

SETUP: From the very beginning we had a set of very varied generative content, which we repeatedly changed and updated. Many of these content scenes are built to be mixed and matched. It is not particularly important that specific content exists for a specific composition, as everything can change; the connection between a track and its content is quite loose for us. We often experiment and swap pieces of content between and even during tracks. It turns out differently in different shows, and there is a lot of improvisation.

With the help of grandMA you can build a variety of content behavior logic: everything that will happen to the content and how it will change, what is programmed and what is controlled live.

At the early shows there was always someone from the team on site to finish things off, adjust the functionality of the system, debug everything and make sure it all worked. The show is now halfway through its tour and no one from our team is present; it seems to work almost perfectly. Present during the event itself are the lighting designer (Scott Holthaus), the person who controls live content (Joshua Upton, let's call him the VJ), and the system engineer (Leif Dixon). The engineer is in charge of the consoles and of ensuring that the entire system works correctly. The overall control system is quite large; it includes not only light and content control but also the video cameras and all sorts of routing.

Derivative: What is the grandMA sending to TouchDesigner and how is live video worked into your TouchDesigner patches and the grandMA controls?

SETUP: grandMA sends a set of different parameters to TouchDesigner, transmitted via the sACN (streaming ACN, where ACN stands for Architecture for Control Networks) protocol, which carries DMX over IP networks using UDP. TouchDesigner supports this protocol well, so it is easy to set up communication between grandMA and TouchDesigner. In our case there are two logical blocks of these parameters (of which there are actually a lot). The first block refers to the media-server part: it is responsible for choosing which screen to send content to, how to mix layers of content, and how to move content around the screen. The other block relates to content management: the choice of scenes, their speed and their parameters. For example, there is a particle system, and using the parameters that grandMA sends to TouchDesigner you can control the generation rate, direction, size and shape of the particles. During the performance of a composition, the VJ can control all of this through grandMA.
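As a simple illustration of that second block, here is a hedged Python sketch of how a slice of DMX channels (8-bit values, 0-255) might be scaled into particle parameters. The channel assignments, ranges and shape list are our own invention, not the show's actual patch.

```python
# Hypothetical mapping of a DMX channel slice to particle parameters.
# Each channel is an 8-bit value (0-255), as delivered over sACN.

def dmx_to_particle_params(channels):
    """channels: list of 8-bit DMX values.

    ch0 -> birth rate (0..1000 particles/s)
    ch1 -> direction  (0..360 degrees)
    ch2 -> size       (0.1..4.0)
    ch3 -> shape      (index into a fixed shape list)
    """
    shapes = ["point", "quad", "sphere", "ribbon"]
    return {
        "rate": channels[0] / 255 * 1000,
        "direction_deg": channels[1] / 255 * 360,
        "size": 0.1 + channels[2] / 255 * 3.9,
        # Integer-divide so the full 0-255 range spreads evenly over shapes.
        "shape": shapes[channels[3] * len(shapes) // 256],
    }

params = dmx_to_particle_params([128, 64, 255, 200])
```

A fader move on the grandMA console then becomes nothing more than one channel value changing, which this mapping turns into a smooth parameter sweep on the particle system.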

Derivative: Describe a bit the TouchDesigner content you created for the show.

SETUP: The content was extremely varied. At first we worked in a figurative style: initially there were a lot of font compositions, text generators and 3D models of people, but we soon moved away from this towards abstraction. The show content is now dominated by various colored noises, abstract shapes, particle systems and shaders written in GLSL.
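To give a flavor of the "colored noise" style (purely illustrative; the show's real content runs as GLSL shaders on the GPU), here is a tiny Python sketch that low-pass filters random values into smooth noise and then tints the grayscale result with a color.

```python
import random

# Toy value-noise generator: raw random samples are low-pass filtered so
# neighboring pixels correlate, then multiplied by an RGB tint.
# Everything here is illustrative, not the show's actual shader code.

def value_noise_row(width, seed, smoothing=0.8):
    """One row of smoothed noise values in [0, 1]."""
    rng = random.Random(seed)
    row, value = [], rng.random()
    for _ in range(width):
        # Keep most of the previous value; blend in a little fresh noise.
        value = smoothing * value + (1 - smoothing) * rng.random()
        row.append(value)
    return row

def tint(row, rgb=(1.0, 0.4, 0.1)):
    """Multiply a grayscale row by a color, giving 'colored' noise."""
    return [(v * rgb[0], v * rgb[1], v * rgb[2]) for v in row]

frame_row = tint(value_noise_row(8, seed=42))
```

Seeding the generator makes each frame reproducible, while changing the seed (or the smoothing factor) per frame yields the constantly shifting texture the style relies on.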

Derivative: What members of the TouchDesigner community contributed to creating content?

SETUP: Among the artists were Josef Pelz, Aurelian Ionus, Alexander Korneev, Arina Adanovskaya, Alexander Dzhezus, Pavel Zmunchila, Maxim Zebrev, Elvin Mamedov. We have been working with some of them for a very long time, others are already an important part of our team. There will be more content from artist pppanik coming soon.

Derivative: If you were approaching this project again from scratch what would you do differently?

SETUP: At first, when developing the content, we did not have a direct line to the members of the band, and for a long time we worked on figurative content with very literal imagery; it turned out that the band did not see our content for much of the preparation of the show. As a result, at some point all the content had to be radically redone. We made new mood boards and went towards abstraction. We spent a lot of time and nerves redoing the content before the show, but Marcia Frusciante helped us a lot with different options. From then on we had the right communication with the band.

Optimization is the most important thing we've learned from this show.

We also quite literally tried to reinvent the wheel, pushing very utilitarian elements and a lot of utility logic into TouchDesigner, for example the distribution of content across screens and the creation of pixelmaps. We made a number of architectural mistakes in building the TouchDesigner system, piling up a lot of utility logic and not optimizing a number of processes. Now we would probably keep only the generation and management of content in TouchDesigner, and hand everything related to displaying content on the screens (which is a fairly large amount of logic) to an additional media server. When you add utilitarian functionality, you inevitably run into memory limits and optimization problems.
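The pixelmap idea mentioned above can be sketched roughly as follows: one large canvas is generated, and each physical screen takes a crop region of it. The packing scheme, screen names and resolutions below are made up for illustration, not the tour's real values.

```python
# Hypothetical pixelmap builder: screens are packed left to right on one
# canvas, and each gets a crop rectangle into the generated frame.

def build_pixelmap(screens):
    """screens: list of (name, width, height) in pixels.

    Returns (canvas_size, crops) where canvas_size is (width, height)
    and crops maps each screen name to its crop rectangle.
    """
    x = 0
    height = max(h for _, _, h in screens)
    crops = {}
    for name, w, h in screens:
        crops[name] = {"x": x, "y": 0, "w": w, "h": h}
        x += w          # next screen starts where this one ends
    return (x, height), crops

canvas, crops = build_pixelmap([
    ("side_left", 960, 1200),
    ("main", 3660, 1320),
    ("side_right", 960, 1200),
])
# canvas is (5580, 1320); crops["main"] starts at x == 960
```

In a real media server this table would drive the crop/route stage, so the generative content never needs to know how many physical screens exist.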

Derivative: What were the greatest strengths of TouchDesigner in building a show of this scale?

SETUP: The greatest strengths of TouchDesigner in this case were definitely the scalability and flexibility of the system. All the content we created was easy to use on any setup and easy to rebuild, and we had the ability to optimize content and use very different tools. The same problem can be solved in many ways, with varying efficiency.

Derivative: What’s next for you?

SETUP: We look forward to the next big gig from Transmoderna in London and a new project at Nxt Museum in Amsterdam. And who knows where else it will lead us!



Creative Direction: Scott Holthaus, Aura T-09

Production Design: Scott Holthaus, SETUP

Show Programming: Rad Island / Leif Dixon

Creative Director: Marci Frucianti

Generative Content Live Control: Josh Upton 

Video Live Stream Director: El Jefe Elizondo

Production manager: Narci Martinez

Lighting: Premier Global Production

Content Art Direction: Aura T-09, Daniil Kutuzov

TouchDesigner Programming: TRIX TEAM, SETUP

Content Development Team: Pavel Zmunchila, Josef Pelz, Forkni, Aurelian Ionus, Sinica, Arina Adanovskaya, Alexander Dzhezus, Ryabtsun, Elwvnn, Maxim Zebrev, TRIX TEAM