Maotik (Mathieu Le Sourd) is a French new media artist based in Montreal. He has been working on various projects in TouchDesigner, developing generative visuals and engaging participants with interactive environments. Last year he focused on the full-dome spherical environment of the SATosphere at the Société des Arts Technologiques (SAT). Along with electronic composer Fraction, he produced two immersive performances/installations, Dromos and ObE. Presented for the first time during last year's MUTEK Festival, Dromos was critically acclaimed and widely covered by the media.
Dromos - An Immersive Live Performance
"Get lost in a symphony celebrating the science of speed"
CONCEPT
"In the field of science and technology that built the Western world, there are only machines that accelerate. Whoever invents a machine to slow down will be considered an absurd man"
-- Paul Virilio, 1991
The performance is based on the philosophical concept of "dromology" (the science of speed), as described by theorist Paul Virilio, which questions the relationship between the speed of technological advancement and progress. We were inspired by his work because it combines philosophy and technology, and in 2009 we started to develop ideas about time and space. Our objective was to build an environment that would amplify the feeling of speed and acceleration. I built a maquette and imagined a space where a rig of lights, projection and a sound system would surround the public. The idea was to place the audience at the centre of a subtle interaction between sounds, lights and visuals in an ephemeral ceremony.
We started to work on Dromos in 2010 after receiving a grant from the CNC (Centre National du cinéma et de l'image animée), and had the chance to present a beta version in Rennes (France) in 2011 for the Electroni[k] Festival. That was before I moved to Montreal and discovered the SATosphere and its capabilities. I submitted the project and started to work on an improved version, which was presented last year during the MUTEK Festival.
3D REALTIME GENERATIVE VISUALS
I am interested in developing intuitive tools and interfaces that give me full control of different media; they give me the ability to explore new forms of experience and language. I have always been more interested in event-based visuals than in timeline videos. Even if you don't reach the same rendering quality with generative visuals, this is compensated for by the potential for iteration and the ability to improvise. In that way each performance can be different, leading to a creation open to a multitude of perceptions.
I started to learn TouchDesigner two years ago during a workshop the Derivative team organized for the MUTEK Festival. Before that I used to design my own VJ software in Max/MSP/Jitter, and later a combination of applications interconnected through Syphon. Now with TouchDesigner I have all I need.
SPHERICAL ENVIRONMENT
I was introduced to spherical visual techniques during a 16-hour workshop at the SAT. The most difficult part was becoming familiar with the image distortion: in such a space, what you see is NOT what you get. Realtime applications are really handy in these situations, as you can immediately test and modify your content in real time. I can't imagine the process without them.
Building the environment with 3D geometry was the most appropriate approach, as I wanted to focus on the architecture rather than the image composition. My main objective was to play with the perception of the space, modifying its physicality to give a feeling of destabilization. In order to make the spherical shape disappear, I designed a series of optical illusions that occasionally made the rounded ceiling of the theatre look like a room or an infinite hallway up to the sky.
LIGHT SYSTEM
In order to amplify the feeling of acceleration, we use a DMX-controllable system that surrounds the entire space with LED lights running on a circular trajectory at the speed of the music's tempo (BPM). The lights' positions follow the sound spatialization, giving a clear sense of its circulation through the space. As all the media are connected, the lights also react to the projected images.
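As a rough illustration of the idea (a sketch, not the actual Dromos patch), a light chase that completes one revolution of an LED ring per beat can be derived from the BPM; the function and parameter names, and the soft falloff, are assumptions for the example:

```python
import math

def led_ring_levels(num_leds, bpm, t, width=2.0):
    """Per-LED brightness (0-255) for a chase that completes one
    revolution of the ring per beat at the given BPM.
    t is elapsed time in seconds; names are illustrative."""
    beats = t * bpm / 60.0            # elapsed beats
    head = (beats % 1.0) * num_leds   # position of the chase head on the ring
    levels = []
    for i in range(num_leds):
        # circular distance from this LED to the chase head
        d = min(abs(i - head), num_leds - abs(i - head))
        # soft falloff so the moving light trails a short tail
        levels.append(int(255 * max(0.0, 1.0 - d / width)))
    return levels

# At 120 BPM the chase makes two full revolutions per second:
frame = led_ring_levels(num_leds=32, bpm=120, t=0.25)
```

Each returned list maps directly onto one DMX channel per LED, so the same clock that drives the music tempo drives the light positions.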
The result is a live performance in which the public is surrounded by minimal, experimental sound textures converged and synchronized with sleek, dynamic visuals evoking the aesthetic of speed. The visuals are manipulated in real time within TouchDesigner and react to sound analysis: bass, beats and other sound effects. Dromos is a virtual architecture that can be simultaneously expanded and contracted, bent and distorted in response to the evolution of the performance (a tempo-based time curve). Therefore each showing presents an entirely new visual and sonic exploration. Over forty minutes, the audience experiences a destabilizing environment where the media flux evolves slowly and the time curve is gradually deconstructed to the point of complete inertia.
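The mapping from sound analysis to visuals can be sketched as a small function; the band names and visual parameters below are hypothetical placeholders, in the spirit of driving geometry from analyzed audio levels rather than the actual show network:

```python
def map_audio_to_visuals(bass, mid, treble, kick):
    """Map normalized band levels (0.0-1.0) and a beat trigger to
    illustrative visual parameters. All names are assumptions."""
    params = {
        # bass swells expand the virtual architecture
        'scale': 1.0 + 0.5 * bass,
        # mid frequencies drive noise displacement on the geometry
        'noise_amp': 0.2 * mid,
        # treble brightens the textures
        'brightness': 0.4 + 0.6 * treble,
    }
    # a detected kick fires a one-frame flash, echoed by the DMX lights
    params['flash'] = 1.0 if kick else 0.0
    return params
```

Because every parameter is a pure function of the incoming analysis, the same patch yields a different result at every performance, which is the point of the event-based approach described above.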
Dromos is composed of eight phases:
Phase 1 : Parasite
A grid made of 3D instances comes slowly from the ceiling, waves reveal different faces of the cube textures.
Phase 2 : Time is Noise
A morphing cube made of noise digital textures that reacts to sound frequency.
Phase 3 : The Rise
3D tube instances rise up to the ceiling, sound sample visualisations are triggered sporadically.
Phase 4 : Forms
3D forms moving in an organic way redefine the physicality of the space.
Phase 5 : Fall
The accident: elements fall to the ground, revealing a 3D landscape on the horizon where the lighting gives all the depth. The LED strip flashes at the position of each impact, creating a feeling of chaos.
Phase 6 : Acharnement
Glitches and decomposition: fragments of elements are distorted according to the sound attack.
Phase 7 : Denouement
3D mesh lines come alive with the sound analysis; all elements are attracted by a force that creates a unique structure, forming an organic ensemble. The sound envelope modifies the position, shape and size of the structure.
Phase 8 : Chaos
Audio reactive circle shapes, strobes and flashing lights close the performance.
INSTALLATION MODE
A few months after the MUTEK Festival, we started a new one-week residency at the SATosphere. The idea was to develop the successor to Dromos as an installation. Having explored the theme of dromology for a long time, we decided to move on to something else. Controlling such an immersive system with just our iPad had been a unique experience as a user; it gave us the idea of setting up physical interfaces at the center of the dome that would enable the public to generate visuals and sound in real time.
ObE - AN IMMERSIVE INTERACTIVE ENVIRONMENT
ObE is a living digital organism whose erratic behavior is impossible to predict. The audience is invited to participate using sensors that cause the organism to react. Operating like a theremin, each hand generates visual and sound events, turning the show into a live collective performance. The unique capabilities of the SATosphere yield an immersive audio-visual narrative in an impossible space. Born from the interplay between audiovisual art and interactive installation, ObE is a unique sensory experience. Responsive generative 3D visuals take the viewer on a journey that is experimental and organic. The scenography boasts a circular interface equipped with infra-red sensors, synchronized with a rig of DMX lights and a 3D mapping projection, to make an unconventional orchestra.
3D INTERACTIVE MAPPING
The possibility of floor projection gave us the idea to reshape the space: by placing a prism sculpture made of foam board at the center of the environment, we aimed to represent the core of the creature and offer a way to connect with ObE. We designed a 3D model and exported it to a piece of software called Pepakura Designer, which unfolds it and gives you the pieces you need to cut out in order to reconstruct it. Then, in TouchDesigner, we used CamSchnappr to adapt the mesh and project onto it.
(Fred Tretout, Stefano Gemmellaro, Tina Salameh - work in progress)
Alternating between chapters pre-determined by the artists and interactive ObE moments, the dynamic 35-minute framework facilitates an exploration of the work's ambient ecosystem. The project is a place where minimalistic, pared-down generative 3D visuals meet 5.2 sound processing with electro-acoustic influences.
LEMUR - TOUCH CONTROL
For most of my projects I make intensive use of the Lemur to control TouchDesigner and other devices (DMX, sound, ...). The In-App Editor is very useful, as it lets you design your templates on the fly without the need for a computer. Because of its wide choice of UI objects (MultiBall, sequencer, ...) and their physics, it becomes simple to automate sequences and to use it as an animation tool.
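On the receiving side, controller messages of this kind (Lemur typically sends OSC) boil down to routing an address and a 0.0-1.0 fader value onto a parameter range. A minimal sketch of such a dispatcher, with entirely hypothetical addresses and parameter names:

```python
def dispatch_control(address, value, targets):
    """Route an incoming control message (e.g. from a Lemur template)
    to a visual parameter. Returns (parameter_name, scaled_value),
    or None for unknown addresses. All names are illustrative."""
    if address in targets:
        name, lo, hi = targets[address]
        # remap the controller's 0.0-1.0 range onto the parameter's range
        return name, lo + (hi - lo) * value
    return None

# hypothetical routing table for the example
targets = {
    '/dromos/speed': ('rotation_speed', 0.0, 10.0),
    '/dromos/fog':   ('fog_density', 0.0, 1.0),
}
```

Keeping the routing in one table like this makes it easy to redesign the Lemur template on the fly without touching the rest of the patch.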
PLANETARIUM RESIDENCY
Last January, AADN (an association from Lyon) invited us to a one-week residency at the planetarium of Vaulx-en-Velin in France. I planned to perform real-time visuals in the dome, but due to a lack of time and equipment I had to forget about that option, so I optimised my system to export pre-rendered content. I used the Record CHOP to save the data to an external file and then reloaded it to play back the automation.
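The record-then-replay idea can be sketched in a few lines; this is only an analogy for the workflow, not the Record CHOP's actual file format, and the function names are invented for the example:

```python
import json

def record_samples(path, samples):
    """Save a list of per-frame parameter dictionaries to a file,
    analogous to dumping recorded channel data for later playback.
    (A sketch of the idea, not a real TouchDesigner format.)"""
    with open(path, 'w') as f:
        json.dump(samples, f)

def play_back(path):
    """Reload the recorded samples and yield them frame by frame."""
    with open(path) as f:
        for frame in json.load(f):
            yield frame
```

Once the performance's parameter curves are captured this way, the same show can be replayed on a machine that lacks the horsepower to generate everything live.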
30 YEARS APPLE
You might have noticed Apple's latest TV spot. The Apple team shot short films with iPhones all over the world, demonstrating the many fields where the use of Mac devices has changed, and will continue to transform, everyday life. The Dromos project is showcased, placing the live performance among the most cutting-edge and immersive projects. We were approached by an agency in Paris telling us that Apple was interested in the way we were using the iPad and wanted to feature it in their 30th-anniversary TV spot.
COMING NEXT
My next project is a new AV performance called Durations, at the MAC for the EM15 Festival in Montreal, along with metametric, an electronic composer from Montreal. In Durations, the evolution of the rhythm influences the dimensions of the environment, while the sound granulation shapes the texture of geometrical instances.
Overall I am trying to keep working within immersive environments. I have started to play with the Leap Motion, a 5.1 sound system and a panoramic projection screen, aiming to push the limits of sonic-visual exploration. My objective is to keep exploring TouchDesigner to create an AV instrument for live performances. I like the idea of building new tools and then learning how to use them; it's a powerful means of expression.