The Human AutomatArt project was originally due in May 2019 for the Look Again Festival, co-commissioned by Aberdeen Performing Arts and New Media Scotland's Alt/w Fund with investment from Creative Scotland, but it was sadly postponed because of the lockdown. It was live on the large screen of Aberdeen's Music Hall throughout November 2021, and it is about to be showcased at the Datasphere exhibition at the Edinburgh Science Festival in April 2022.
In today's world, every human being has become a living set of data, and potentially a content creator. With Human AutomatArt, we transformed Aberdeen's Music Hall into an information collector, gathering data from the human activities taking place there via movement detection, sound, and temperature sensors; these data generate the digital painting on the screen. In this way, the unaware human beings become the autonomous agents whose behaviours provide the structure for the generative art program, and human life and its related activities as a whole become the painting artist.
The project's name, Human AutomatArt, blends the words and concepts Human, Cellular Automata, and Art. Using the logic at the root of AI programming, we want to humanise the process by treating human beings as "cellular automata": independent units with different behaviours that interact with their environment and can generate an interesting emergent complexity, which in turn produces a complex piece of visual art.
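To make the cellular-automaton metaphor concrete, here is a minimal, self-contained sketch of a classic 1D automaton (Rule 110): each cell updates from a simple local rule involving only its two neighbours, yet complex global patterns emerge. This is purely illustrative and is not code from the project itself.

```python
# Minimal 1D cellular automaton (Rule 110): simple, independent units
# interacting only with their neighbours produce emergent complexity.

def step(cells, rule=110):
    """Advance one generation on a circular row of 0/1 cells.

    Each cell's new value is the bit of `rule` selected by the
    3-cell neighbourhood (left, self, right) read as a binary number.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and watch structure emerge.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

In Human AutomatArt the "cells" are people, and the "rule" is their everyday behaviour in the building; the emergent pattern is the painting.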
The idea flourished while we were reflecting on the digital world and the importance of data, and on how to let the real-life analogue world interact with the digital one in an artistic way. The use of sensors was the obvious choice.
We imagined transforming the Music Hall into a living harvester of data: using a webcam as a movement detector, an Arduino board with a temperature sensor, and an audio connection from the main hall, all connected via Ethernet cables to the main computer, we collect the data that drive the behaviour of the generative art program built in TouchDesigner.
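A rough sketch of how three such streams could be folded into drawing parameters: each raw reading is rescaled to a normalised control value. The parameter names and input ranges here are assumptions for illustration, not the actual TouchDesigner network.

```python
# Hypothetical mapping of raw sensor readings to normalised visual
# controls; names and ranges are illustrative assumptions only.

def map_range(value, in_min, in_max):
    """Linearly rescale a raw reading and clamp it to [0.0, 1.0]."""
    t = (value - in_min) / (in_max - in_min)
    return max(0.0, min(1.0, t))

def paint_parameters(motion_px, temperature_c, audio_rms):
    """Turn raw readings into controls for the generative painting."""
    return {
        "brush_speed":    map_range(motion_px, 0, 5000),     # webcam motion
        "palette_warmth": map_range(temperature_c, 15, 30),  # Arduino sensor
        "ink_flow":       map_range(audio_rms, 0.0, 1.0),    # hall audio level
    }

params = paint_parameters(motion_px=1200, temperature_c=21.5, audio_rms=0.3)
```

In TouchDesigner this kind of mapping would typically live in CHOP networks rather than a script, but the principle is the same: every channel of human activity becomes a brush the system paints with.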
People's interaction works only if they are completely unaware of how their behaviour drives the program, because one of the essential criteria of generative art is that the system must be autonomous: independent of outside control, free of any guiding hand. If the human agents act unaware of, or uninterested in, the effects their actions have on the system, they become as valid a data source as any other autonomous object. What we want to accomplish with this project is to take a step back from overwhelming technology and insert the human being into a process that is usually computer-driven, using technology as a means rather than an end, with human activity regaining its central role.
Our latest projects all involve sensors: the autogenerative series The secret music of plants, in which a plant connected to a modular synthesizer through biofeedback sensors generates music; and Aletheia, Marta's final project for her Master's in Sonic Arts, a quadraphonic interactive composition for custom-built EEG devices, in which EEG sensors and an Arduino generate music in Pure Data from brainwaves. The Human AutomatArt project is the logical consequence of these works.
The technique is that of the feedback loops we already used in "Nocturne on Ganimede", here enhanced and enriched to generate the effect of 3D coloured ink pouring out of the screen.
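The essence of a visual feedback loop can be sketched in a few lines: each output frame mixes fresh input with a faded copy of the previous output, so past frames smear into persistent trails. This toy version (a 1D row of brightness values, with an assumed decay factor) only illustrates the principle, not the actual TouchDesigner setup.

```python
# Toy feedback loop: new input is blended with a decayed copy of the
# previous output frame, leaving fading trails -- the principle behind
# the ink-like persistence effect. Decay factor is an assumption.

DECAY = 0.9  # how strongly the previous frame persists

def feedback_step(previous_frame, input_frame, decay=DECAY):
    """Blend new input with the fading previous output frame."""
    return [
        max(inp, decay * prev)  # keep the brighter of input and trail
        for prev, inp in zip(previous_frame, input_frame)
    ]

# A single bright pixel leaves a fading trail over successive frames.
frame = [0.0] * 5
frame = feedback_step(frame, [0.0, 0.0, 1.0, 0.0, 0.0])
for _ in range(3):
    frame = feedback_step(frame, [0.0] * 5)  # no new input: trail decays
```

In TouchDesigner the same idea is usually built with a Feedback TOP routed back into a compositing chain; transforming or blurring the fed-back frame each pass is what turns simple trails into flowing, ink-like forms.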