"Like I was staring into space in a cave that was also a cathedral of light whilst in a submarine" - audience comment
Photo Credit: Paul Mobray
Photo Credit: Robert McFadzean
Visual artist / designer Jonny Knox and audio composer / performer Darien Britto premiered 'Remote Sense', a realtime fulldome performance, at Glasgow Science Centre Planetarium for Cryptic and the FDUK Biennale in November 2016. The visual work is an investigation of Lidar: point cloud data captured through laser scanners (remote sensing), then processed, animated and rendered in realtime in TouchDesigner.
Journeying through an Irish woodland, Hang Boi cave in Vietnam, the Scottish brutalist masterpiece St. Peter's Seminary, Beauvais Cathedral in France, a US industrial building and a 1950s Norwegian cinema, the work fluctuates continuously between the recognised and the abstract, with the audience's perception left to fill in the 'spaces in-between'.
Taking inspiration from prehistoric artists who painted in 360 degrees, the work follows the shamanic/artistic process of entoptic phenomena: altered-state, universal visual experiences linked to the central nervous system, which are believed to have inspired the mysterious yet familiar geometric and abstract art of the Neolithic.
"The process of blurring reality to abstraction is really interesting to me," abstract expressionist Arshile Gorky said; "abstraction allows man to see with his mind what his eyes can't see". We understand the world to be made up of atoms; we know this to be true, but can't see it ... or can we? Entoptic phenomena appear to be a process of breaking down visual reality to its raw, primordial state, so what better way to express this than through Lidar data which, by nature, senses the world atom by atom.
Photo Credit: Paul Mobray
Derivative: Can you briefly explain what Lidar data is?
Lidar is a means of accurately surveying objects, buildings and landscapes with laser scanners, which rapidly pulse laser light at a surface and record the time each pulse takes to return; each point is then saved as a position in 3D space. A panoramic camera sits above the sensor and maps colour data to each individual point. Modern scanners capture hundreds of thousands of points per second. We're going to see a lot more of laser scanning in the future as it becomes commoditised; current use is mainly in the spatial and heritage industries.
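The time-of-flight principle behind each scanned point can be sketched with a quick calculation (the pulse time below is a hypothetical example, not data from any specific scanner):

```python
import math

# Time-of-flight ranging: a laser pulse travels to the surface and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the surface for a given round-trip pulse time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 200 nanoseconds indicates a surface ~30 m away.
d = tof_distance(200e-9)
print(round(d, 2))  # ~29.98 m
```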
Derivative: What is your attraction to this medium and how did it all begin?
Jonny Knox: In 2009 I first became aware of laser scanning through some of the ground-breaking heritage work the 'Scottish Ten' were producing - a 3D model, yes, but untouched by human hands; we are seeing a digital replica, a simulacrum. I was fascinated, not by the still images of the data, but by the potential of what it could become. I wanted to be inside the model, with freedom of movement and the 'experience' of being there, and finally to manipulate the data to bring artistic presence to what is ultimately a highly objective medium.
Remote Sense Network
Laser scanning is still exclusive, of course: the very expensive hardware and proprietary software are barriers to entry. There are ways around this, though; here are mine:
- Have friends and contacts who are kind and willing to share data!
- Open scan data in the awesome and free 'Cloud Compare'
- Decimate the data to something your system can manage (in my case, 10 million points)
- Export the data to ASCII text format
- I then built a simple TouchDesigner patch which formats the X, Y, Z and R, G, B columns as a table and writes it via a File Out DAT
- Then in a fresh network the DAT is converted to 2 CHOPs (one for point positions X, Y and Z, one for colour R, G and B)
- A GLSL TOP (with some support from Derivative) converts point positions to pixels, with a resolution derived from the number of points. Another GLSL TOP does the same with the colour data.
- Geometry is then instanced with position and colour values and rendered in fisheye mode.
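Outside TouchDesigner, the parsing, decimation and channel-splitting steps above can be sketched in plain Python (the `load_points` helper and the sample lines are illustrative only, not the actual patch):

```python
# A plain-Python sketch of the pipeline: parse ASCII-exported scan data,
# decimate by a fixed stride, and split it into position and colour
# channels, analogous to the two CHOPs in the TouchDesigner network.

def load_points(ascii_lines, stride=1):
    """Parse 'x y z r g b' lines, keeping every `stride`-th point."""
    positions, colours = [], []
    for i, line in enumerate(ascii_lines):
        if i % stride:
            continue  # simple decimation: skip points between strides
        x, y, z, r, g, b = map(float, line.split())
        positions.append((x, y, z))
        colours.append((r, g, b))
    return positions, colours

# Example: three scanned points, keeping every second one.
data = ["0 0 0 255 0 0", "1 2 3 0 255 0", "4 5 6 0 0 255"]
pos, col = load_points(data, stride=2)
print(pos)  # [(0.0, 0.0, 0.0), (4.0, 5.0, 6.0)]
```

Every-nth-point decimation is the crudest option; CloudCompare also offers spatial subsampling, which keeps point density more uniform.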
Once the basic system is in place, the fun begins! There are many ways to go about this, such as blending between datasets, compositing multi-layer effects, animated noise and ramps etc. I've spent many hours on this part, teasing out interesting variations or entirely new structures. I think the key is to let the data tell its own story and follow its own path; even when obliterating a scan beyond recognition, I hope there remains a form of resemblance on a deeper level. This is what Remote Sense is all about.
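The 'blending between datasets' idea can be sketched as a per-point linear interpolation between two scans (a toy illustration assuming both clouds share the same point count and ordering; `blend_clouds` is a hypothetical helper, not part of the actual network):

```python
def blend_clouds(a, b, t):
    """Linearly interpolate two equal-length point lists.
    t=0 gives cloud `a`, t=1 gives cloud `b`."""
    return [tuple(pa + t * (pb - pa) for pa, pb in zip(p, q))
            for p, q in zip(a, b)]

# Two tiny stand-in 'scans', blended halfway.
cave = [(0.0, 0.0, 0.0), (2.0, 2.0, 2.0)]
cathedral = [(4.0, 0.0, 0.0), (2.0, 6.0, 2.0)]
print(blend_clouds(cave, cathedral, 0.5))  # [(2.0, 0.0, 0.0), (2.0, 4.0, 2.0)]
```

In practice the same interpolation would run per-pixel on the position textures in a GLSL TOP, with `t` animated over time.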
The show runs entirely in realtime; however, I couldn't completely get away from my CG background. There's a pre-programmed, keyframed skeletal animation on the timeline, on top of which I use a MIDI controller to animate positions and materials live. I used a Space Mouse both to pre-visualise camera positions and to navigate in realtime. Rendering is through the 'fisheye' mode which, as a vertex shader, works beautifully with many millions of points.
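A common fulldome mapping is the equidistant ('angular') fisheye, where the angle from the view axis maps linearly to radius on the dome image. A minimal sketch of that projection for a single point (an illustration of the general idea, not TouchDesigner's exact shader) looks like this:

```python
import math

def fisheye_project(x, y, z):
    """Equidistant ('angular') fisheye: map a direction in camera space
    (camera looking down -z) to a point in the unit disc. The angle from
    the view axis maps linearly to radius on the image."""
    r3 = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(-z / r3)       # angle from the optical axis
    radius = theta / (math.pi / 2)   # a 90-degree angle reaches the disc edge
    phi = math.atan2(y, x)           # azimuth around the axis
    return radius * math.cos(phi), radius * math.sin(phi)

# A point straight ahead lands at the centre of the dome image.
print(fisheye_project(0.0, 0.0, -1.0))  # (0.0, 0.0)
```

In the realtime version this runs per-vertex in the shader, which is why it scales comfortably to many millions of instanced points.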
I've uploaded a simple .toe to get people started with this exciting medium, please share your results, I'm still learning too! The scan data is courtesy of my good friend John Meneely aka 1manscan.
Derivative: Thanks very much Jonny! Last question: what's next?
Jonny Knox: I've a fairly eclectic background: starting out in music and DJing, I somehow moved into architecture with a focus on CG art, and eventually fused these disciplines to develop immersive projection domes for shared VR experiences. I now head Creative Technology at Soluis, designing immersion suites around the world, with TouchDesigner playing a big part in this work on various levels.
My work with Remote Sense is still ongoing, and we've a number of requests to perform in 2017 and 2018! I'm considering bringing LEDs into the equation, and this is the beauty of TouchDesigner: the ability to input/output a vast array of technologies quickly. I'm still just as in love with the dome as a medium as I was years ago; there's a real magic that can happen there, and it requires a different, more holistic mindset to work with - one, I believe, which is still far from maturity. There's no place like dome!
Follow Jonny Knox on Vimeo