  Apr.10.18 D-Brane, an Augmented Reality Performance

New media artist Harvey Moon, "creator of tools and machines", wields outright wizardry using the HTC Vive for motion tracking in this performance, which is the "culmination of dance, real time projection mapping and live audio synthesis". The story of a creation outgrowing its creator, D-Brane sees a dancer wearing a tracking device interacting with a moving, suspended dodecahedron that is also tracked and projected upon. It's as uncanny as it is beautiful, and it has us glued to the screen hitting the replay button over and over again.

A software system was built specifically for the performance, allowing the dancer to have direct gestural interactions with the visuals and audio in real time. An analog, patch-based audio synthesis system sonifies the motions of the dancer using the live tracking data. All real-time graphics, tracking and projection were realised in TouchDesigner. Big thanks to Mr. Moon for shedding some light on how all this ethereality was achieved!


Harvey Moon: We built a pipeline in TouchDesigner that connected multiple tracked objects. Three HTC Vive trackers were used: one attached to the sculpture, another to the dancer and the third to the camera. A custom calibration technique was built with OpenCV in TouchDesigner 099 that used the trackers as reference points and allowed us to calibrate the projectors to the Vive coordinate space.
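The production calibration ran on OpenCV inside TouchDesigner; as a rough illustration of the underlying idea (solving for the rigid transform that maps tracker reference points from one coordinate space into another), here is a minimal Kabsch-algorithm sketch in plain NumPy. All names here are ours, not from the actual system:

```python
import numpy as np

def kabsch(src, dst):
    """Find rotation R and translation t so that R @ src_i + t ~ dst_i.

    src, dst: (N, 3) arrays of corresponding reference points, e.g. Vive
    tracker positions (src) and the same points measured in another
    coordinate space (dst).
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # Center both point clouds, then solve for the best-fit rotation via SVD.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Three non-collinear tracker positions are enough to pin down such a transform, which matches the three-tracker setup described above; the real projector calibration additionally has to solve for lens intrinsics, which is where OpenCV's camera-calibration machinery comes in.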

A dynamic render pipeline built in TouchDesigner allowed us to specify presets for different materials and effects that could be toggled instantaneously. Each scene element was organized in its own base; an OP Find DAT allowed the system to discover and list all of the potential scene elements. A user interface was designed around easily connecting and disconnecting the different scene elements, geometries, lights and cameras while on site. A Node package was built to drive all the custom parameters and to add or disable any scene element from a tablet web interface. The real-time data from the dancer and sculpture was put through different gesture analyses before being output to an analog synthesizer that translated the data from TouchDesigner CHOPs into sound.

Controller UI showing the history paths of the dancer's trackers. The right side of the UI lists all of the inactive possible scene elements; the left side lists all of the active ones.
All possible scene elements are automatically discovered inside these bases.
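In the real system an OP Find DAT does the discovery over base COMPs; outside of Touch, the same discover-and-toggle pattern behind that UI can be sketched with a plain Python registry. This is purely illustrative; none of these names come from the project:

```python
class SceneRegistry:
    """Illustrative stand-in for discovering scene-element bases and
    toggling them active/inactive from a controller UI."""

    def __init__(self):
        self.elements = {}   # name -> parameter dict for that element
        self.active = set()  # names currently connected into the render

    def discover(self, names):
        # Stands in for an OP Find DAT scan: register every base found,
        # keeping any parameters already stored for known elements.
        for name in names:
            self.elements.setdefault(name, {"visible": False})

    def toggle(self, name):
        # Connect or disconnect one element, as the on-site UI would.
        if name in self.active:
            self.active.discard(name)
            self.elements[name]["visible"] = False
        else:
            self.active.add(name)
            self.elements[name]["visible"] = True
        return self.elements[name]["visible"]
```

Keeping discovery separate from activation is what lets new geometries, lights or cameras be dropped in on site and show up in the interface without any rewiring, which seems to be the point of the base-per-element layout described above.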

All of the real-time visuals were produced with TouchDesigner. A tracker on the camera allowed us to do dynamic fixed-perspective effects. Motions of the dancer, such as speed, pauses, distance and rotation, were used to alter parameters of the visual effects during the performance.
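As a rough sketch of that kind of gesture analysis (the actual system ran on live CHOP channels; the stillness threshold below is our assumption, not a value from the performance), features like speed, pause and distance travelled can be derived from a window of tracker position samples:

```python
import numpy as np

def gesture_features(positions, dt, still_speed=0.05):
    """Derive simple motion features from tracker samples.

    positions: (N, 3) array of tracker positions in metres
    dt: sample interval in seconds
    still_speed: assumed threshold (m/s) below which the dancer
                 counts as paused
    """
    velocity = np.diff(positions, axis=0) / dt      # (N-1, 3) m/s
    speed = np.linalg.norm(velocity, axis=1)        # scalar speed per step
    paused = speed < still_speed                    # pause detection
    travelled = speed.sum() * dt                    # total path length, m
    return speed, paused, travelled
```

Each feature could then be normalized and mapped onto a synthesis or shader parameter, in the spirit of the tracking-data sonification described earlier.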

This piece was a close collaboration between three different art forms, and our process was extremely dynamic. TouchDesigner was pivotal in building dynamic pipelines that allowed for quick experimentation without needing to compile. The instant pace of collaborative exploration was only possible using such a dynamic tool as Touch.

Credits:
Performer: Kathryn Florez fullstopdance.weebly.com/
Director: Harvey Moon HarveyMoon.com
Story and Art Direction: Qianqian Ye qianqian-ye.com/
Sound Artist: Cullen Miller pointlinesurface.com/
Technical Support: Colin Parsons robot.yoga/

New media artist Harvey Moon works with emerging technologies, finding new and creative ways of connecting people to the world around us. Using electronics, mechanics and software, his works straddle mediums while opening new insights into our connection to art and technology. He received his BFA from the School of the Art Institute of Chicago.

Follow Harvey:

Harvey-Moon.com
Instagram
Vimeo
