Company Post

Walking on the Moon with Volvox Labs

In celebration of the 50th anniversary of the Apollo 11 lunar landing, Volvox Labs [VVOX] collaborated with Ogilvy to create the IBM Moon Walk experience exhibited at the iconic “Oculus”, the World Trade Center Transportation Hub, in NYC. Just as IBM helped NASA get to the moon in 1969 by pushing the limits of technology, Volvox Labs did the same by bringing that historic moon walk experience to a new generation of space enthusiasts in a family-friendly, interactive experience from July 18 to 20, 2019.

Using TouchDesigner and wrnchAI, the VVOX team built a responsive and agile software solution to achieve this remarkable, gravity-defying experience. More than 4000 visitors stepped into the IBM Moon Walk experience and were able to see themselves as astronauts next to the Eagle lander on a massive horizontal LED wall. If that wasn't enough, each person’s physical footprints were transformed into digital moon-boot prints on an immersive 24’ x 14’ LED floor. Visitors waved, jumped, danced and many practiced their acrobatic skills, but all were able to walk on and authentically experience the moon's surface dynamics, and, as a bonus, receive a souvenir digital postcard to share online! We stand in awe and had some questions for the team...

Derivative: Simulating the experience of walking on the moon as viscerally as your team did is no mean feat! What was the design brief and what challenges were immediately identifiable?

VVOX: The goal was to develop a photo-realistic, real-time rendering system combined with robust body-tracking within this unique and busy location. The Oculus attracts well over 120 million visitors each year, many of them just passing through on their way to somewhere else. This constant influx of people, coupled with the sun-filled ambient light, was a challenge our team had to solve in order to make the experience as realistic as possible.

From the beginning, the pipeline presented a solid set of challenges that were both exciting and complex to solve. Our team of developers went through multiple iterations to find the best solution for achieving the lunar surface dynamics. With the help of TouchDesigner, we were able to build a responsive and agile software solution to achieve this goal.

Derivative: How did you achieve such responsive tracking in that very noisy and uncontrollable environment?

VVOX: First off, we needed to find a robust body-tracking solution that would not only allow us to track groups of people but would also give us control over the interaction area. Our initial inclination was to use the Microsoft Kinect or Intel RealSense, since we’ve used those in the past for similar purposes. However, the venue we were deploying in had a ton of sunlight shining onto our interaction space, which we knew would degrade the infrared light used by the depth cameras. Furthermore, the movement of the sun throughout the day and the fluctuating light in the venue would likely add more instability. The first tests with off-the-shelf devices proved the tracking would not meet our target for “realism”. To achieve a more realistic avatar for visitors, the tracking would need more precise and stable inverse kinematics (IK).

The initial prototypes led us to investigate machine learning models for pose estimation. We tried a few of the options we found for OpenCV but eventually settled on an SDK created by wrnchAI. This product allowed us to quickly grab skeletal positions from any RGB camera. The new workflow opened up a lot of possibilities and integrations. One big win with wrnchAI was that it already had a template for integrating the SDK within Unreal Engine. This meant we were able to quickly get a demo going and start working on our main rendering and content pipeline. We had to ingest wrnchAI's ML (machine learning) workflow and integrate it into our CV, TouchDesigner, and Unreal Engine pipeline.

 

Derivative: And how did you simulate the moon's gravity in real-time interactions with the visitors?

VVOX: After conducting tests, our team decided that wrnchAI and its tracking ability were exactly what we needed to achieve the desired results. However, the out-of-the-box SDK still lacked the amount of control we needed to make the body tracking perfect. We needed to get our hands on the raw data streams coming from the ML model and further filter them for the 1/6th-gravity feel of the moon. It was at this key moment we knew we could rely on TouchDesigner to manipulate this amount of data while providing real-time feedback.

We created a custom Python application that would grab all of the skeletal data from the wrnchAI SDK and bring it into TouchDesigner. [Note that there is now a native wrnchAI CHOP available in the 2020.40000 series of builds.] The system filtered and augmented that information as needed for the experience. TouchDesigner was an integral part of this pipeline as we needed to create a low-gravity feel, smooth the incoming data, and create a player ID tracker. All of these tasks were tricky but achievable thanks to the data manipulation tools available in TouchDesigner. Once all this skeletal data was treated, it was piped back into Unreal Engine as JSON for astronaut puppetry.
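Purely as an illustration (this is not VVOX's actual code), a minimal sketch of such a bridge might look like the following: a stub stands in for the wrnchAI SDK, incoming joint positions are exponentially smoothed with the vertical motion made more sluggish for a low-gravity feel, and the treated skeleton is sent onward as JSON over UDP, the way the data was handed to Unreal Engine. The function names, ports, and smoothing values are assumptions, not details from the project.

```python
# Hypothetical sketch of a skeletal-data bridge: wrnchAI stub -> smoothing /
# low-gravity filter -> JSON over UDP. The 'poll_keypoints' stub, addresses
# and constants are illustrative assumptions, not the actual VVOX pipeline.
import json
import math
import socket
import time

UE_ADDR = ("127.0.0.1", 7000)   # assumed Unreal Engine listener
SMOOTH = 0.25                   # exponential-smoothing factor
MOON_G = 1.0 / 6.0              # extra damping applied to vertical motion

def poll_keypoints():
    """Stand-in for the wrnchAI SDK: returns {player_id: {joint: (x, y)}}."""
    t = time.time()
    return {0: {"hip": (0.5, 0.5 + 0.1 * math.sin(t)),
                "head": (0.5, 0.3 + 0.1 * math.sin(t))}}

def main():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    smoothed = {}   # {player_id: {joint: (x, y)}}
    while True:
        for pid, joints in poll_keypoints().items():
            player = smoothed.setdefault(pid, {})
            out = {}
            for joint, (x, y) in joints.items():
                px, py = player.get(joint, (x, y))
                # Smooth horizontal motion normally; make vertical motion
                # more sluggish to fake a floaty, 1/6-g feel.
                nx = px + SMOOTH * (x - px)
                ny = py + SMOOTH * MOON_G * (y - py)
                player[joint] = (nx, ny)
                out[joint] = {"x": nx, "y": ny}
            sock.sendto(json.dumps({"id": pid, "joints": out}).encode(), UE_ADDR)
        time.sleep(1 / 60)   # roughly frame-rate paced

if __name__ == "__main__":
    main()
```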

Derivative: Getting the impression of moon boot footprints on the floor was a fantastic touch. Can you explain how you achieved this?

VVOX: The IBM Moon Walk experience consisted of a massive LED wall displaying the ‘mirror to the moon’ and an even bigger sensor-equipped LED floor. The vision for the floor was for participants to leave their lunar marks, just like Armstrong did 50 years ago. The tricky programmatic part was figuring out the rotation and size of an object (i.e. a foot) covering the sensors. The team set out to reverse engineer the interaction from the floor and compensate for it using OpenCV. The floor was effectively a massive mouse pad, with a 64x64 sensor grid in each tile; its data was ingested into TouchDesigner and rendered as a texture.

Using TouchDesigner, we created a custom CV pipeline that allowed us to analyze the hull of an object in a texture and calculate its rotation and UV coordinates. Once the rotation was retrieved, the position and rotation values were sent to another instance of Unreal Engine, where that data was used to spawn a footprint with particles at each location someone stepped. This was awesome because wherever you stepped on the floor, the Neil Armstrong lunar footprint would appear under your feet with the correct orientation and Moon-boot size!
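For readers curious how that kind of analysis can work, here is a hedged sketch (not the production pipeline) of taking a pressure texture like the floor's 64x64 sensor grid, finding the contour of whatever is covering the sensors, and using OpenCV's minAreaRect to recover its center in UV space, its rotation, and its approximate size. The threshold value and grid size are assumptions.

```python
# Illustrative sketch: recover footprint position (UV), rotation and size
# from a 64x64 pressure "texture" using OpenCV. Threshold and grid size are
# assumptions; this is not VVOX's production code. Assumes OpenCV 4.x.
import cv2
import numpy as np

GRID = 64            # assumed sensors per tile edge
THRESHOLD = 32       # assumed pressure cutoff (0-255)

def analyze_pressure(frame: np.ndarray):
    """frame: (GRID, GRID) uint8 pressure image. Returns a list of detections."""
    _, mask = cv2.threshold(frame, THRESHOLD, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 4:            # ignore sensor noise
            continue
        (cx, cy), (w, h), angle = cv2.minAreaRect(cnt)
        detections.append({
            "u": cx / GRID,                     # normalized UV position
            "v": cy / GRID,
            "rotation_deg": angle,              # orientation of the bounding box
            "size": max(w, h) / GRID,           # rough boot length, 0..1
        })
    return detections

if __name__ == "__main__":
    # Synthetic test: a tilted rectangular "foot" pressed into the grid.
    img = np.zeros((GRID, GRID), dtype=np.uint8)
    pts = np.array([[20, 10], [30, 14], [24, 40], [14, 36]], dtype=np.int32)
    cv2.fillPoly(img, [pts], 255)
    print(analyze_pressure(img))
```

The detections could then be sent on to Unreal Engine, much as the interview describes, to spawn a correctly oriented and scaled footprint per step.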

 

Derivative: The postcard must have made visitors very happy!

VVOX: Well, what would a trip to the moon be without a postcard to send home?! TouchDesigner was used to create a custom UI for each participant to send themselves a postcard from their trip to the moon. The backend communicated with APIs for social media sharing and cloud storage for each piece of content.
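The interview doesn't name the specific services, so purely as an illustration, a backend hand-off like that might look something like the sketch below: the rendered postcard image is uploaded to some cloud-storage endpoint and the returned URL is passed along to a sharing step. The endpoint, token, and response fields are all hypothetical placeholders.

```python
# Hypothetical postcard hand-off: upload a rendered image, get back a share
# URL. The endpoint, auth token and response shape are invented placeholders.
import requests

UPLOAD_URL = "https://storage.example.com/upload"   # placeholder endpoint
API_TOKEN = "REPLACE_ME"                             # placeholder credential

def upload_postcard(path: str) -> str:
    """Upload the postcard image and return a shareable URL."""
    with open(path, "rb") as f:
        resp = requests.post(
            UPLOAD_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"file": ("postcard.png", f, "image/png")},
            timeout=10,
        )
    resp.raise_for_status()
    return resp.json()["url"]                        # assumed response field

if __name__ == "__main__":
    print(upload_postcard("postcard.png"))
```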

Derivative: So the entire experience was automated, with no technicians needed to run the system?

VVOX: That's correct. To begin the experience, brand ambassadors pushed a physical button on the side of the LED wall. This physical interaction communicated with TouchDesigner via a microcontroller and started the run-of-show. Two instances of Unreal Engine began a timeline for the photo moment, an audio timeline began playing, and the captured output of the LED screen was sent to the postcard kiosks. With TouchDesigner, we were able to roll all of these elements into one and create a seamless experience.
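As a rough sketch of that kind of button-to-show-control link (the port name, baud rate, and "START" message are assumptions, and the actual project handled the serial input inside TouchDesigner), a microcontroller button press arriving over serial could kick off the run-of-show something like this:

```python
# Illustrative button listener: a microcontroller sends a line over serial
# when the physical button is pressed, and we fire the run-of-show cue.
# Port, baud rate and the "START" message are assumptions.
import serial   # pyserial

PORT = "COM3"
BAUD = 9600

def start_run_of_show():
    # Placeholder: in the real system this step would trigger the Unreal
    # Engine timelines, the audio timeline and the postcard-kiosk capture.
    print("Run-of-show started")

def main():
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            if line == "START":
                start_run_of_show()

if __name__ == "__main__":
    main()
```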

Ultimately, we had approximately 4000 participants over the 3 event days at the Oculus. The system ran non-stop, allowing parents and kids of all ages to jump in 1/6th G and levitate like Yoda (our secret feature, discovered on the first day :) ). TouchDesigner again proved absolutely vital for developing stable and reliable projects involving high stakes and even higher foot traffic.

 

Credits

Agency: Ogilvy USA

Creative Direction: Kamil Nawratil

TouchDesigner Developers: Javier Cruz, Keith Lostracco

UE Developer: Mark McCallum

Motion Graphics: Eddy Nieto

CG Artist: Steven Baltay

Cinematography: Why Not Labs

 

Follow VVOX: Web | Instagram | Facebook, and read more about the wonder-team's fearless design and engineering feats on the Derivative Showcase: THE AWARD-WINNING TESSELLATED TETRA SCULPTURE FROM VVOX, VOLVOX LABS ANIMATES DTLA MICROSOFT THEATRE LOUNGE, VOLVOX LAB'S SUPER-CONNECTED HIGHLY-CONFIGURABLE DD OUTPOST, VOLVOX LAB'S DYNAMICAL SYSTEMS AND THE PERCEPTION OF CONSEQUENCE, and INFINITY, HYPER THREAD & ORIGINS | THE LAB AT PANORAMA NYC.

 
