Company Post

The Future of Music | Making of a 360 Film

"THE FUTURE OF MUSIC" is a ground-breaking 360° virtual experience made by director Greg Barth and Phenomena Labs that immerses the viewer in a surrealist musical space.

The point of view is that of the "interviewer" in a behind-the-scenes mockumentary with Carré Bleu, a revolutionary music inventor who is in the midst of presenting his unique production methods for recording an instrumental track.

The very amusing and clever part is that each of these instruments is an actual human turned into a gravity-defying "GIF-strument". The Kick is a character in a checkerboard latex suit with spikes on his derrière, falling back onto confetti-filled balloons - twice! The Synth is a wacky combination of vacuum cleaner and toilet rolls; the Snare has the chief sound engineer being pushed through a colorful sheet of real sugar glass, and so on…

Accurate techniques and measurements were critical to realizing authentic results, so the team developed an in-house 3D pre-visualization of the room, into which they placed the virtual camera and proxy 3D actors to gauge room dimensions, light positions, and the size of props relative to an actor's height.

Phenomena Labs developed a real-time live stitcher in TouchDesigner using the lens distortion matrix from PTGui. Live-stitched video was composited with recorded stitched video to create the 360° environment. This custom setup enabled the director to visualize previous takes on top of what was currently being filmed, using TouchDesigner and the Oculus Rift.
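The core idea of overlaying a previous take on the live feed can be sketched as a simple opacity blend of two equirectangular frames. This is a minimal NumPy stand-in for the compositing done inside the TouchDesigner network, not the production code; the function name and frame sizes are illustrative:

```python
import numpy as np

def overlay_previous_take(live_frame, recorded_frame, opacity=0.5):
    """Blend a recorded equirectangular take over the live-stitched feed
    so a new performance can be lined up against a previous one."""
    live = live_frame.astype(np.float32)
    rec = recorded_frame.astype(np.float32)
    out = (1.0 - opacity) * live + opacity * rec
    return out.astype(np.uint8)

# Two dummy 360° frames (equirectangular, 2:1 aspect ratio).
live = np.zeros((960, 1920, 3), dtype=np.uint8)
prev = np.full((960, 1920, 3), 200, dtype=np.uint8)
mix = overlay_previous_take(live, prev, opacity=0.5)
```

At 50% opacity the director sees both performances as equally weighted "ghosts", which is enough to judge spatial alignment and timing against the earlier take.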

Director Greg Barth was able to get the most precise performance from the actors by seeing the results immediately on-set with the Oculus DK2 headset. This filming technique was also valuable in post-production for looping each individual actor's musical timing while the director looked around the room.
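Looping each actor independently on their own musical timing amounts to wrapping a global timeline position back into that actor's cue range. A minimal sketch of that index arithmetic (the cue numbers here are hypothetical, not taken from the production):

```python
def loop_frame(global_frame, cue_in, cue_out):
    """Map a global timeline frame into a take that loops
    between its in and out cue points."""
    loop_len = cue_out - cue_in
    return cue_in + (global_frame - cue_in) % loop_len

# An actor's take cued to loop over frames 10..13 inclusive:
frames = [loop_frame(f, 10, 14) for f in range(10, 18)]
```

Because each actor gets their own cue pair, every "GIF-strument" can repeat on its own musical period while the rest of the room keeps playing.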

Prior to the shoot, a custom TouchDesigner setup was made by Ronen Taschen of Phenomena Labs to handle real-time stitching and compositing, where different takes could be overlaid on top of each other, allowing the team to get the right timing and performances from the actors.

The gear was selected to enable the team to run TouchDesigner on a single laptop with only two camera inputs. The action was captured with a custom camera rig consisting of two rib-caged GoPros with Entaniya fisheye lenses, feeding video into TouchDesigner via two Blackmagic HDMI capture cards.

Live stitching based on PTGui was calibrated at the start of the shoot. Phenomena Labs added the ability to rotate the equirectangular frame around all three axes and to compose shots with preview video files captured from TouchDesigner in HAP format. Garbage masks were drawn live in After Effects and fed into TouchDesigner via Spout on Windows (Siphon being the macOS equivalent).
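Rotating an equirectangular frame around all three axes comes down to remapping each output pixel: longitude/latitude to a 3D direction, rotate that direction, then back to longitude/latitude in the source. A minimal nearest-neighbour sketch of that remap in NumPy (the real-time version would run on the GPU; axis conventions here are one common choice, not necessarily the production's):

```python
import numpy as np

def rotate_equirect(img, yaw, pitch, roll):
    """Rotate an equirectangular frame around three axes by remapping
    each output pixel through a 3D rotation (nearest-neighbour lookup)."""
    h, w = img.shape[:2]
    # Output pixel grid -> longitude/latitude.
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Longitude/latitude -> unit direction vectors.
    v = np.stack([np.cos(lat) * np.sin(lon),
                  np.sin(lat),
                  np.cos(lat) * np.cos(lon)], axis=-1)
    # Rotation matrix: yaw about y, pitch about x, roll about z.
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    R = Ry @ Rx @ Rz
    # Rotate each view direction, then map back to source lon/lat.
    v = v @ R.T
    src_lon = np.arctan2(v[..., 0], v[..., 2])
    src_lat = np.arcsin(np.clip(v[..., 1], -1.0, 1.0))
    # Source lon/lat -> source pixel coordinates.
    sx = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    sy_px = ((np.pi / 2 - src_lat) / np.pi * h).astype(int).clip(0, h - 1)
    return img[sy_px, sx]
```

With all three angles at zero the remap is the identity; non-zero yaw pans the panorama horizontally, which is what lets a stitched shot be re-centered on any actor after the fact.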

The TouchDesigner setup had a simple custom control-panel user interface, which Ronen operated while simultaneously running and modifying the TouchDesigner networks.

To enable Barth to direct and synchronize the actors' performances as if they had all been filmed at the same time, the Movie File In TOP was used to scrub, cue and play multiple takes of each actor's shots. An optimized GLSL shader TOP mixed the multiple sources.
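Mixing several takes in a shader boils down to each take contributing only where its garbage mask is set, with the masked regions summed into one frame. A minimal NumPy stand-in for that GLSL-style composite (function name and disjoint-mask assumption are illustrative, not the production shader):

```python
import numpy as np

def mix_takes(takes, masks):
    """Composite several takes into one frame: each take contributes
    only where its garbage mask is 1 (branchless, GLSL-style sum)."""
    out = np.zeros_like(takes[0], dtype=np.float32)
    for take, mask in zip(takes, masks):
        out += take.astype(np.float32) * mask[..., None]
    return out.clip(0, 255).astype(np.uint8)

# Two tiny takes with disjoint masks: take 1 owns the left column,
# take 2 owns the right column.
t1 = np.full((2, 2, 3), 100, dtype=np.uint8)
t2 = np.full((2, 2, 3), 50, dtype=np.uint8)
m1 = np.array([[1, 0], [1, 0]], dtype=np.float32)
m2 = np.array([[0, 1], [0, 1]], dtype=np.float32)
frame = mix_takes([t1, t2], [m1, m2])
```

On the GPU this per-pixel multiply-accumulate is trivially parallel, which is why a single shader pass could keep many simultaneous takes playing in real time for the headset.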

In addition to driving the Oculus Rift, TouchDesigner broadcast to an iPad for on-set monitoring.