Company Post


Kinetisphere, an interactive installation designed and built by San Francisco-based creative engineering studio Bot & Dolly to celebrate the launch of Google’s Nexus Q, seemed to steal the show at the recent Google I/O 2012.

Based on what we've seen, it certainly is memorable: an industrial Kuka robot deftly articulating an 8-foot-diameter, 300-pound fiberglass replica of the Nexus Q, with a visualizer ring of 6 mm-pitch LEDs driven by a signal from TouchDesigner that reacts to the music.
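At its core, audio reactivity like this comes down to turning an audio level into a brightness value every frame. Here is a minimal sketch of that idea in plain Python; the envelope-follower coefficients and the 8-bit brightness range are illustrative assumptions, not details from the installation:

```python
import math

def rms(samples):
    """Root-mean-square level of one audio buffer (0..1 for normalized audio)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def follow(previous, level, attack=0.5, release=0.1):
    """Envelope follower: rise quickly on peaks, fall back slowly after them."""
    coeff = attack if level > previous else release
    return previous + coeff * (level - previous)

def to_brightness(envelope):
    """Map a 0..1 envelope onto an 8-bit LED brightness value."""
    return min(255, max(0, int(envelope * 255)))

# Feed successive audio buffers through the follower, one per frame.
env = 0.0
for buf in ([0.0] * 64, [0.5] * 64, [0.1] * 64):
    env = follow(env, rms(buf))
    brightness = to_brightness(env)
```

In a real TouchDesigner network this per-frame mapping would typically live in CHOPs rather than hand-written loops, but the signal flow is the same: audio level in, smoothed envelope out, brightness to the LEDs.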

Bot & Dolly CEO Jeff Linnell explains that the Kinetisphere is intended to emphasize and evoke the social nature of music: to get the best results from the installation, people have to work together with friends and teammates to make the most interesting mixes.

Linnell is also co-founder of Autofuss, B&D’s sister company, an interdisciplinary design studio focused on motion design, animation, and live action. Linnell first used TouchDesigner when he found it to be the perfect tool with which to drive another Kuka robotic arm in real time, producing an interactive 3D light sculpture made up of over 54,000 points of light as part of a viral campaign for the game Halo: Reach.

Here is a concise description of the Kinetisphere's design and intended use from Bot & Dolly:

Nexus Q is a streaming media device that allows users to play music out loud. The Kinetisphere is designed to elevate and amplify that experience.

It consists of a large Kuka industrial robot wielding a giant sphere representative of the Nexus Q. Users are invited to control the installation via three input stations powered by Android devices: the Nexus Q, the Nexus 7, and the ADK 2012, which provide I/O for the user. The centerpiece of the control console is the Nexus Q, which serves as the primary interface to the robot. Manipulating it articulates the robot, moving it through space as well as mixing the accompanying sound and visuals in real time.

The mesmerizing audio-visual experience is entirely user controlled and, in the spirit of the Nexus Q, invites participation. In order to create the most dramatic motion, sound, and visual experience, multiple users must coordinate their actions. The installation would not have been possible without the experimental aspirations of the Android team and the open spirit of the platform itself.

Phil Reyneri (a TouchDesigner consultant on numerous projects including the Coachella Gateway and Skrillex Cell tours) explains how TouchDesigner was used for the Kinetisphere:

TouchDesigner was used to create the audio-reactive visualizer ring, which also received input from one of the Nexus Q stations, allowing users to modify parameters of the visuals in real time. The visuals would also reflect the user's proximity to a 'hot-spot' goal in 3D space, intensifying as they approached the targets with the robot.
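That proximity cue can be modeled as an intensity that ramps up as the robot's sphere nears a target in 3D space. A hedged sketch of one plausible mapping, in Python; the linear falloff and the hot-spot radius are assumptions for illustration:

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def hotspot_intensity(robot_pos, hotspot_pos, radius=1.0):
    """Visual intensity: 0 outside the radius, ramping linearly to 1 at the hot spot."""
    d = distance(robot_pos, hotspot_pos)
    return max(0.0, 1.0 - d / radius)

# The closer the robot gets, the brighter the visuals.
print(hotspot_intensity((0.5, 0.0, 0.0), (0.0, 0.0, 0.0)))  # halfway in -> 0.5
```

A smoother curve (e.g. squaring the result) would make the intensification feel more dramatic near the target, but the linear version keeps the idea plain.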

Additionally, Tarik Abdel-Gawad of Bot & Dolly "used TouchDesigner to create an augmented reality display showing participants their progress in reaching the 'hot spots.' 3D visuals were created in TouchDesigner that displayed game score and other important stats for users, and were then calibrated to overlay a live camera feed. The AR display was then piped to multiple screens around the installation."

Exciting work that points towards an intriguing future! We look forward to seeing much more from Bot & Dolly; until then, read more at Engadget and watch their interview with CEO Jeff Linnell.