Community Post

Point Cloud Mastery from Think and Sense

Think and Sense produced a remarkable large-scale point cloud exhibit and media control system for an LED wall at the entrance of the new Tokorozawa campus of the media company Kadokawa Corporation near Tokyo. The entire end-to-end process, other than the photogrammetry, was created in TouchDesigner. "Ambience" is the proprietary system developed by Think and Sense in which real-time animation, visualization and image effects can all be edited, scheduled and displayed, uniting multiple functions. The rich visual output changes dynamically based on weather, daylight hours and other seasonal information obtained via a network API, so the complete experience reflects the changing seasons of real-world Sakuratown.
 

Derivative: Can you explain your process?

Shuhei Matsuyama: In the first stage we captured drone footage of the wide Tokorozawa Sakuratown area, and then created point cloud data of the town and its surroundings by photogrammetry from that footage. With that data we visualized over 16 million points in real time using a proprietary point cloud visualization system built in TouchDesigner — a specialty of Think and Sense — that allows the point clouds to be displayed and edited. We pursued the best possible expression for the 6K high-definition LED display, making the “cherry blossom” color palette (referenced in the name Tokorozawa Sakuratown) central to the representation.

Derivative: Can you explain how TouchDesigner is used in the process of creating the work? 

Shuhei Matsuyama: The entire process was done in TouchDesigner except for the photogrammetry. The most significant point of our workflow is that everything is working in real-time.

Traditional video production usually separates the stages of development. This time, at the onset, we created one scene of the whole Sakuratown point cloud that already included post effects. In the next stage we considered story, camera work and visual development. We quickly had high-quality visualizations that were very close to the final look. I simulated several types of movement using only a 3D mouse to control cameras, which meant there was less need for storyboards and movie editing, and minimal need for keyframe editing to animate cameras.

Derivative: Can you please explain the photogrammetry workflow?

Shuhei Matsuyama: We took drone footage of the Tokorozawa Sakuratown area, including the surrounding environment, and from those images created 3D point cloud data by photogrammetry. The imagery was imported into Reality Capture to carry out 3D reconstruction from the multiple data sets, and the resulting point cloud files were then exported to TouchDesigner.

Reality Capture exports .xyz files. We created a converter in TouchDesigner from .xyz (ASCII) to .EXR (binary) files. The EXR files are read efficiently by the Point File In TOP in Ambience.
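The converter itself hasn't been published, but the parse-and-pack stage can be sketched in plain Python with only the standard library (the "x y z r g b" field layout is an assumption; the final EXR write would be done with an EXR library or TouchDesigner's own TOP export, which is omitted here):

```python
import struct

def parse_xyz(text):
    """Parse ASCII .xyz lines (assumed "x y z r g b") into float tuples."""
    points = []
    for line in text.splitlines():
        fields = line.split()
        if len(fields) >= 3:
            points.append(tuple(float(f) for f in fields[:6]))
    return points

def pack_positions(points, width=2048):
    """Pack point positions into 32-bit float RGB pixels, row-major,
    zero-padded to fill complete rows of the target texture."""
    rows = -(-len(points) // width)  # ceiling division
    data = bytearray()
    for i in range(rows * width):
        x, y, z = points[i][:3] if i < len(points) else (0.0, 0.0, 0.0)
        data += struct.pack("<3f", x, y, z)
    return bytes(data), width, rows

sample = "1.0 2.0 3.0 0.5 0.5 0.5\n4.0 5.0 6.0 1.0 0.0 0.0"
pts = parse_xyz(sample)
blob, w, h = pack_positions(pts, width=4)
```

The binary blob is what an EXR writer would then wrap in a float-channel image, which is why reading it back is so much faster than re-parsing ASCII.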

Derivative: How did you segment and organize the full point cloud data sets?

Shuhei Matsuyama: The point cloud data is loaded into TouchDesigner TOPs (where each pixel represents one point's position and color) to be rendered in real time. Since the point cloud data of Tokorozawa Sakuratown and the surrounding area was huge, we divided it into EXR files holding 4 million points (2048 x 2048 pixels) per set. All the points are loaded into memory by Ambience's pre-load function on startup.
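The one-point-per-pixel tiling described above can be illustrated with a small sketch (the function names are hypothetical, not part of Ambience):

```python
TILE_W = TILE_H = 2048
POINTS_PER_TILE = TILE_W * TILE_H  # 4,194,304, i.e. ~4 million points

def tile_of(point_index):
    """Map a global point index to (tile number, pixel x, pixel y)."""
    tile, local = divmod(point_index, POINTS_PER_TILE)
    py, px = divmod(local, TILE_W)
    return tile, px, py

def tiles_needed(total_points):
    """Number of 2048 x 2048 EXR sets needed for a given point count."""
    return -(-total_points // POINTS_PER_TILE)  # ceiling division
```

At this tile size, the 16 million points mentioned earlier fit in four sets, which is consistent with Ambience blending 3 to 5 sets at a time.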

Switching of point sets and transitions between scenes were choreographed in Ambience. Different combinations are turned on, animated and blended. Ambience runs 3 to 5 sets simultaneously which means we had up to 20 million points at 60 frames per second.

Derivative: How are you animating the turbulence and blending the points with the captured data sets? 

Shuhei Matsuyama: We are using custom GLSL shaders, point sprites and some of the new point cloud features of TouchDesigner. Our point cloud system has some proprietary functions focused on point expression: point size randomization, point color gradation based on camera distance, and so on.
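The GLSL itself isn't shown, but the per-point math such a shader might compute — a hashed size jitter and a camera-distance color blend — can be sketched on the CPU in Python (all names and constants here are illustrative, not Think and Sense's actual shader):

```python
import math

def hash01(i):
    """Deterministic pseudo-random value in [0, 1) from a point index,
    mimicking a common one-liner shader hash."""
    return math.modf(math.sin(i * 12.9898) * 43758.5453)[0] % 1.0

def point_size(i, base=2.0, jitter=1.5):
    """Randomized sprite size: a base size plus a per-point offset."""
    return base + jitter * hash01(i)

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def distance_color(point, camera, near_color, far_color, near=0.0, far=100.0):
    """Blend a point's color from near_color to far_color by camera distance."""
    d = math.dist(point, camera)
    t = min(max((d - near) / (far - near), 0.0), 1.0)
    return lerp(near_color, far_color, t)
```

In a real GLSL point-sprite shader the same two computations would set `gl_PointSize` in the vertex stage and the output color per fragment.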

Derivative: What did you do to make the point clouds look as good as possible on the 6K LED display? 

Shuhei Matsuyama: The 6Kx2K LED system is very high-definition and has beautiful color. We tried different rendering styles and in the end we decided to use a simple post effect to enhance each point.

Derivative: What were the main guidelines that you received from the client before and during the project development? 

Shuhei Matsuyama: This 6K high-definition LED was the display product's first installation in Asia. Our client wanted to showcase its high specification and beauty, and point clouds were well suited to that request. The point cloud imagery and color palettes change based on weather, daylight hours and other seasonal information obtained via a network API. Thus, the changing seasons of real-world Sakuratown are depicted in the atmosphere of the visuals.
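The actual API and palette mapping aren't described in the interview, but the idea — season and weather data selecting and tinting a palette — might look something like this hypothetical Python sketch:

```python
from datetime import date

# Hypothetical palettes (RGB); the project's real palettes are not published.
PALETTES = {
    "spring": (1.00, 0.78, 0.86),  # cherry-blossom pink
    "summer": (0.55, 0.85, 0.60),
    "autumn": (0.90, 0.55, 0.25),
    "winter": (0.80, 0.88, 0.95),
}

def season_of(d):
    """Meteorological season for the northern hemisphere."""
    return {12: "winter", 1: "winter", 2: "winter",
            3: "spring", 4: "spring", 5: "spring",
            6: "summer", 7: "summer", 8: "summer",
            9: "autumn", 10: "autumn", 11: "autumn"}[d.month]

def pick_palette(d, cloud_cover):
    """Choose a base palette by season, then desaturate toward grey
    as cloud cover (0..1, as a weather API might report it) increases."""
    r, g, b = PALETTES[season_of(d)]
    grey = (r + g + b) / 3.0
    t = 0.5 * max(0.0, min(1.0, cloud_cover))
    return tuple(c + (grey - c) * t for c in (r, g, b))
```

A scheduler polling the API could feed the result into the rendering chain as color parameters, which is all that is needed for the visuals to track the real-world seasons.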

Derivative: It seems the client wanted something that untrained operators could adjust and schedule. Was that a significant part of the work?

Shuhei Matsuyama: Ambience is a general system we had built for multiple projects to leverage our growing experience. So for this tailor-made system for the Tokorozawa campus we added customized functions to control generative TouchDesigner .tox component files in the same UI and timeline as other media files. The most important requirement for untrained operators is to be able to choreograph media files and generative components just using drag and drop through the Ambience UI.

 

Timeline and Event List

Ambience enables the creation of playlists by dragging and dropping content and adjusting parameters. Beyond video playback, it can also load interactive content and work with video from external inputs. The Event List is used to schedule and time content playback.

 

Photo Cell Mode

In Photo Cell mode layouts with branding and graphics are arranged and can be changed dynamically at photo sessions or during events.

 

Window Designer

Window Designer lays out the presentation images on the displays at the desired locations. It is adaptive to variable resolutions and aspect ratios. The 6Kx2K LED is driven by one PC running Ambience. It supports additional event needs such as multi-screen displays for e-sports, and the system accepts external inputs from video capture hardware.

Derivative: Thank you Shuhei. This shows a practical workflow for getting large-scale point cloud data into a form that can be animated, adjusted and refined for a specific purpose, and then turned over to the client who will manage the operation and extension of the work, with an engaging, beautiful end result. 

 

Credit List

Technical Direction / Movie edit: Shuhei Matsuyama

Point Cloud System Design / Media Control System Design: Takamitsu Masumi

Photogrammetry Engineering: Naoya Takebe

Sound Design: Intercity-Express (Tetsuji Ohno)

Media Control System Design: Yuki Hikita

Media Control System Graphic Design: Yuki Soejima