Community Post

Massively Distributed: Metacomposing

December 1, 2020

MASARY Studios

TouchDesigner is used to aggregate, curate, order, and render user submissions from the Massively Distributed web-app. Within TouchDesigner, we query a local database and sort the results based on various metadata criteria. We implement a sequencer in TD for playback and rendering of audio and video samples, including custom GLSL shader effects. Finally, TD automates the process of rendering compilations of user submissions, including the creation of title + credit animations. The output from this TD-driven pipeline is loaded onto media players and shown publicly through facade projection.

 

MD'S DATABASE

Massively Distributed relies on a database backend for receiving and managing all of the user-created compositions (which are submitted through the web-app). For our database, we're using MongoDB, which uses a document model that is similar to JSON objects to store data. MongoDB is a bit more flexible than a SQL database, and in our case, we needed to retain the ability to change database entry fields as we developed the project.

 

USER SUBMITTED DATA

When someone using MD submits their composition, their device sends the following data to the MD API:

  • Title
  • Author Name
  • Geolocation
  • Composition
  • Composition Length
  • Tempo
  • Mixer State
  • Effects State / Recording
  • Sample Pack References
  • Version
  • Date/Time of Creation

These data fields constitute the user's composition in its entirety. Using this data, it's possible to recreate their composition, including any modified mixer levels and any effects processing.

The Title, Author Name, and Geolocation fields are optional. Title and Author Name are used to credit the composer as they wish when their composition is shown publicly. Geolocation data is currently used only for our own analytics purposes.
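
As a rough illustration, a single submission might be stored as a document along these lines. This is only a sketch: aside from tempo, compLength, and userSetFx (which appear in the query examples below), the key names and values here are our guesses, not the actual MD schema.

# Hypothetical sketch of one submission document. Key names other than
# tempo, compLength, and userSetFx are guesses, not the actual MD schema.
submission = {
    'title': 'Untitled',                  # optional
    'author': 'Anonymous',                # optional
    'geolocation': None,                  # optional
    'tempo': 60,                          # beats per minute
    'compLength': 12,                     # composition length in beats
    'composition': [                      # one row of steps per sample; True = note played
        [True, False, True, False, True, False, True, False, True, False, True, False],
        [False, False, True, False, False, False, True, False, False, False, True, False],
    ],
    'mixerState': [0.8, 0.5, 0.0],        # per-pack volume levels
    'userSetFx': False,                   # no effects were modified
    'createdAt': '2020-11-15T20:31:00Z',  # date/time of creation
}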

ADDITIONAL FIELDS

In addition to these raw values, the MD API calculates a number of additional fields that provide higher-level descriptors of the composition. These are all various kinds of metadata (data about data) that help the curation/meta-composition process.

These additional values include:

  • Mixer Modified
    • (boolean) Indicates whether the user made any changes to the mixer from its default settings
  • Mixer State Averages
    • (Array) This is a set of three numbers (one for each sample pack) that indicate the average volume level for all samples within that pack
  • Track Counts
    • (Array) This is a raw count of the number of notes played from each sample in this composition
  • Pack Counts
    • (Array) This is a raw count of the number of notes played from each sample pack
  • Density
    • (Number) This is a floating point value: the total number of notes played divided by the total number of possible notes (a short sketch of how this and Pack Ratios might be computed follows this list)
  • Pack Ratios
    • (Array) This array has three floating point values (one for each sample pack). These three values sum to 1 and represent the percentage of notes played from each pack against the total number of notes in the composition.
      • I.e., if someone used only samples from pack 1, this would look like [1.0, 0.0, 0.0]; if someone had 10 notes in pack 1 and 30 notes in pack 3, this would be [0.25, 0.0, 0.75]
  • User Set FX
    • (boolean) If the user has modified any of the effects from their default (off) state, this will be true, otherwise false.
  • User Set FX Channels
    • (Array) This array contains four boolean values, one for each effects channel, indicating whether the user has modified that effect from its default state.
  • Color Scores
    • (Array) Certain samples were selected as being particularly Blue, Orange, or Green. Each color score is calculated as the number of notes within that color group divided by the total number of notes in the composition.
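
As a sketch of how a couple of these fields might be derived (the variable names and the mapping of samples to packs below are ours, not the actual MD API code), assuming the composition is stored as a grid of True/False steps:

# Hedged sketch: compute Density and Pack Ratios from a composition grid.
# `composition` holds one row of steps per sample; `pack_rows` says which
# rows belong to each of the three sample packs (all names are illustrative).
composition = [
    [True, False, True, False],    # sample 0 (pack 1)
    [False, False, False, False],  # sample 1 (pack 1)
    [True, True, False, False],    # sample 2 (pack 2)
    [False, False, False, True],   # sample 3 (pack 3)
]
pack_rows = [[0, 1], [2], [3]]

total_cells = sum(len(row) for row in composition)
notes_played = sum(sum(row) for row in composition)

density = notes_played / total_cells if total_cells else 0.0

pack_counts = [sum(sum(composition[i]) for i in rows) for rows in pack_rows]
pack_ratios = [count / notes_played if notes_played else 0.0 for count in pack_counts]

print(density)      # 0.3125 (5 notes out of 16 possible)
print(pack_ratios)  # [0.4, 0.4, 0.2] -- sums to 1.0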

QUERYING THE DATABASE

With many user submissions in the database, we can query and sort them using any of the data fields they contain. This is the process that we've been calling "meta-composition"... composition with your compositions.

MongoDB has a flexible and straightforward way to query a database. We construct a query from a couple of pieces: a filter document and a sort specification.

We're using TouchDesigner to compile and render out meta-compositions (more on that in a moment). TouchDesigner makes extensive use of the Python scripting language: parameters can be linked using Python, and just about anything in TouchDesigner can be scripted with Python. Therefore, for database querying, we're primarily using pymongo, the Python MongoDB driver.
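
For context, the query examples below assume a pymongo connection along these lines (the URI, database, and collection names here are placeholders, not necessarily the ones MD uses):

import pymongo

# Connect to the local MongoDB instance and grab the submissions collection.
# The URI, database, and collection names are placeholders.
client = pymongo.MongoClient('mongodb://localhost:27017')
db = client['massively_distributed']
collection = db['submissions']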

QUERY EXAMPLES

If we want to get ALL submissions:

allSubmissions = collection.find({})

And if we want to start narrowing our submissions based on some values:

someSubmissions = collection.find({'userSetFx': True})

someSubmissions now contains all user submissions where the user modified the effects processing.

If we want to further hone our query, we can provide multiple key/value pairs to be satisfied:

someSubmissions = collection.find({ 'userSetFx': True, 'tempo': 60, 'compLength': 12, })

someSubmissions now contains only submissions that have non-default effects processing, are 12 beats long, and have tempo set to 60 bpm. Finally, we can also sort the returned submissions in any way we can imagine:

someSubmissions = collection.find({ 'userSetFx': True, 'tempo': 60, 'compLength': 12, }, sort=[('density', pymongo.ASCENDING)])

This does the same as before (the same query), but will sort all returned submissions by increasing density.
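
Each find() call returns a cursor that we can iterate. For example, to walk the sorted results and read back a few stored fields (title here is an assumed key name; tempo and density appear in the queries above):

# Walk the sorted cursor and read back a few fields from each submission.
for submission in someSubmissions:
    print(submission.get('title', 'Untitled'),
          submission.get('tempo'),
          submission.get('density'))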

META COMPOSING WITH QUERIES

The meta-composition process begins with constructing one or more database queries (including sort functions). These queries will be used in order to create segments of the meta-composition.

With these queries prepared, we can make calls to the database from within TouchDesigner.
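
As a loose sketch of that idea (not MASARY's actual code), a meta-composition can be described as an ordered list of filter/sort pairs, each of which produces one segment:

# Hedged sketch: each segment of the meta-composition is defined by a filter
# and a sort; running them in order and concatenating the results yields the
# full sequence of user compositions. The structure is illustrative only.
segments = [
    ({'tempo': 60, 'compLength': 12}, [('density', pymongo.ASCENDING)]),
    ({'userSetFx': True},             [('density', pymongo.DESCENDING)]),
]

meta_composition = []
for query_filter, sort_spec in segments:
    meta_composition.extend(collection.find(query_filter, sort=sort_spec))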

TOUCHDESIGNER

 

As I mentioned above, TouchDesigner makes extensive use of Python; it's possible to script just about anything in TouchDesigner with Python.

This high-level view of the meta-composition network shows the general flow of data. The pink nodes, starting on the left, are responsible for querying the database and preparing all of the user compositions. Green nodes are channel operators, used for individual numeric parameters; they include functions for scaling values as well as applying smoothing filters. The blue/purple nodes in the upper right are video nodes ("Texture Operators"). Data flowing into these video nodes triggers sample playback, applies effects, and composites the final output.
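
As a rough sketch of the kind of script that could sit behind those database-querying nodes (the node name and fields below are our placeholders, not the actual network), a Text DAT can run the query and write the results into a Table DAT for the rest of the network to read:

# Hypothetical TouchDesigner Text DAT script: query MongoDB and fill a
# Table DAT named 'submissions' (node name and fields are placeholders).
import pymongo

client = pymongo.MongoClient('mongodb://localhost:27017')
collection = client['massively_distributed']['submissions']

table = op('submissions')
table.clear()
table.appendRow(['title', 'tempo', 'compLength', 'density'])

for doc in collection.find({'userSetFx': True},
                           sort=[('density', pymongo.ASCENDING)]):
    table.appendRow([doc.get('title', 'Untitled'),
                     doc.get('tempo'),
                     doc.get('compLength'),
                     doc.get('density')])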

Creating a meta-composition segment.

The custom node seen here, at left, is responsible for performing the query. Using this UI, we can construct our meta-composition segment (by pseudo-randomly skipping or repeating given compositions within the query response), and append the resulting segment to the complete meta-composition (seen at center-bottom).

 

All of the visual effects available through the web-app have been ported to TouchDesigner as GLSL shaders. WebGL (which is based on OpenGL ES 2.0) differs in places from the desktop OpenGL API: both use GLSL as their shader language, but each dialect has its own quirks. We did our best to port everything faithfully between the two platforms.

One major difference is simply the amount of processing power available. Mobile devices have access to far less memory and compute power than the desktop machine we're using to assemble the meta-compositions. Luckily, all our effects were written first for the web-app; porting them to the desktop just eased all of the constraints and allowed us to think less about memory limits and more about making them look great.

Putting it all together, TouchDesigner pulls all the user-submitted compositions necessary for playback and sets all necessary parameters. In this image (immediately above), the mixer state can be seen at bottom-left; the current effects state is directly above that. At top-left, the composition itself can be seen as a 2D array of True/False values, indicating where notes are played for each available sample.

The background of this image is the current meta-composition frame being rendered.

Jeremy Stewart

November 2020

 

More meta-compositions can be found on MASARY's Vimeo, and more info on Massively Distributed can be found HERE.

 
 
