
[Experiment] On-AI-R #1: Camille - Complex AI-Driven Musical Performance


A complex, AI-driven live-style performance, introducing Camille.

In her performance, gestures control the harmony, and AI lip/hand transfer aligns the avatar to the music. I recorded the performance from multiple angles and mapped lip and hand cues in an attempt to push “AI musical avatars” beyond simple lip-sync into complex performance control.
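For anyone wondering what “gestures control harmony” can mean concretely, here’s a minimal, hypothetical sketch of the idea in Python using the mido library: quantize a hand height to a scale degree and send the resulting chord as MIDI notes, which a plugin like Harmony Engine (in MIDI-controlled mode) can follow. The port name, scale, and mapping are all made up for illustration; in the actual piece this kind of logic runs inside TouchDesigner.

```python
# Hedged sketch of the gesture-to-harmony idea, NOT the actual patch.
# Assumes a normalized hand height (0..1, e.g. from Kinect skeleton data)
# and a virtual MIDI port that the harmony plugin listens on.
import mido

PORT_NAME = "HarmonyEngine In"          # hypothetical virtual MIDI port
C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # one octave of scale degrees

def hand_height_to_chord(height: float) -> list[int]:
    """Quantize a 0..1 hand height to a scale degree, return a naive triad."""
    height = min(max(height, 0.0), 1.0)
    degree = int(height * (len(C_MAJOR) - 1))
    root = C_MAJOR[degree]
    return [root, root + 4, root + 7]   # major triad on the chosen degree

def play_chord(port: mido.ports.BaseOutput, height: float) -> None:
    # Note-offs omitted for brevity; a real patch would schedule them too.
    for note in hand_height_to_chord(height):
        port.send(mido.Message("note_on", note=note, velocity=90))

with mido.open_output(PORT_NAME) as port:
    play_chord(port, 0.75)  # e.g. a raised hand -> higher harmony stack
```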

Tools: TouchDesigner + Kinect V2 + Ableton Live + Antares Harmony Engine → UDIO (remix) → Ableton again | Midjourney → Kling → Runway Act-Two (lip/gesture transfer) → Adobe (Premiere/AE/PS). Also used Hailuo + Nano-Banana.

Not even remotely perfect, I know, but I really wanted to test how far this pipeline could take me in this particular niche. WAN 2.2 Animate just dropped and seems a bit better for gesture control; looking forward to testing it in the near future. Character consistency with this amount of movement in Act-Two is the hardest pain-in-the-ass I’ve experienced in AI usage so far. [As, unfortunately, you may have already noticed.]

On the other hand, if you have a Kinect lying around: the Kinect-Controlled-Instrument System is freely available as of today. Kinect → TouchDesigner turns gestures into MIDI in real time, so Ableton can treat your hands like a controller: trigger notes, move filters, or drive Harmony Engine for stacked vocals (as in this piece; see the sketch after the links below). You can access it through:
/on-ai-r-1-ai-4-140108374, or the full tutorial at: [Tutorial] Movement Controlled Instruments...
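If you want a feel for what the TouchDesigner side does before grabbing the pack, here’s a rough sketch (not the released patch): a CHOP Execute DAT watching Kinect skeleton channels and pushing MIDI CCs out through a MIDI Out DAT that Ableton sees as a controller. Operator names, channel names, and CC assignments are illustrative only.

```python
# CHOP Execute DAT callback sketch, under assumptions:
# a Kinect CHOP feeding this DAT, and a MIDI Out DAT named 'midiout1'
# routed to Ableton. Names/mappings are made up for illustration.
def onValueChange(channel, sampleIndex, val, prev):
    midi = op('midiout1')  # MIDI Out DAT
    if channel.name == 'p1/hand_r:ty':
        # Right-hand height (roughly -1..1) -> CC1, e.g. a filter cutoff
        cc = int(max(0.0, min(1.0, (val + 1) / 2)) * 127)
        midi.sendControl(1, 1, cc)   # MIDI channel 1, controller 1
    elif channel.name == 'p1/hand_l:ty':
        # Left-hand height -> CC2, e.g. a harmony/voicing amount
        cc = int(max(0.0, min(1.0, (val + 1) / 2)) * 127)
        midi.sendControl(1, 2, cc)
    return
```

In Ableton, MIDI-map those CCs to whatever you like (a filter, a send, a Harmony Engine parameter) and your hands become the controller.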

Also: a silly 4-track EP (including this piece) is free on Patreon.
