The crystal illusion in the Instagram video included experimentation with one of Zoe's favorite sensors - the Leap Motion. Released in 2013, the Leap Motion has been an interesting sensor that's been big on promise, but a little short on adoption. That's not Leap's fault; our collective notions of human-computer interaction have stayed strangely immobile despite an ever-growing field of alternative input devices. This particular sensor, however, has found a growing place in the VR market as a way to bring embodied experiences back into the headset.
To measure depth, the sensor uses two wide-angle infrared cameras and a bit of math to construct an estimate of joint locations out of the image data. The resulting data is then available as a stream of changing values that other applications can use. In TouchDesigner you can use the Leap Motion CHOP to grab this data stream and use it to drive all sorts of interactive elements.
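To give a concrete sense of what working with that stream looks like, here's a minimal sketch of pulling channel values out of a Leap Motion CHOP with TouchDesigner's Python scripting - say, from an Execute DAT - and using them to drive another operator's parameters. The operator names ('leapmotion1', 'geo1') and the channel names are assumptions; check the channels your Leap Motion CHOP actually reports.

```python
# A minimal sketch: reading fingertip position channels from a Leap Motion CHOP
# inside an Execute DAT callback and using them to move a Geometry COMP.
# Operator names ('leapmotion1', 'geo1') and channel names are assumptions;
# check the actual channels your Leap Motion CHOP outputs.

def onFrameStart(frame):
    leap = op('leapmotion1')   # assumed name of the Leap Motion CHOP
    geo = op('geo1')           # assumed Geometry COMP to drive

    # Bail out if the CHOP isn't found or no hand is being tracked yet
    if leap is None or leap.numChans == 0:
        return

    # chan() returns None if the named channel doesn't exist
    tx = leap.chan('hand1/finger1:tx')   # assumed fingertip x channel
    ty = leap.chan('hand1/finger1:ty')   # assumed fingertip y channel

    if tx is not None and ty is not None:
        # Drive the geometry's translate parameters with the fingertip position
        geo.par.tx = tx.eval()
        geo.par.ty = ty.eval()
    return
```

In practice you'd often skip the script entirely and reference a channel directly in a parameter expression - something like `op('leapmotion1')['hand1/finger1:tx']` - but the scripted version makes the flow of data a little easier to see.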
Zoe has used this particular sensor in a number of installations and explorations, and has kept experimenting with this input device. In their installation experience in {cantation} in {translation}, Zoe used the Leap Motion as an embedded sensor that invited participants to interact with a futuristic oracle.
In this set of videos - Zoe's first online tutorials - they take a quick look at what data comes off of the sensor, how you might visualize fingertip location (videos 1-3), and how to achieve some 2D illusions using 3D data (videos 4-5).
/RESEARCH
Depth-Map Generation using Pixel Matching in Stereoscopic Pair of Images