Working on a TD project with CamSchnappr and a Vive to dynamically map some cubes. I'm a beginner to TouchDesigner, but I have previous experience with 3D engines and node-based design tools.
I've got the wireless trackers sending data to TD through OpenVR. I also have CamSchnappr working and mapped. Now I want to link the two so I can move the box around.
So, from my understanding, I would store the start position of the controller, then find the delta when I move it. I'd send this to a Transform CHOP and then apply it to an object? My question is how to actually implement this: I'm not quite sure what to feed into a Transform CHOP or where its output should go. I've attached what I've got so far, which is a combination of multiple tutorials.
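To be clear about what I mean, here is the idea in plain Python (outside TD, with made-up positions; in TD this would presumably be done with CHOPs, but the math is the same):

```python
def make_delta_tracker(start_pos):
    """Remember the controller position at grab time; return a
    function mapping a new position to its offset from the start."""
    sx, sy, sz = start_pos
    def delta(pos):
        x, y, z = pos
        return (x - sx, y - sy, z - sz)
    return delta

# "grab" the box while the controller sits at (1.0, 0.5, 2.0)
delta = make_delta_tracker((1.0, 0.5, 2.0))

# controller has moved to (1.5, 0.5, 1.0)
print(delta((1.5, 0.5, 1.0)))  # -> (0.5, 0.0, -1.0), the box's offset
```

The resulting offset would then be added to the box's resting transform, which is what I imagine the Transform CHOP is for.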
I want to be able to move the box, so what do I attach the Transform CHOP to?
Alternatively, if there is a way to set up the room to show the true 3D position, and to calibrate it so that I can use the absolute position rather than the position delta, that would be even better.
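By "calibrate" I mean something like the following sketch, assuming (maybe naively) that tracker space and scene space differ only by a translation, so one measured reference point is enough; a real calibration might also need rotation and scale:

```python
def make_calibration(tracker_at_ref, scene_ref):
    """Build a mapping from tracker coordinates to scene coordinates
    using one measurement: the tracker reading taken while the
    tracker is physically held at a known scene reference point."""
    offset = tuple(s - t for s, t in zip(scene_ref, tracker_at_ref))
    def to_scene(tracker_pos):
        return tuple(p + o for p, o in zip(tracker_pos, offset))
    return to_scene

# tracker reads (0.2, 1.1, -0.4) while held at the scene origin
to_scene = make_calibration((0.2, 1.1, -0.4), (0.0, 0.0, 0.0))

# later readings map to absolute scene positions directly;
# this one should come out roughly (1, 0, 0)
print(to_scene((1.2, 1.1, -0.4)))
```

With something like this in place, the tracker's absolute position could drive the box directly, with no need to store a start position.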
Hi! I'm having the same problem. I connected an HTC Vive tracker through ViveController and applied the tx, ty, and tz values directly to the Geometry COMP where I imported my model. Now the model (a cube) shows up in CamSchnappr and in the Window, but rotated, and not the way I want. And I've failed to calibrate it no matter how hard I've tried.
So could you please describe how one would calibrate it with CamSchnappr?
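In case it helps narrow things down, my guess is an axis mismatch between OpenVR space and the calibrated scene, so I've been experimenting with applying a fixed rotation to the tracker position before using it. A plain-Python version of that idea (the 90-degree angle is just an example, not a known fix):

```python
import math

def rotate_y(pos, degrees):
    """Rotate a point about the Y (up) axis by the given angle,
    using a standard right-handed rotation."""
    x, y, z = pos
    a = math.radians(degrees)
    c, s = math.cos(a), math.sin(a)
    return (c * x + s * z, y, -s * x + c * z)

# a point one unit along +X, swung 90 degrees about Y,
# ends up (to floating-point precision) one unit along -Z
print(rotate_y((1.0, 0.0, 0.0), 90.0))
```

But guessing angles by hand hasn't worked for me, which is why I'd love to hear the proper calibration procedure.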