Vive Tracker

Hi!
I’m working with Vive Trackers (without a Vive headset) as a tracking system for objects, and I’m stuck trying to find the best way to calibrate their raw rotation/position to the room.
In the room I defined an arbitrary “calibration position”: a reference point shared by the physical room and its 3D representation.
When the tracker is placed on this reference point, I would like to compute the “offset” between the raw tracker values and my reference point. I could then apply that transformation to the raw values so the tracker moves in the 3D scene the same way it does in reality.
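The offset you describe is just a matrix relationship, so here is a minimal numpy sketch of the idea (the `pose` helper and all values are hypothetical stand-ins for the raw OpenVR pose and your scene’s reference point; the math, not the names, is the point):

```python
import numpy as np

def pose(tx, ty, tz, yaw_deg):
    """Build a 4x4 homogeneous transform: yaw rotation plus translation."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    m = np.eye(4)
    m[:3, :3] = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
    m[:3, 3] = [tx, ty, tz]
    return m

# Raw tracker pose captured while it sits on the calibration point.
raw_at_ref = pose(1.2, 0.1, -0.5, 30.0)
# Where that calibration point lives in the 3D scene.
ref = pose(0.0, 0.0, 0.0, 0.0)

# One-time calibration: the offset maps raw tracker space into scene space.
offset = ref @ np.linalg.inv(raw_at_ref)

# Every frame afterwards, apply the same offset to the live raw pose.
raw_now = pose(2.0, 0.1, -0.5, 30.0)
calibrated = offset @ raw_now
```

In TouchDesigner terms, `offset` is what you would bake into the Pre-Transform of the Geometry COMP while the live raw values keep driving the regular transform.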

I tried several methods but none of them has worked. I guess I should use an Object CHOP and the Pre X-Form parameters on a Geometry COMP, but I can’t find the right setup!

Could someone help me?
calibration_test.5.toe (5.98 KB)


I updated the example file with recorded data from OpenVR, so everyone can see what’s happening inside without needing a Vive setup.
calibration_test.16.toe (109 KB)


Hi,
I am trying to achieve the same thing. Were you able to figure it out?

In a rush, so I haven’t downloaded your file, but check out the Python methods on the Geometry COMP: [url]https://www.derivative.ca/wiki099/index.php?title=GeometryCOMP_Class[/url], particularly computeBounds(), localTransform, worldTransform, setTransform(matrix), and relativeTransform(target). I used these methods for exactly these sorts of relationships while working with my own Vive Trackers.
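To make the semantics of `relativeTransform(target)` concrete: conceptually it gives you this object’s transform expressed in the target’s space. A plain numpy sketch of that relationship (assuming 4×4 matrices in the column-vector convention; the `translate` helper and all values are made up for illustration, and the exact multiplication order inside TD may differ):

```python
import numpy as np

def translate(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Stand-ins for trackerGeo.worldTransform and referenceGeo.worldTransform.
tracker_world = translate(2.0, 1.0, 0.0)
ref_world = translate(0.5, 1.0, 0.0)

# What relativeTransform(target) expresses: tracker in the reference's space.
rel = np.linalg.inv(ref_world) @ tracker_world

# Sanity check: composing the target's world transform with rel
# recovers the tracker's world transform.
assert np.allclose(ref_world @ rel, tracker_world)
```

Storing `rel` once at calibration time, then feeding it to `setTransform(matrix)` on a parent COMP, is one way to wire this up inside TouchDesigner.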
