Ok, I managed to make it work.
Here is what I learned: the Kinect SDK outputs quaternions in a very specific setup (the one you will probably find referenced in the Microsoft forums, and which has given headaches to a lot of people). But from those, each program uses its own implementation to get usable angles, which will depend on the coordinate system of the given program, among other things. The same is true of Touch: the Kinect CHOP outputs angles, which means that internally the quaternions are being interpreted in a way chosen by Derivative.
That means that the skeleton requirements for TD are particular and different from other environments, and using a working implementation from anywhere else as a reference (including the official examples from Microsoft, or for example Unity or Unreal, which is what I was doing) will be wrong and misleading.
So, here are the particulars for a working puppeteered skeleton:
1.- All bones are aligned on the Y axis, pointing to their child (Y is the roll axis).
2.- The coordinate system should be Y up, X to the right, Z to the back. Rigging this setup is easier in C4D, where the coordinate system matches, than in, say, Maya, where it doesn't and you won't be able to get the axis rotations right.
3.- The whole spine works with the default Y alignment from Cinema (Y aligned to the child, X to the right, Z to the back).
4.- The leg joints have a different rotation. The axis orientation of the femur and shin joints should be rotated 90 degrees on Y so that the Z axis points to the inner part of the leg. That means that for the right bones X points to the back, and for the left bones X points to the front.
5.- The clavicle, forearm and hand joints are aligned so that Y points to the child and Z points backwards. For the right bones X will point up, and for the left, X will point down.
6.- The humerus is, for some reason, a special case: it is rotated 90 degrees on Y. Z should point down. On the right humerus X should point back, and on the left, X to the front.
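The per-joint conventions above can be collected into a lookup table, which makes them easier to check against a rig programmatically. A minimal Python sketch; the joint names and the tuple layout are my own shorthand, not a TouchDesigner API:

```python
# Axis conventions for the Kinect CHOP skeleton described above.
# Each entry: (Y axis target, Z axis direction, X axis direction).
# Names and tuple layout are illustrative shorthand, not a TouchDesigner API.
AXIS_CONVENTIONS = {
    "spine":      ("child", "back",  "right"),  # default Cinema alignment
    "femur_r":    ("child", "inner", "back"),   # legs: rotated 90 deg on Y
    "femur_l":    ("child", "inner", "front"),
    "shin_r":     ("child", "inner", "back"),
    "shin_l":     ("child", "inner", "front"),
    "clavicle_r": ("child", "back",  "up"),
    "clavicle_l": ("child", "back",  "down"),
    "humerus_r":  ("child", "down",  "back"),   # special case: 90 deg on Y
    "humerus_l":  ("child", "down",  "front"),
    "forearm_r":  ("child", "back",  "up"),
    "forearm_l":  ("child", "back",  "down"),
    "hand_r":     ("child", "back",  "up"),
    "hand_l":     ("child", "back",  "down"),
}
```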
The hierarchy is:
Base bone (located at the ground; it will get the translation information)
pelvis_l > femur_l > shin_l > foot_l
pelvis_r > femur_r > shin_r > foot_r
lowerback
upperback
neck > head
clavicle_r > humerus_r > forearm_r > hand_r
clavicle_l > humerus_l > forearm_l > hand_l
I'm attaching a simple working skeleton as an example. It will work if you import it into the skel_rel.toe example from this thread, enable the export flag on the select, and use "chan name is par:parameter" as the export method. The skinning is bad, as it was made in a hurry as an example.
Now, this still has some problems, and I hope someone can help me fix them.
1.- The orientation data is somewhat noisy. Since we are getting rolled angles from the Kinect CHOP, they can't easily be lagged or filtered. I don't know if the quaternions are being filtered internally, which is the method most implementations suggest. If they are, I believe the filtering could be improved on Derivative's end, because good orientation filtering needs the quaternions.
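To illustrate why the quaternions matter here: a common approach, if raw quaternions were exposed, is a one-pole low-pass that slerps the previous filtered value toward each new sample. This is a standalone sketch in plain Python (no TouchDesigner API assumed), not what the Kinect CHOP actually does internally:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    # Take the shorter arc: q and -q represent the same rotation.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: fall back to normalized linear interpolation.
        out = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

class QuatFilter:
    """One-pole low-pass on orientations: slerp the estimate toward each sample."""
    def __init__(self, smoothing=0.25):
        self.smoothing = smoothing  # 0 = frozen, 1 = no filtering
        self.state = None

    def update(self, q):
        if self.state is None:
            self.state = q
        else:
            self.state = slerp(self.state, q, self.smoothing)
        return self.state
```

Filtering like this happens on the sphere of rotations, so it never produces the discontinuities you get when low-passing Euler angles near a wrap-around; that is why it would have to be applied before the quaternions are converted to angles.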
2.- If the Kinect can't see your thumb, it won't get the hand orientation right, so it will flip the hand randomly, breaking the elbow. This is the most noticeable problem, as it happens any time you don't have an open palm clearly facing the sensor. I think this could be fixed by detecting whether the hand is open or closed from the interaction data in the Kinect CHOP, and defaulting the orientation if it is closed or the hand is not tracked. I still haven't had time to implement this; if someone has a working implementation they could share, that would be great. This also happens, to a lesser degree, with the foot, breaking the knee.
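The fallback idea could look something like this: hold the last reliable hand orientation whenever the hand is closed or untracked, instead of passing the flipped values through. The state names and the rotation format below are hypothetical placeholders, not actual Kinect CHOP channel values; adapt them to whatever your CHOP exposes:

```python
# Gate the hand orientation on tracking state: only accept new rotations
# while the hand is reliably tracked, otherwise hold the last good value.
# "open" is a hypothetical state label, not an actual Kinect CHOP value.
TRACKED_OPEN = "open"

class HandOrientationGate:
    def __init__(self, default_rotation=(0.0, 0.0, 0.0)):
        self.last_good = default_rotation  # (rx, ry, rz) in degrees

    def update(self, hand_state, rotation):
        """Pass rotation through only when the hand state is trustworthy."""
        if hand_state == TRACKED_OPEN:
            self.last_good = rotation
        return self.last_good
```

The same gating could be applied to the foot using whatever confidence signal is available for that joint, since the knee breaks the same way.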
3.- Most importantly, I'm not getting good orientation data for the upperback, neck or head, so the spine is a solid line and the head can't rotate. The weird thing is that this also happens in the example shared by Derivative. So it might be a bug in the Kinect CHOP? Some orientations not being interpreted correctly? Or am I doing something wrong?
This last point is where I really need help, because the character looks really strange if it can't move its head. Has anyone managed to use the orientation data to rotate the head and neck?
Another issue is that when the user turns to face away from the sensor, the clavicles, arms and pelvis rotate correctly, but the spine doesn't, so the head and chest still face the camera. This may be related to the previous issue.
I hope this info is useful to someone and saves some time. I also hope someone can point me to a solution for the issues I listed, in particular the last one.
character_test_corrected.fbx (324 KB)