Drive a skeleton with Kinect

The rotation channels are named for the bone whose joint is at the start of the bone. So the ‘shin’ bone’s rotation will be your knee, and the ‘upperarm’ rotation will be your shoulder. The reason they are named like this is the joints that are forks, such as the hip and the shoulder center: those need 2 or 3 rotations, one for left and one for right (plus the neck for the shoulder center), so to keep the naming consistent they are named pelvis and clavicle instead.

The hierarchy is root → spine1 → spine2 → neck/clavicles. That is, the neck/clavicles are attached at the end of spine2.

Thanks Malcolm… is there any graphic reference for that? Just to avoid binding the wrong parameters.

Does each child bone’s pivot in the Kinect CHOP also change relative to its parent, like what’s described at Microsoft’s developer center?
IC584403.png

That .toe has been a great help in understanding how to use nested geometry in TD, thanks. The issue I’m finding is that I can’t determine the xyz co-ordinates of COMPs at the bottom of the chain. For example, if I want to use a moving skeleton as a collision object for a particle system, how is this achievable? I’ve tried adding all the co-ordinates and rotations together for the whole chain, but the results have been disappointingly random :confused:

Use an Object CHOP to get the final position of a joint after all the rotations have been applied.
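A minimal sketch of that idea in TouchDesigner Python, assuming placeholder paths like geo1/hand_l for your own network (the Object CHOP parameter and menu tokens here are from memory and should be double-checked against your build):

```python
# Hedged sketch: read a joint COMP's world-space position with an Object CHOP.
# 'geo1/hand_l' and '/project1' are placeholder paths, not names from the thread.
obj = op('/project1').create(objectCHOP, 'hand_pos')
obj.par.targetobject = 'geo1/hand_l'    # COMP at the bottom of the bone chain
obj.par.referenceobject = '/project1'   # measure relative to this COMP
obj.par.compute = 'position'            # assumed menu token: output tx/ty/tz
# The resulting tx/ty/tz channels give the joint's final position after all
# parent rotations, ready to drive a particle system's collision geometry.
```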

Hello!

Has anyone managed to get a working puppeteered character driven by the relative rotations given by the Kinect CHOP?

I am trying, without success, to understand the way the rotations are calculated. From what I understand, all bones have their axes aligned along the Y axis (because when all rotations are set to 0, the bones point upward). But from all the diagrams that I have been able to find, it seems that each joint has its own orientation, different from other joints, so each axis points in a different direction except the Y axis.

I have found this link to be really useful, but I still haven’t managed to make it work:
Meaning of Rotation Data of K4W v2

Does a working example of a puppeteered skeleton exist?

Thank you.

Ok, I managed to make it work.

Here is what I learned: the Kinect SDK outputs quaternions in a very specific setup (which is the one you will probably find referenced in the Microsoft forums and which has given headaches to a lot of people). But from those, each program uses its own implementation to get usable angles, which will depend on the coordinate system of the given program, among other things. The same is true with Touch: the Kinect CHOP outputs angles, which means that internally the quaternions are being interpreted in a way chosen by Derivative.

That means that the skeleton requirements for TD are particular and different from other environments, and using a working implementation from anywhere else as a reference (including the official examples from Microsoft, or for example Unity or Unreal, which is what I was doing) will be wrong and misleading.

So, here are the particulars for a working puppeteered skeleton (the axes are summarized in a sketch after this list):

1.- All bones are aligned on the Y axis pointing to their child (Y is the roll axis).
2.- The coordinate system should be Y upwards, X to the right, Z to the back. Rigging this setup will be easier in C4D, where the coordinate system matches, than in, say, Maya, where it doesn’t and you won’t be able to get the axis rotations right.
3.- The whole spine works with the default Y alignment from Cinema (Y aligned to the child, X to the right, Z to the back).
4.- The leg joints have a different rotation. The axis orientation of the femur and shin joints should be rotated 90 degrees on Y so that the Z axis points to the inner part of the leg. That means that for the right bones X points to the back, and for the left bones X points to the front.
5.- The clavicle, forearm and hand joints are aligned so that Y points to the child and Z points backwards. For the right bones, X will point up, and for the left, X will point down.
6.- The humeri are, for some reason, a special case, as they are rotated 90 degrees on Y. Z should point down. On the right humerus, X should point back, and on the left, X to the front.
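For quick reference, here is that list condensed into a small Python table. This is my own summary of the points above, not data from the Kinect CHOP itself:

```python
# Per-joint local axis conventions from the list above (right side shown;
# for the left side, flip the X direction as described in points 4-6).
JOINT_AXES_RIGHT = {
    'spine':    {'Y': 'toward child', 'X': 'right', 'Z': 'back'},       # point 3
    'femur':    {'Y': 'toward child', 'X': 'back',  'Z': 'inner leg'},  # point 4
    'shin':     {'Y': 'toward child', 'X': 'back',  'Z': 'inner leg'},  # point 4
    'clavicle': {'Y': 'toward child', 'X': 'up',    'Z': 'back'},       # point 5
    'humerus':  {'Y': 'toward child', 'X': 'back',  'Z': 'down'},       # point 6
    'forearm':  {'Y': 'toward child', 'X': 'up',    'Z': 'back'},       # point 5
    'hand':     {'Y': 'toward child', 'X': 'up',    'Z': 'back'},       # point 5
}
```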

The hierarchy is:

Base bone (located at the ground; it will get the translation information)
  pelvis_l > femur_l > shin_l > foot_l
  pelvis_r > femur_r > shin_r > foot_r
  lowerback
    upperback
      neck > head
      clavicle_r > humerus_r > forearm_r > hand_r
      clavicle_l > humerus_l > forearm_l > hand_l

I’m attaching a simple working skeleton as an example. It will work if you import it into the skel_rel.toe example from this thread, enable the export flag on the Select CHOP, and use ‘chan name is par:parameter’ as the export method (a sketch of that wiring follows below). The skinning is bad, as it was made in a hurry as an example.
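For anyone unfamiliar with that export method, here is a hedged sketch of the idea. All channel and node names below are illustrative guesses, not the Kinect CHOP’s documented naming:

```python
# Hedged sketch of the export wiring: with the 'chan name is par:parameter'
# export method, a channel named '<node path>:<parameter>' drives that
# parameter once the CHOP's export flag is enabled.
sel = op('/project1').create(selectCHOP, 'select1')
sel.par.renamefrom = 'p1/shin_l:rx'   # incoming rotation channel (assumed name)
sel.par.renameto = 'geo1/shin_l:rx'   # target bone COMP path + rx parameter
# Finally, turn on the export flag on select1 in the network editor.
```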

Now, this still has some problems, and I hope someone can help me fix them.

1.- The orientation data is somewhat noisy. Since we are getting rolled angles from the Kinect CHOP, they can’t easily be lagged or filtered. I don’t know if internally the quaternions are being filtered, which is the method most implementations suggest. If they are, I do believe the filtering could be improved on Derivative’s end, because good filtering for orientation would need the quaternions.

2.- If the Kinect can’t see your thumb, it won’t get the hand orientation right, so it will flip the hand randomly, breaking the elbow. This is the most noticeable problem, as it happens any time you don’t have an open palm clearly facing the sensor. I think this could be fixed by detecting whether the hand is open or closed in the interaction data from the Kinect CHOP, and defaulting the orientation if it is closed or if the hand is not tracked (see the sketch after this list). I still haven’t had time to implement this, so if someone has a working implementation that you could share it would be great. This also happens to a lesser degree with the foot, breaking the knee.

3.- Most importantly, I’m not getting good orientation data for the upperback, neck or head. So the spine is a solid line and the head can’t rotate. The weird thing is that this also happens in the example shared by Derivative. So it might be a bug in the Kinect CHOP? Some orientations not being interpreted correctly? Or am I doing something wrong?
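On point 2, a hedged sketch of the hold-the-last-good-orientation idea. The hand-state input and its meaning are assumptions; check what your Kinect CHOP actually outputs:

```python
# Hedged sketch: if the hand is closed or untracked, keep the previous rotation
# instead of the unreliable new sample. 'hand_is_open' is an assumed flag
# derived from the Kinect CHOP's interaction/hand-state channels.
last_good = {}

def hold_hand_rotation(joint, hand_is_open, rx, ry, rz):
    if hand_is_open:
        last_good[joint] = (rx, ry, rz)          # remember the good sample
    return last_good.get(joint, (rx, ry, rz))    # else reuse the last good one
```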

Point 3 is where I really need help, because the character looks really strange if it can’t move its head. Has anyone managed to use the orientation data to rotate the head and neck?

Another issue is that when the user turns and faces away from the sensor, the clavicles, arms and pelvis will correctly rotate, but the spine won’t, so the head and chest will still face the camera. Maybe this is related to the last issue.

I hope this info is useful to someone and saves some time. I also hope that someone can point me to a solution to the issues I listed, in particular the last one.
character_test_corrected.fbx (324 KB)

Continuing the tests on neck and head rotation, I notice that if I scale the values for those rotations by, say, 20, then I can see that they are actually moving, but the direction of the rotation does not correspond to reality.

For example, if you tilt your head on its Z axis so that your ear touches your shoulder, which in theory should be a turn on the relative Z axis, you will notice that the value that is actually changing is the Y rotation, and the character on screen turns its head to face sideways. If you face sideways, the character’s head will turn down.

So it appears that there may be a scaling problem, and also that the rotations are wrongly assigned.

If I realign the bones to try to match the changes that I’m seeing with their axes, the starting position of the head will be wrong (because the Y axis must be the roll axis).

Could someone from Derivative confirm if this is the intended behaviour of the node?

Hello,

I solved the noise and flipping-bones issue by implementing a quaternion One Euro Filter on each rotation. The flipping still occurs, but it’s smoothed enough to feel like part of the interaction. It feels much more natural now.

I am sharing in the attachment a first version of this quaternion One Euro filter. The implementation was based on the VRPN library (https://github.com/vrpn/vrpn/blob/master/vrpn_OneEuroFilter.h, ported from C++ to TD CHOPs), but I replaced the slerp with a quaternion nlerp for speed, as it yields great results in real time and is less computationally expensive than slerp. Please note that the filter expects four channels named x y z w for each quaternion or else it won’t work, so you must rename the channels when inputting and outputting the values.
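For readers who don’t want to open the .tox, here is a minimal Python sketch of a quaternion One Euro filter with the nlerp substitution described above. It is my own condensation of the vrpn_OneEuroFilter.h structure, not the code in the attachment; parameter names follow the One Euro paper:

```python
# Condensed quaternion One Euro filter using nlerp instead of slerp.
# Illustrative re-sketch, not the code inside quat_oneEuroFilter_v0.2.tox.
import math

def nlerp(q0, q1, t):
    """Normalized linear interpolation between quaternions (x, y, z, w)."""
    if sum(a * b for a, b in zip(q0, q1)) < 0.0:
        q1 = tuple(-c for c in q1)            # take the shorter arc
    q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
    n = math.sqrt(sum(c * c for c in q)) or 1.0
    return tuple(c / n for c in q)

class QuatOneEuroFilter:
    def __init__(self, min_cutoff=1.0, beta=0.5, d_cutoff=1.0):
        self.min_cutoff = min_cutoff   # baseline cutoff (Hz): lower = smoother
        self.beta = beta               # speed coefficient: higher = less lag
        self.d_cutoff = d_cutoff       # cutoff for the speed estimate itself
        self.prev = None               # last filtered quaternion
        self.dprev = 0.0               # last filtered speed

    @staticmethod
    def _alpha(cutoff, dt):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, q, dt):
        if self.prev is None:          # first sample: nothing to smooth yet
            self.prev = q
            return q
        # Crude speed estimate: component distance per second between samples.
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(q, self.prev)))
        a_d = self._alpha(self.d_cutoff, dt)
        self.dprev = a_d * (dist / dt) + (1.0 - a_d) * self.dprev
        # The faster the joint moves, the higher the cutoff (less smoothing).
        cutoff = self.min_cutoff + self.beta * self.dprev
        self.prev = nlerp(self.prev, q, self._alpha(cutoff, dt))
        return self.prev
```

Each joint would get its own filter instance, called once per cook with dt set to the frame time, fed with the renamed x y z w channels.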

I’m also sharing an implementation example showing a full skeleton with filtered rotations.

More information on the One Euro Filter: http://www.lifl.fr/~casiez/1euro/. Keep in mind that most of the examples there are good for filtering floats, but not vectors or quaternions.

Now, the issue with this filter is cook time. Right now it’s taking around 30 ms to filter a full skeleton, which works but is unacceptable, as it leaves practically no room for further complexity in the network. Running the filter in a separate TD process helps, but not by much. A C++ CHOP implementation could also be possible.

As a suggestion, incorporating rotation filtering into the Kinect CHOP itself would be awesome (see the VRPN code for reference).

There are probably a lot of optimization opportunities I missed in my tox; if you find one feel free to point it out.

I’m still having problems with the character not being able to rotate its head and upper back, and not being able to turn around.
filteredPuppetExample_v1.zip (207 KB)
quat_oneEuroFilter_v0.2.tox (9.4 KB)

Nice work!
Here is my project example:

Has this file worked for anyone? I am still looking for a solution, so if anyone has tips or suggestions, thanks in advance.

Hey all,

Any success using the filteredPuppetExample with a Kinect 1?

struggling!

First, thanks for sharing all this information.
I’m looking at the rotations for the neck and head, and they just never seem to differ much from the upperback when getting the raw absolute values from the Kinect library. I’ve posted a question on the forums about it to see what’s going on there.

I’ve now added a joint rotation parameter for the Kinect 2 that does a simple quaternion slerp between the previous and the current orientations.
I’m going to look into the One Euro smoothing also, as that looks interesting.
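For reference, a minimal sketch of that kind of slerp smoothing (my own illustration, not Derivative’s internal code): each frame, move from the previous orientation toward the new sample by a fixed fraction t.

```python
# Illustrative quaternion slerp smoothing between the previous and current
# orientations.
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between quaternions (x, y, z, w)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:                   # nearly identical orientations
        return q1
    s = math.sin(theta)
    w0 = math.sin((1.0 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return tuple(w0 * a + w1 * b for a, b in zip(q0, q1))

# smoothed = slerp(previous, current, t)
```

A t close to 1 tracks the sensor closely; a smaller t smooths more at the cost of lag.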

Thanks Diego for your rig fbx file… I had the rotations wrong, and I see scaling is also an issue. I managed to pull off this result… it needs some more tweaking.

vimeo.com/152067251

I was wondering if someone has already managed to rig a fully tweaked character with most of the Kinect channels?

I built my mesh in blender and included it with the TOE.
Kinect_mesh.zip (379 KB)

Thanks for that Ian! Just saved a whole bunch of time in a proof of concept.

Bruce

Ian, thanks for the tox, mesh, and rigging. It works well for poses, but something seems off about the scale. I think the Kinect CHOP uses meters, but looking at the ty parameters on the nulls in the tox, I think the tox uses feet. This causes a problem if I take a step of about 1 meter: the mesh thinks I’ve moved 1 foot, so not much motion happens at all. Has anyone seen this issue and remedied it? I tried multiplying all the ty numbers by 0.3048 (converting feet to meters), but that bloated the mesh in a Stay Puft Marshmallow Man way :laughing:
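One direction worth trying (an assumption on my part, not a confirmed fix): convert the incoming Kinect translation channels from meters into the rig’s feet, rather than rescaling the rig’s own ty values, so the bone lengths stay untouched:

```python
# Hedged sketch: scale only the incoming Kinect translation channels from
# meters to feet (1 m = 1/0.3048 ft), leaving the rig's bone lengths alone.
METERS_TO_FEET = 1.0 / 0.3048   # about 3.2808

def kinect_to_rig(tx, ty, tz):
    return (tx * METERS_TO_FEET, ty * METERS_TO_FEET, tz * METERS_TO_FEET)
```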

Oh my gosh!! Thanks a lot!!!

It helps me a loooottt!!!