User Bone Transforms


I’m trying to animate a 3D Ogre model, in this case the golem which is part of the jME3 test data package. The difficult part is that the data to animate the model comes from a skeleton tracker (OpenNI software with the Kinect depth camera). Unfortunately, the data provided is in the form of a position and rotation matrix for every joint, whereas to specify the animation in jME3 (as far as I know) I need to work with the position and rotation of bones.

In order to work out the appropriate rotation for the bones in jME3, I had this idea: for each bone, get the positions of the two joints at its ends and calculate the connecting vector, let’s say v. I then need to create a Quaternion which specifies the orientation of v, and pass it to the bone, which should then rotate appropriately. However, I can’t work out how to create the quaternion in question… Any ideas, or is this a bad approach?
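For what it’s worth, here is a minimal sketch of that quaternion construction, written with plain double arrays instead of jME3’s Vector3f/Quaternion so it’s self-contained. The rest direction (0, 1, 0) is an assumption for illustration; use whatever direction your bone points along in its bind pose.

```java
// Sketch: build the quaternion that rotates the bone's rest direction onto v,
// using axis = rest x v and angle = acos(rest . v). Quaternions are (x, y, z, w).
// In jME3 you'd use com.jme3.math.Quaternion instead of raw arrays.
public class BoneAim {

    // Rotation taking unit vector 'from' onto unit vector 'to'.
    public static double[] rotationBetween(double[] from, double[] to) {
        double dot = from[0]*to[0] + from[1]*to[1] + from[2]*to[2];
        // Axis of rotation: from x to
        double ax = from[1]*to[2] - from[2]*to[1];
        double ay = from[2]*to[0] - from[0]*to[2];
        double az = from[0]*to[1] - from[1]*to[0];
        double axisLen = Math.sqrt(ax*ax + ay*ay + az*az);
        if (axisLen < 1e-9) {
            // Parallel: identity. Antiparallel needs any axis perpendicular
            // to 'from'; X is a placeholder here, pick a real one in production.
            return dot > 0 ? new double[]{0, 0, 0, 1} : new double[]{1, 0, 0, 0};
        }
        double angle = Math.acos(Math.max(-1, Math.min(1, dot)));
        double s = Math.sin(angle / 2) / axisLen; // normalize axis and scale
        return new double[]{ax * s, ay * s, az * s, Math.cos(angle / 2)};
    }

    // Rotate vector v by unit quaternion q: v' = v + w*t + q_xyz x t, t = 2*(q_xyz x v)
    public static double[] rotate(double[] q, double[] v) {
        double x = q[0], y = q[1], z = q[2], w = q[3];
        double tx = 2 * (y*v[2] - z*v[1]);
        double ty = 2 * (z*v[0] - x*v[2]);
        double tz = 2 * (x*v[1] - y*v[0]);
        return new double[]{
            v[0] + w*tx + (y*tz - z*ty),
            v[1] + w*ty + (z*tx - x*tz),
            v[2] + w*tz + (x*ty - y*tx)
        };
    }
}
```

So with v = normalize(jointB − jointA), `rotationBetween(rest, v)` gives a quaternion that swings the rest direction onto v.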



This is animation retargetting, grab a big cup of coffee because you’re gonna need it.

It’s quite complex and there is a lot of math involved.

I once tried to retarget BVH animations to Sinbad’s skeleton, but gave up before succeeding. I think it’s doable though, but the source and target skeletons must have a similar structure.

Your approach is not bad, but you don’t really need that.

In fact, jME Bones are more like joints: each bone has a transform (translation, rotation, scale). So if your source data is joints with translation and rotation, that’s fine.
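Since the tracker hands you rotation matrices while jME bone transforms take quaternions, you’ll also need a matrix-to-quaternion step somewhere. jME3’s Quaternion has a `fromRotationMatrix` method for exactly this; as a self-contained sketch, the standard trace-based conversion looks like:

```java
// Sketch: convert a 3x3 rotation matrix (row-major, m[row][col]) into a
// quaternion (x, y, z, w), branching on the trace for numerical stability.
public class MatToQuat {
    public static double[] toQuaternion(double[][] m) {
        double t = m[0][0] + m[1][1] + m[2][2];
        double x, y, z, w;
        if (t > 0) {
            double s = Math.sqrt(t + 1.0) * 2;                            // s = 4*w
            w = 0.25 * s;
            x = (m[2][1] - m[1][2]) / s;
            y = (m[0][2] - m[2][0]) / s;
            z = (m[1][0] - m[0][1]) / s;
        } else if (m[0][0] > m[1][1] && m[0][0] > m[2][2]) {
            double s = Math.sqrt(1.0 + m[0][0] - m[1][1] - m[2][2]) * 2;  // s = 4*x
            w = (m[2][1] - m[1][2]) / s;
            x = 0.25 * s;
            y = (m[0][1] + m[1][0]) / s;
            z = (m[0][2] + m[2][0]) / s;
        } else if (m[1][1] > m[2][2]) {
            double s = Math.sqrt(1.0 + m[1][1] - m[0][0] - m[2][2]) * 2;  // s = 4*y
            w = (m[0][2] - m[2][0]) / s;
            x = (m[0][1] + m[1][0]) / s;
            y = 0.25 * s;
            z = (m[1][2] + m[2][1]) / s;
        } else {
            double s = Math.sqrt(1.0 + m[2][2] - m[0][0] - m[1][1]) * 2;  // s = 4*z
            w = (m[1][0] - m[0][1]) / s;
            x = (m[0][2] + m[2][0]) / s;
            y = (m[1][2] + m[2][1]) / s;
            z = 0.25 * s;
        }
        return new double[]{x, y, z, w};
    }
}
```

Once you have the quaternion, you should be able to push it onto a bone via `setUserControl(true)` and `setUserTransforms(…)` — check the Bone javadoc for the exact signatures.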

The first tricky part will be matching the skeletons. Oto’s rig is simple, but a bit crappy; the arms are not connected to the spine, for example. You need to match the bones of your source skeleton to Oto’s skeleton (i.e. decide which source bone to use for each target bone).
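In practice that matching step often boils down to a hand-written name map. All bone names below are made up for illustration; inspect both skeletons (e.g. `skeleton.getBone(i).getName()`) for the real ones.

```java
import java.util.Map;

// Hypothetical source-to-target bone name mapping for the matching step.
// Every name here is a placeholder, not an actual Oto or OpenNI bone name.
public class BoneMap {
    public static final Map<String, String> SOURCE_TO_TARGET = Map.of(
        "torso",          "spine",
        "left_shoulder",  "uparm.left",
        "left_elbow",     "arm.left",
        "right_shoulder", "uparm.right",
        "right_elbow",    "arm.right",
        "head",           "head"
    );
}
```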

This can be the first reason to give up.

The next part is adjusting the animation. You have to combine the rotations of the source and target skeletons’ joints. I don’t know the exact formula for this, and all I can say is that once you have it, you have to change all the animation rotations and translations for every keyframe.
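I can’t confirm the exact formula either, but one commonly used scheme (an assumption on my part, not something from this thread) is to express the source animation rotation relative to the source bind pose, then re-apply that delta on top of the target bind pose:

```java
// Sketch of per-keyframe rotation retargeting (assumed scheme, see above):
//   targetAnim = targetBind * inverse(sourceBind) * sourceAnim
// Quaternions are (x, y, z, w) arrays; unit quaternions assumed, so the
// inverse is just the conjugate.
public class Retarget {

    // Hamilton product a * b
    static double[] mul(double[] a, double[] b) {
        return new double[]{
            a[3]*b[0] + a[0]*b[3] + a[1]*b[2] - a[2]*b[1],
            a[3]*b[1] - a[0]*b[2] + a[1]*b[3] + a[2]*b[0],
            a[3]*b[2] + a[0]*b[1] - a[1]*b[0] + a[2]*b[3],
            a[3]*b[3] - a[0]*b[0] - a[1]*b[1] - a[2]*b[2]
        };
    }

    static double[] conjugate(double[] q) {
        return new double[]{-q[0], -q[1], -q[2], q[3]};
    }

    public static double[] retarget(double[] sourceBind, double[] targetBind,
                                    double[] sourceAnim) {
        return mul(targetBind, mul(conjugate(sourceBind), sourceAnim));
    }
}
```

One sanity check: when the two bind poses are identical, this hands the source rotation straight through unchanged.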

IMO the best way is to create your own models with the very same skeleton that you used for mocap, so you won’t have those two issues.

The very talented people over at Wolfire Games have a system that not only lets you create and bind a skeleton to a model in real time, but also lets you apply BVH animations to it, doing all the translation/adjustment internally.

How did they do that…?

Hehehe, that’s where I got the idea to try it, actually. I’m a fan of this blog :smiley:

But… this BVH retargeting is good when you know what the target skeleton is going to look like. For general-purpose retargeting it’s a bit difficult, because users could want to use it with any skeleton.

So we discussed it with the team and agreed it was more of a game-specific feature than a game engine feature… so I dropped it.

Yeah. A BVH importer / SDK plugin could be nice :D. Added it to my TODO :).

I started this a long time ago, and I have some code to load a BVH skeleton if you’re interested.

I gave up because I failed to properly retarget a loaded animation onto an already loaded skeleton (Sinbad, for example).

Actually, retargeting animation is a whole area of study, and it’s quite complicated (it involves a lot of tricky math).

Wow! Very nice! I plan to create a Kinect mocap plugin for the SDK that will allow the user to generate BVH files. I know of software out there that does that, but it’s Windows-only, and I didn’t manage to install it on my Windows machine -.-. These generated BVH files will then let users animate their models without assigning keyframes by hand, and without the need for FK or IK bones. Loading a BVH file directly in jME could be nice too. So yes, I accept your code :). But I’ll only work on this after the Game Contest :D. I’ll PM you my email :). Thanks.

mail sent :wink:
