I spent many hours trying to get BVH motion capture data loaded onto a custom character model and rig in Blender, and then into JME, and I failed, miserably =) So I'm making this post to share some of my experiences and the resources I found useful, in the hope it saves others some research time, and hopefully someone will (or already has) figured this out and can tell me where I went wrong.
BVH Data
tl;dr : use cgspeed BVH data.
I found loads of free BVH data around the web; cgspeed had the best collection: 2548 motions converted from the Carnegie-Mellon Graphics Lab Motion Capture Database. This appeared to be the most consistent data in terms of naming and armature layout. Sets 1-8 worked best for me; somewhere after that they switch to a different armature layout that I struggled to get working.
TrueBones have a pack of 500 free BVH files; fill in their contact form and they will send you a link. It took a day for the link to arrive, by which point I had stopped working on this, so I can't say whether the data is any good.
I liked the look of the three BVH data packs from http://accad.osu.edu/ ; the data was consistent and clean, but there were some odd 90° rotations applied to the root poses that I couldn't be bothered figuring out how to work around.
Retargeting in Blender
tl;dr : The Blender Motion Capture Tools addon kind of works; the MakeWalk plugin did work, but the JME-friendly export file size was way too large.
I was using Blender 2.68.
Blender Motion Capture Tools
When you load BVH data into Blender, it comes in as a new armature, so you need to map that armature onto your custom character's armature so that when the motion-captured arm moves, your character's arm moves; this process is called retargeting. A chap named Benjy Cook wrote the Blender Motion Capture Tools addon, which is used for retargeting; he wrote a manual and made a useful series of tutorial videos, starting with Blender Motion Capture Addon - Tutorial 1 - Basic Retargeting .
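For anyone curious what that import step looks like when scripted, here is a minimal Python sketch using Blender's bundled BVH importer (the file path is just a placeholder):

```python
import bpy

# Import a BVH file with Blender's bundled importer
# (File > Import > Motion Capture (.bvh)); the path is a placeholder.
bpy.ops.import_anim.bvh(filepath="/path/to/02_01.bvh")

# The importer creates a brand new armature object named after the file,
# completely separate from your character's rig - which is why you then
# have to retarget its motion onto your own armature.
mocap_rig = bpy.context.object
print(mocap_rig.name, mocap_rig.type)  # e.g. "02_01", "ARMATURE"
```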
I had limited success with this. After a lot of screwing around with the very short and stubby character rig I was using, I switched to a very standard humanoid rig, which I had more success with. I found that using an armature that does not match the proportions of a normal adult human made the process more complicated (even though in theory this shouldn't really matter). I abandoned this method after I could not get around a "bone twisting" issue; the fix Benjy talks about didn't work for me, and my legs ended up twisted all the way around, facing the wrong way.
MakeWalk Plugin
Next I tried MakeWalk - three-click mocap retargeting, which is designed to work with MakeHuman, but there is also a standalone Blender plugin (at the time of writing, v0932 is the most recent, listed halfway down the page somewhere).
After loads of fiddling I did get this working. I needed my armature to be based on the Blender Human Meta-Rig, which I then Rigified (Rigify is basically a Blender addon that adds loads of useful biped rig controls).
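If you'd rather script that setup than click through the menus, this rough sketch is roughly what I did by hand (assuming the Rigify addon is enabled; the operator names are from memory and may differ between Blender versions):

```python
import bpy

# Add the Human Meta-Rig (Add > Armature > Human (Meta-Rig)).
# Requires the Rigify addon to be enabled in User Preferences.
bpy.ops.object.armature_human_metarig_add()
metarig = bpy.context.object

# At this point you would fit the meta-rig bones to your mesh in Edit Mode.
# Then generate the full Rigify control rig from the meta-rig:
bpy.ops.pose.rigify_generate()

# The generated armature (usually just named "rig") is the retarget target.
```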
In the MakeWalk settings I turned off automatic detection for both source and target: the target I set to 'Rigify', the source to 'ACCAD', then used motion 06_07 from the cgspeed BVH files. After retargeting, I converted my keyframe animation to an NLA strip, exported my mesh and Rigify armature with the Ogre3D exporter, moved the Mesh.mesh.xml and Mesh.skeleton.xml to my assets folder in JME, and loaded the model and animation in. In JME, my model was all wonky, as if the location and/or rotation were not cleared before exporting, or the Z rotation was off by 90°; you know, the standard issues. I also didn't use a root bone, so it may have been that. I imagine this should be easily fixable; I've run into these kinds of issues many times.
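The keyframes-to-NLA-strip step can also be done from Python; here is a minimal sketch, assuming the retargeted action is still the rig's active action and that the generated armature is named "rig" (which is just what Rigify called mine):

```python
import bpy

rig = bpy.data.objects["rig"]         # the generated Rigify armature
action = rig.animation_data.action    # the retargeted keyframe animation

# Push the action down into an NLA track/strip so the exporter sees it
# as a named animation rather than a loose action.
track = rig.animation_data.nla_tracks.new()
track.name = "Walk"
track.strips.new(action.name, int(action.frame_range[0]), action)

# Clear the active action so only the NLA strip drives the rig.
rig.animation_data.action = None
```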
At this point I abandoned this method, since the skeleton file was 49.5MB for only a few seconds of simple animation. This may have been because it exported all the additional Rigify bones, or because with this method there are no sparse keyframes: every frame is a keyframe for every bone. I didn't feel like figuring out exactly why.
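If the "keyframe on every frame" theory is right, one possible (untested) workaround would be to thin the keyframes out on each F-Curve before the NLA push-down and export, something along these lines:

```python
import bpy

rig = bpy.data.objects["rig"]          # the retargeted armature
action = rig.animation_data.action     # still the active action at this point
KEEP_EVERY = 4                         # keep one keyframe in four; tune to taste

for fcurve in action.fcurves:
    points = fcurve.keyframe_points
    last = len(points) - 1
    # Walk backwards so earlier indices stay valid as points are removed;
    # always keep the first and last keyframe of each curve.
    for i in range(last, -1, -1):
        if i % KEEP_EVERY != 0 and i not in (0, last):
            points.remove(points[i])
```

Blender's Graph Editor also has a Clean Keyframes operator that removes redundant keys based on a threshold, which might do a smarter job of the same thing.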
Finally
For me, motion capture data sounded like an easy way to get loads of high-quality free animation for my models, but unfortunately, at this point in time (and with my lack of Blender skills), that is not the case. While I did kind of get this working, the final outcome was just not game-dev suitable. For now I will be going back to hand-done keyframe animation.

Here is a picture of a monkey:
Cheers
James