Is this asset pipeline plan good enough for JME?

Hello, monkeys, I’m relatively new among you and need a bit of guidance.

I have a plan for an asset pipeline for my current project, but I’m not sure how compatible it is with JME, and that’s where I need your help.

The main idea is to store animation data independently from the “core” of a 3D model, with the possibility of loading and using animation data only when necessary. Another use case is to have a single set of animations for models with sufficiently similar skeletons - made once and reused automatically, without having to apply existing animations to each new model in a 3D editor (and I will have a lot of similar model skeletons in my current project).

So, my planned pipeline looks like this:

  • mesh modeling, texturing, and skeleton creation (any 3D and 2D tools; in my case Blender and Photoshop); the result is the “core model”
  • animating: the core model is copied, augmented with IK controls, animated, and then baked with the IK controls removed (possibly one file per animation or per logical group of animations); the result is two files per animation group, one with IK controls and one baked
  • saving and exporting to a compatible format (in my case I plan to just use .blend files, so this step is trivial)
  • importing into JME: the core model, textures, and baked animation files are imported into JME and converted; the result is files in JME’s native format (in the simple variant, the mesh and animations could be merged at this point)
  • usage: in the simple variant it’s just the normal way of dealing with assets; in the more complex variant, the mesh and animations have to be merged in code before they can be used

Note: only the skeleton animation data is loaded and converted from the “animation files”; the mesh, textures, and weight painting are discarded, since we already have them in the “core model”.
Another note: the non-baked copy of the animated model, with IK controls, is kept in case an animation needs to be altered without recreating it from scratch.
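Since the more complex variant means merging the mesh and animations in code, here is a minimal, self-contained sketch of that merge step. Plain maps stand in for the engine’s animation objects so the example stays runnable on its own; in jME3 the equivalent would be iterating the source `AnimControl`’s animation names and calling `addAnim()` on the core model’s `AnimControl`. The class and method names here are hypothetical, not part of any API.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for the "merge in code" step: animations from a baked
// animation file are copied into the core model's animation set by name.
// Plain maps replace the engine objects; the merge logic is the point.
public class AnimationMerge {

    // Copies every animation from 'animFile' into a copy of 'coreModel',
    // refusing to silently overwrite an animation the core model already has.
    static Map<String, Object> merge(Map<String, Object> coreModel,
                                     Map<String, Object> animFile) {
        Map<String, Object> merged = new HashMap<>(coreModel);
        for (Map.Entry<String, Object> e : animFile.entrySet()) {
            if (merged.containsKey(e.getKey())) {
                throw new IllegalArgumentException(
                    "Duplicate animation name: " + e.getKey());
            }
            merged.put(e.getKey(), e.getValue());
        }
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Object> core = new HashMap<>();
        core.put("idle", "idle-track-data");

        Map<String, Object> walkGroup = new HashMap<>();
        walkGroup.put("walk", "walk-track-data");
        walkGroup.put("run", "run-track-data");

        Map<String, Object> merged = merge(core, walkGroup);
        System.out.println(merged.size()); // prints 3
    }
}
```

The duplicate-name check matters in practice: if two animation groups both export a “walk” clip, failing loudly at import time is much easier to debug than one clip silently shadowing the other at runtime.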

And now it’s finally time for real questions:

  1. Is JME already capable of dealing with separated animations? For instance, can it extract only the animations from a .blend file and discard everything else? Just to be sure I’m not reinventing the wheel, again.
  2. Has anyone tried a similar pipeline?
  3. Do you have any ideas or warnings about the path I’m about to choose?

My pipeline:
model > texture > animation (+IK etc) > Ogre export > JME SDK import = done
Why do you need separate animations?


Right now I just use the .blend importer with baked animations, and it’s enough for my needs, but that can’t last forever.

I have many reasons to have separate animations, but here are the main ones:

  • Skin system: I plan to have many skins for my characters, almost all sharing the same animations, so it’s natural to share the animations instead of storing them in every skin model.
  • Clothing system: I plan to display equipment on characters, and in some cases it should share the skeleton and animations with the character model.
  • Cinematic system: I plan to use hi-poly models with detailed animations for cinematic scenes, and it’s terrible to have to load all animations with one model when you need only one or two. And having five sets of identical hi-poly models for five different cinematic scenes is not much better.
  • Player-created content: I plan to give players the possibility to create their own skins for characters and items, and having animation-less “core models” makes this task much simpler.

Just to note: not all of this will be implemented at once, but I prefer to have an asset pipeline ready for growing needs instead of reassembling it every time I’m ready for a new feature.


(I know a few improvements would be nice, but meanwhile:)
If skin = texture, you can just replace the material in code with the current system.
Equipment might be possible to do via attachment nodes.

Animation retargeting isn’t an easy topic. The animations will only work on multiple models if their skeletons are exactly the same. You can retarget certain animations in Blender, but then you’d have to export them with each model again anyway. Maybe it’s wise to first check whether what you want to do is really necessary for your application. It might be easier to keep the retargeting in the Blender part of the “pipeline”.
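A cheap way to enforce the “exactly the same skeleton” constraint before sharing animations is to compare the bone lists of the two models up front. Below is a minimal sketch under the assumption that bone names and parent indices are available as plain lists (in jME3 you would read them from each model’s `Skeleton`); matching bind poses would also be required for a clean transfer, which this check deliberately omits.

```java
import java.util.Arrays;
import java.util.List;

// Pre-flight check before reusing one model's animations on another:
// identical bone names, same order, and an identical parent hierarchy.
public class SkeletonCheck {

    static boolean sameSkeleton(List<String> namesA, int[] parentsA,
                                List<String> namesB, int[] parentsB) {
        return namesA.equals(namesB) && Arrays.equals(parentsA, parentsB);
    }

    public static void main(String[] args) {
        List<String> hero  = Arrays.asList("root", "spine", "head");
        List<String> skin  = Arrays.asList("root", "spine", "head");
        List<String> beast = Arrays.asList("root", "spine", "tail");
        int[] parents = {-1, 0, 1}; // root has no parent

        System.out.println(sameSkeleton(hero, parents, skin, parents));  // prints true
        System.out.println(sameSkeleton(hero, parents, beast, parents)); // prints false
    }
}
```

Running this check once at asset-import time turns a silent mismatch (animations that deform the wrong bones) into an explicit failure long before anything reaches the screen.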


A skin may mean anything from a mere texture to a completely new model on the same skeleton, but in my case it will usually be a new model with a new texture, while keeping all the animations the same.

Regarding equipment: yes, attachment nodes are enough for simple items like helmets or weapons, and I’ve already started with them, but things suddenly get more complicated when it comes to body armour on animated characters. And my game is not an MMO or an RTS, or even a very dynamic game, so there is no excuse for “cheap” tricks like just drawing plate armour on the texture, adding shoulders and boots, and implying that it’s a completely new “OMFGSUPERDUPERARMOURSET”.

That’s the thing: my plan IS to have exactly the same skeleton, at least for all variations/skins of every single character, and, where possible, sufficiently similar skeletons for similar characters - there is no point in having a different skeleton for every human character in a game.

I already mentioned my reasons for having separated animations: skins, clothing, hi-poly cinematics, and player-created content.

Right now I don’t have any of these “complicated” things ready, so I don’t need separated animations yet either; but once I’m done with the game logic, multiplayer, all the main characters, and the storyline, it will be time for visuals, and I’d like to have a solid pipeline by then.

The only thing I agree to drop from my list is the hi-poly cinematics - I already have ideas for how to replace them with less effort, and maybe with even better visual quality.

And just to clarify things a bit: I’m ready to rewrite the animation and skeleton controllers, and any other part of the core necessary to achieve what I need.

I would like to support this kind of pipeline with xbuf. I have already started to study how to bind/retarget animations (e.g. from BVH).

xbuf is an open-source format (WIP) and a set of tools (also WIP).

So maybe some kind of collaboration would be possible?


Sounds good, but don’t expect me to jump on the train right now - there are more important things to be done first. I started this thread long before I’ll really need anything from it, so I’d have enough time to gather information and discuss things.

Well, then you are especially lucky because xbuf is already getting pretty good and will only continue to get better while you wait.

It’s definitely what you should check out first when you are ready to look for real.

You know, this thread should be renamed to “animation re-targeting”; it’s a very nice feature, and I am glad xbuf will support it.
I made a solution for this a long time ago, but my solution required the models to have the same number of vertices.
@david_bernard_31 , will the xbuf solution support completely different models (different mesh, different number of vertices)?

I would like it to, but today I can’t predict the future. I am only starting to explore re-targeting (same skeleton, similar skeleton, …).

Discussion is a way to jump on the train, or at least to help lay the rails.

By jumping on the train I mean starting to work on code. I’d like to be helpful, but right now all my “coding power” is consumed by other tasks, like decoupling the game logic from the display and writing the game logic itself. Meanwhile I will research this topic more deeply and stay here, ready for discussion.

You should look into this then GitHub - Nehon/bvhretarget: test project for bvh loading and retargeting to arbitrary skeleton

Thanks @nehon, I already looked at it (I did some searching on the forum too :stuck_out_tongue_winking_eye: ). That could be a solution for @prog. For xbuf, I don’t know yet how I would like to support re-targeting (without being too dependent on the engine). E.g. xbuf tries to enforce “convention over configuration”, so defining a “pivot/normalized” skeleton, plus tools that convert bone names, etc., to the pivot skeleton, could be a solution (for humanoids).
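To illustrate the “pivot/normalized skeleton” idea: vendor-specific bone names are mapped onto one normalized naming scheme, so animations authored against the pivot skeleton can be retargeted by name. This is a sketch of the concept only; the alias table below (Mixamo/Biped-style prefixes) is an illustrative guess, not an actual xbuf specification, and `PivotSkeleton` is a hypothetical name.

```java
import java.util.HashMap;
import java.util.Map;

// Concept sketch: normalize source bone names to a shared "pivot" skeleton,
// so a name-based retargeting step can match bones across exporters.
public class PivotSkeleton {

    private static final Map<String, String> ALIASES = new HashMap<>();
    static {
        ALIASES.put("mixamorig:Hips", "hips");
        ALIASES.put("Bip01_Pelvis",   "hips");
        ALIASES.put("mixamorig:Head", "head");
        ALIASES.put("Bip01_Head",     "head");
    }

    // Returns the pivot-skeleton name for a source bone, or null if the
    // bone has no known equivalent and must be mapped by hand.
    static String toPivot(String sourceBone) {
        return ALIASES.get(sourceBone);
    }

    public static void main(String[] args) {
        System.out.println(toPivot("mixamorig:Hips")); // prints hips
        System.out.println(toPivot("Tail_03"));        // prints null
    }
}
```

The null case is the interesting one: any bone the convention cannot name (tails, capes, extra fingers) falls back to manual configuration, which is exactly the “convention over configuration” trade-off.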

The UE4 docs on re-targeting are interesting as well (I learnt UE4 only to play with it), and the way Mixamo defines key points of the skeleton is also very interesting.

Not to forget the papers/videos about procedural animation (IK, …); see the aigamedev YouTube channel.

Should it be possible to do this in code with the current engine?
I mean, you have the objects and controls in the spatial; isn’t it possible to take the internal animation control and such and copy it into another spatial? I mean, copy the bones and so on, associate them with the new vertices, etc. If not, it would be a good addition to the engine as well, and I would guess it’s not that hard to implement…
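One concrete obstacle in “just copy the animation control into another spatial” is that bone tracks are stored per bone *index*, and two models with the same bone names can order them differently. Remapping indices by name is the minimal fix. The sketch below is self-contained and hypothetical; in jME3 the name lists would come from the two `Skeleton` instances, and each track’s source index from `BoneTrack.getTargetBoneIndex()`.

```java
import java.util.Arrays;
import java.util.List;

// Build an index remap so tracks recorded against one skeleton's bone order
// can drive a second skeleton that lists the same bones in a different order.
public class BoneRemap {

    // For each source bone index, find the index of the same-named bone in
    // the target skeleton (-1 if the target has no such bone).
    static int[] buildRemap(List<String> sourceBones, List<String> targetBones) {
        int[] remap = new int[sourceBones.size()];
        for (int i = 0; i < sourceBones.size(); i++) {
            remap[i] = targetBones.indexOf(sourceBones.get(i));
        }
        return remap;
    }

    public static void main(String[] args) {
        List<String> source = Arrays.asList("root", "spine", "head");
        List<String> target = Arrays.asList("root", "head", "spine", "tail");
        System.out.println(Arrays.toString(buildRemap(source, target)));
        // prints [0, 2, 1]
    }
}
```

A -1 entry means the source animates a bone the target simply doesn’t have; the copying code then has to decide whether to drop that track or refuse the whole transfer.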

…my pipeline is like this… 3DSMAX->FBX->BLENDER->JME3 …with tons of cursing added in between :slight_smile:
…also, for folks who use the same pipeline as I do, read this, because it will save you from the nervous breakdown I have had… :smile: