Blender import in JME: best practices?

Hello,

I'm currently wondering what the best practices are for importing something (a scene, animation, object, bone…) made in Blender into JME.



I've been reading some posts about Collada vs MD5 vs deprecated-blender-XML-import. I've also seen attempts (more or less finalized) to export from Blender using Python.



I've got two questions:



-1- Are there any exhaustive, up-to-date best practices available somewhere (wiki?)?



-2- Would it be feasible to read .blend files directly in Java and interpret them into the JME world?



Thanks for your help.



Niggle

Niggle said:

-1- Are there any exhaustive, up-to-date best practices available somewhere (wiki?)?

-2- Would it be feasible to read .blend files directly in Java and interpret them into the JME world?


-1- Given the experience we are accumulating here, I think it will soon be the right time to write a sort of guide about preparing models and exporting them from Blender. While you wait for it, there is a tutorial in the jME wiki, written by mud2005, that summarizes the steps to export from Blender to MD5 Reader 2 or kman's MD5 Loader.

-2- *.blend files are saved in a binary format. There is no technical obstacle to reading them in Java. However, the Blender format is really complex, and most of its features are not useful for a game format. So I think it would be a waste of time and disk space to adopt the Blender format as the standard game model format.

I've been studying the .blend file format this week, and yes, it is binary and somewhat complex (it takes some reverse engineering, even with the C source code at hand), but not impossible.
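For example, the fixed 12-byte file header can be parsed with something like the sketch below. This is only my own illustration; the layout (the "BLENDER" magic, a pointer-size character, an endianness character and a three-character version) is the one documented in Blender's sources.

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Rough sketch: read the 12-byte header of a .blend file.
// Layout: "BLENDER", pointer size ('_' = 32-bit, '-' = 64-bit),
// endianness ('v' = little-endian, 'V' = big-endian), 3-char version.
public class BlendHeader {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        byte[] header = new byte[12];
        in.readFully(header);
        in.close();

        String magic = new String(header, 0, 7, "US-ASCII");
        if (!"BLENDER".equals(magic)) {
            throw new IOException("Not a .blend file: " + magic);
        }
        int pointerSize = (header[7] == '_') ? 4 : 8;
        boolean littleEndian = (header[8] == 'v');
        String version = new String(header, 9, 3, "US-ASCII");

        System.out.println("pointer size: " + pointerSize
                + " bytes, little-endian: " + littleEndian
                + ", version: " + version);
    }
}
```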



I'm currently working out the object hierarchy to use for JME:

  • Scene
        - World
        - Objects
              - Meshes
                  …
        - Cameras
        - Lamps
        …



    and I should begin instantiating JME objects next week.
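On the jME side, that hierarchy could start out as simply as the sketch below (assuming jME's com.jme.scene.Node API; the node names are only illustrative):

```java
import com.jme.scene.Node;

// Rough sketch: an empty jME scene graph mirroring the Blender hierarchy above.
public class BlendSceneSkeleton {
    public static Node buildEmptyHierarchy() {
        Node scene = new Node("Scene");
        Node objects = new Node("Objects");
        Node cameras = new Node("Cameras");
        Node lamps = new Node("Lamps");

        scene.attachChild(objects);
        scene.attachChild(cameras);
        scene.attachChild(lamps);

        // Meshes read from the .blend file would later be converted to
        // TriMesh instances and attached under the "Objects" node.
        return scene;
    }
}
```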


Nothing is impossible.



The point is that some features of *.blend files make it not the best possible format for a game:


  1. complexity
  2. the huge size that *.blend files can reach
  3. a lot of superfluous information for a game engine

          3.1) information not supported by jME (like n-gons, b-bones, several modifiers, bone constraints, Python support, game engine data, …)

IMO you shouldn't do this unless you have a very good reason.


  • You can easily write an exporter for Blender in Python and an importer for jME.

  • By making it load only .blend files, you restrict your application by requiring the models to be created in Blender3D.

  • As Ender pointed out, the file will contain a lot of data that is only useful to Blender itself.

  • The importer will end up with many unsupported features, so there will be bugs when you try to load a model that the importer was not made to load. (See the Collada format as an example of this issue.)

Ender>

1/ I don't intend to use this format to hold game data, just to import the data and then save it in a more JME-friendly format (native JME…); a small sketch of this follows after point 2/.

2/ superfluous information that can't be used is simply skipped
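For point 1/, a minimal sketch of what I have in mind, assuming jME 2.0's binary export API (com.jme.util.export.binary.BinaryExporter):

```java
import java.io.File;
import java.io.IOException;

import com.jme.scene.Spatial;
import com.jme.util.export.binary.BinaryExporter;

// Rough sketch: import from .blend once, then keep the result in jME's native binary format.
public class SaveAsNativeJme {
    public static void save(Spatial importedFromBlend, File target) throws IOException {
        BinaryExporter.getInstance().save(importedFromBlend, target);
    }
}
```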



Momoko_Fan> I think I have :slight_smile:

1/ I don't know Python, and I want to learn JME

2/ no problem with that; I think Blender3D is a sufficiently good tool for my needs (creating models + materials)

3/ I know, but the .blend format is not as hard as it seems at first sight, and the data blocks are clearly identified: SC (scene), OB (object), ME (mesh)… (a block-walking sketch follows after point 4/)

4/ I would be interested if you could detail this point a bit more. I guess I'm not trying to bring a .blend file completely into JME, just the parts I need; documentation may help in this case. By the way, if one day I feel like importing more features, it would be feasible.
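To illustrate point 3/: after the 12-byte header, a .blend file is a sequence of file blocks, each preceded by a small block header (a 4-character code such as SC, OB, ME or ENDB, a 4-byte size, a pointer-sized old memory address, a 4-byte SDNA index and a 4-byte count). A rough sketch of walking those blocks, assuming a little-endian file with 4-byte pointers:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.channels.FileChannel;

// Rough sketch: list the file-block codes of a .blend file
// (assumes little-endian data and 4-byte pointers; see the file header).
public class BlendBlockWalker {
    public static void main(String[] args) throws IOException {
        FileChannel channel = new FileInputStream(args[0]).getChannel();
        ByteBuffer buf = ByteBuffer.allocate((int) channel.size()).order(ByteOrder.LITTLE_ENDIAN);
        while (buf.hasRemaining()) {
            channel.read(buf);
        }
        channel.close();
        buf.flip();
        buf.position(12); // skip the "BLENDER..." file header

        while (buf.remaining() >= 20) {
            byte[] codeBytes = new byte[4];
            buf.get(codeBytes);
            String code = new String(codeBytes, "US-ASCII").trim(); // "SC", "OB", "ME", "DNA1", "ENDB", ...
            int size = buf.getInt();      // bytes of data following this header
            buf.getInt();                 // old memory address (4-byte pointer assumed)
            int sdnaIndex = buf.getInt(); // index into the DNA structure catalogue
            int count = buf.getInt();     // number of structures stored in the block

            System.out.println(code + "  size=" + size + "  sdna=" + sdnaIndex + "  count=" + count);
            if ("ENDB".equals(code)) {
                break;
            }
            buf.position(buf.position() + size); // skip the block payload
        }
    }
}
```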

hey there,



I have tried several format/3D software combinations, and the only thing I didn't have problems with was Wavefront .OBJ out of 3D Studio Max. Some packages don't produce a material library for the .OBJ file, and so on, and the only import into 3D Studio Max that always worked was VRML 1/2.



So: whatever software you use, if it exports clean VRML, load it into 3DS Max (if you have it) and export it from there to Wavefront .OBJ; for static objects (scene elements), that is.



hope I helped,

Andi.

You guys should have a look at the new XML IO system. http://www.jmonkeyengine.com/jmeforum/index.php?topic=7034

I am currently working on a Mesh.XML (Ogre3D) importer for use in Radakan, since jME lacks support for importing models with animations…

Because the Ogre3D exporters for modeling tools are well developed, this will provide a solid art path for jME.

:slight_smile: I may have a look at this new XML IO system.



By the way, I've successfully imported a PointLight from a .blend file into JME :smiley: However, I'm struggling to match the JME Light attributes with the numerous Blender Lamp ones.



1/ For example, in Blender quadratic attenuation is controlled by two values, Quad1 and Quad2.



Does someone have good, up-to-date information about the JME attenuation attributes (the constant, linear and quadratic values)? I've found one page, but it is quite obscure (maybe the wiki page is out of date?).



2/ In JME I can define ambient, diffuse and specular colors for a Light. In Blender I can currently find only an ambient color. Am I missing something?
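For reference, here is roughly what I am trying on the jME side for both questions above (a sketch assuming the com.jme.light API; the attenuation values are placeholders, not a worked-out mapping from Blender's Quad1/Quad2). As far as I can tell, jME follows the OpenGL model: attenuation = 1 / (constant + linear * d + quadratic * d^2), where d is the distance to the light.

```java
import com.jme.light.PointLight;
import com.jme.math.Vector3f;
import com.jme.renderer.ColorRGBA;

// Rough sketch: a jME PointLight with explicit attenuation and color terms.
public class LampToPointLight {
    public static PointLight makeLight() {
        PointLight light = new PointLight();
        light.setLocation(new Vector3f(0f, 5f, 0f));

        // distance attenuation: 1 / (constant + linear * d + quadratic * d^2)
        light.setAttenuate(true);
        light.setConstant(1.0f);
        light.setLinear(0.0f);
        light.setQuadratic(0.02f); // placeholder value, not derived from Quad1/Quad2

        // the three color terms jME exposes; Blender only gives one lamp color,
        // so diffuse and specular would both start from it
        light.setAmbient(new ColorRGBA(0.1f, 0.1f, 0.1f, 1f));
        light.setDiffuse(new ColorRGBA(1f, 1f, 1f, 1f));
        light.setSpecular(new ColorRGBA(1f, 1f, 1f, 1f));

        light.setEnabled(true);
        return light;
    }
}
```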



3/ I've found this page on Blender lighting, http://wiki.blender.org/index.php/Manual/Lamp, very helpful; is there something similar for JME?



Thanks for your answers.


Oops, sorry, I've been lazy this time, or forgot about open source!!!



Since Blender uses OpenGL, as LWJGL does, it is easy:


Ender said:


-1- Given the experience we are accumulating here, I think it will soon be the right time to write a sort of guide about preparing models and exporting them from Blender. While you wait for it, there is a tutorial in the jME wiki, written by mud2005, that summarizes the steps to export from Blender to MD5 Reader 2 or kman's MD5 Loader.
...


Hey Ender.

Any ETA on your upcoming guide... days or months?  I need to dive into this, but I'm really wary of following an old tutorial, considering all the posts I see about updates to the jME importers/exporters.

I hope I'm wrong, but the /wiki/doku.php?id=exporting_animated_.md5_models_from_blender tutorial looks like it's transferring keyframe animation instead of skeletal (I have read the tutorial, but don't want to spend time executing it if the goal is keyframe animation).

I had to write my own skeletal animation system for Java3D, where I used my own graphical tool to add joint vertexes to .obj models, and used my own algorithms to generate seams to join up the skin from the moving body parts.  I was mainly attracted to jME because it looked like the skeletal animation system can achieve the same goals by using information supplied by the modelling tool.  Am I going to be able to accomplish this with the tool chain of Blender -> md5 -> com.jme.animation?

MD5 doesn't support vertex animation, only skeletal. Currently jME's animation system (com.jme.animation) has a few issues, so most MD5 loaders for jME use their own classes to animate the models, which means the models won't be compatible with regular jME applications that don't use those libraries. Either way, if you want keyframe/vertex animation you need to use MD2/MD3 or another format; for skeletal animation there are MD5 and a few others.

blaine, there are multiple MD5 loaders for jME available, and all of them support keyframed skeletal animation (though none does so by using jME's built-in skeletal animation code).

Maybe you are confusing "keyframed" with "vertex morphing"? At least that's what your last post sounds like to me.

hevee said:

blaine, there are multiple MD5 loaders for jME available, and all of them support keyframed skeletal animation (though none does so by using jME's built-in skeletal animation code).
Maybe you are confusing "keyframed" with "vertex morphing"? At least that's what your last post sounds like to me.


Well, in the years I've been doing this there has been no consistent name for my goal here... what I have done in Java3D, with great difficulty and a lingering issue, is called articulated animation by Andrew Davison and skeletal animation by some others.  It seems that I incorrectly assumed that this was the goal of the com.jme.animation system (and if I was right about that, then I understandably assumed this is what the importers accommodate) :(.

Unless there is some 3D keyframe tactic of which I'm unaware (that would be great!), for keyframe animation I would need to export all the geometry for each key frame (and the runtime system would interpolate).  That doesn't scale well enough for my needs.  Besides the disk space and memory for thousands of key frames, the characters I'm animating are dynamically morphed, so I don't even have the geometry at build time.  Performance would not be acceptable if I had to regenerate hundreds of key frames at run time.
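Just to make the scaling problem concrete: every keyframe would carry a full snapshot of the vertex positions, and the runtime would blend two snapshots, roughly like this (a plain-Java illustration, not loader code):

```java
// Rough sketch of vertex keyframe interpolation: positions are packed x, y, z
// and each frame is a full copy of the mesh geometry.
public class VertexKeyframeBlend {
    /** result[i] = (1 - t) * frameA[i] + t * frameB[i] for t in [0, 1]. */
    public static float[] blend(float[] frameA, float[] frameB, float t) {
        float[] result = new float[frameA.length];
        for (int i = 0; i < frameA.length; i++) {
            result[i] = (1f - t) * frameA[i] + t * frameB[i];
        }
        return result;
    }
}
```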

I know what vertex morphing is, and I use it to generate continuous modifications of the characters that I animate.  But whereas my articulated animation system handles morphed characters, it does not use morphing to perform the animation.  It uses forward kinematics to programmatically perform very efficient and scalable skeletal animation.  Is this the goal of com.jme.animation which the importers do not support?  If so, I could set to work importing the joint vertexes to get it working.

@blaine

About the Blender-to-MD5 tutorial: I think I will do it soon (in about 2 months).



About skeletal vs. keyframe animation

I think you are using the wrong words. To be as clear as possible: the MD5 format is based on skeletal animation, whichever of the three currently working importers you use. So the jME wiki tutorial I suggested you read is still a good one.

Skeleton animations are saved as frames that store the transformations (translation and rotation) of each bone (= joint) of the skeleton. You can call those frames "keyframes", because all the available MD5 importers use an interpolation algorithm to achieve smooth skeleton animation at varying FPS.

The vertices of the model are skinned to the skeleton and deformed according to the skeleton transformations and the vertex weights (= each vertex can be influenced by several bones, and each bone influences a vertex by a different percentage).
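In rough code terms, that skinning step (linear blend skinning) looks something like the sketch below; the data layout is only illustrative and is not taken from any particular MD5 importer:

```java
// Rough sketch of linear blend skinning for one vertex.
public class SkinningSketch {
    /**
     * position:     bind-pose vertex position (x, y, z).
     * boneMatrices: one 4x4 row-major matrix per bone, already combined with
     *               that bone's inverse bind matrix.
     * boneIndices:  the bones influencing this vertex.
     * weights:      their influence percentages, summing to 1.
     */
    public static float[] skinVertex(float[] position, float[][] boneMatrices,
                                     int[] boneIndices, float[] weights) {
        float[] result = new float[3];
        for (int i = 0; i < boneIndices.length; i++) {
            float[] m = boneMatrices[boneIndices[i]];
            // transform the bind-pose position by this bone's matrix
            float x = m[0] * position[0] + m[1] * position[1] + m[2]  * position[2] + m[3];
            float y = m[4] * position[0] + m[5] * position[1] + m[6]  * position[2] + m[7];
            float z = m[8] * position[0] + m[9] * position[1] + m[10] * position[2] + m[11];
            // accumulate, weighted by this bone's influence
            result[0] += weights[i] * x;
            result[1] += weights[i] * y;
            result[2] += weights[i] * z;
        }
        return result;
    }
}
```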

The MD5 format is, however, a software-computed format (= it uses the CPU). To use MD5 data with GPU-based skinning (= a vertex shader) you can still use the importers to load the models, but you have to convert the data into a form suitable for the GPU and create your own skinning and animation system.



About the jME bone system and MD5 importers

The MD5 importers do not use the jME bone system; most of them implement their own animation/skinning system. Currently neakor's importer is the only one that is planned to switch to the jME bone system: you should check the version based on the jME 2.0 alpha.

There are historical and technical reasons that led the developers to implement their own animation/skinning systems: 1) when ChaosDeathFish first developed his MD5 Reader, jME had no bone system, and the same is true for kman's MD5 Loader; 2) there are some problems between the jME bone system and the MD5 format that kept us from implementing the importers on top of the jME bone system (these problems are probably solved in jME 2.0).



About vertex-morphing-based formats

MD2 and MD3 use vertex morphing. MD5 does not support it.



Other solutions

Also try kman's Cal3D importer. Cal3D is a C library specialized in skeleton animation, and it has its own format. kman developed an importer and animator to handle the Cal3D format in jME.

kman's Cal3D Loader is quite old and will probably need some small changes to run on top of jME 1.0 or jME 2.0. But I am sure these changes are only little adjustments.

YES: http://oraclefun.blogspot.com/



And now, I've finally got a mesh loaded from a .blend file :slight_smile:



Thanks to llama (http://www.jmonkeyengine.com/jmeforum/index.php?topic=7890.0) and JME 2.0.


I had a long fight with the normals. I ended up not using the normals stored in the .blend file at all.



What I do is recompute the normals per vertex in every case!


  • When my model is not auto-smooth, I assign the face normal to each of its vertices (3 for a TriMesh or 4 for a QuadMesh).



    Note also that for each face I create distinct vertices; this means that if a vertex belongs to two faces, I'll create two vertices with two distinct normals. The GeometryTools seems able to reduce the number of vertices/normals of a TriMesh, but I only used it for a short time. And for QuadMesh => nothing :frowning:


  • When my model is auto-smooth, I recompute each vertex normal as the normalized sum of the normals of all the faces the vertex belongs to (I used a link posted on the forum by Ender); a small sketch of this is below.
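Here is a simplified sketch of that auto-smooth case for triangles, using jME's Vector3f (my own code is messier, but the idea is the same):

```java
import com.jme.math.Vector3f;

// Rough sketch: smooth per-vertex normals as the normalized sum of the
// normals of all triangles each vertex belongs to.
public class SmoothNormals {
    public static Vector3f[] compute(Vector3f[] positions, int[] triangleIndices) {
        Vector3f[] normals = new Vector3f[positions.length];
        for (int i = 0; i < normals.length; i++) {
            normals[i] = new Vector3f(0f, 0f, 0f);
        }
        for (int t = 0; t < triangleIndices.length; t += 3) {
            int a = triangleIndices[t];
            int b = triangleIndices[t + 1];
            int c = triangleIndices[t + 2];
            // face normal = (B - A) x (C - A)
            Vector3f faceNormal = positions[b].subtract(positions[a])
                    .cross(positions[c].subtract(positions[a]));
            normals[a].addLocal(faceNormal);
            normals[b].addLocal(faceNormal);
            normals[c].addLocal(faceNormal);
        }
        for (int i = 0; i < normals.length; i++) {
            normals[i].normalizeLocal(); // averaged (area-weighted) vertex normal
        }
        return normals;
    }
}
```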



    Finally, my models look good. And now, materials and textures…