Blender Ogre export - No bump map within jme3 scene

Hi,

when I create a model in Blender 2.64 with a diffuse UV texture and a normal map, I can export it very easily using the OgreXML 0.5.8 plugin (from jME3 RC2). It returns a .scene, mesh.xml, a material and the image files (the normal map and the UV diffuse map). That seems to be fine. But when I load the *.scene file in jME3 the normal maps are gone and I don't get any bump mapping. I read something about TangentBinormalGenerator.generate(…); but it does not work. Another piece of advice is to write my own *.j3m, but I wonder if there is an easier way to get working normal maps. I think it will not be so easy to write my own material definition?!

Now I wonder if someone can tell me the easiest solution to my problem.

BTW: When I create the "bumpy sphere" programmatically, bump mapping works fine…
Maybe there is a way to get the Material object from the loaded Spatial and add the NormalMap?

thanks
daniel

Not sure about why the normal maps aren’t being attached but concerning:

Maybe there is a way to get the Material object from the loaded Spatial and add the NormalMap?

Navigate to the geometry of the imported model, then do
[java]geometry.getMaterial();[/java]

Then you can set the normal map on that.
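As a sketch (assuming a standard SimpleApplication context; the geometry name and texture paths here are hypothetical), that could look like:

```java
// Hedged sketch, not tested against this exact scene: fetch the Geometry
// from the loaded model, then set a normal map on its existing material.
Spatial model = assetManager.loadModel("Models/ogre/first.scene");
Geometry geom = (Geometry) ((Node) model).getChild("myGeometry"); // name is hypothetical
Material mat = geom.getMaterial();
mat.setTexture("NormalMap", assetManager.loadTexture("Models/ogre/normalmap.png"));
TangentBinormalGenerator.generate(geom); // Lighting.j3md needs tangents for normal mapping
```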

Wow, that's a fast reply! =) Yes, I already tried to get the geometry, but how can I get it from a Spatial?

Geometry is a Spatial :>

In the Scene Composer you can see the structure of your model. Most probably there's a Node at the top, and maybe further Nodes; find the geometry and note its name. Then cast your model to a Node and do [java]Geometry g = (Geometry) node.getChild("geometryName");[/java] but there are other ways as well.

Geometry extends Spatial, and the getMaterial() method is implemented in Geometry but not in Spatial. I tried to downcast my Spatial but got a ClassCastException. I load my scene with this command:

assetManager.loadModel("Models/ogre/first.scene")

This returns a Spatial, and I cannot get the material because it has no getMaterial() method. Any other ideas?
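A hedged sketch of one way around the ClassCastException: a .scene usually loads as a Node, so instead of casting the root, walk the tree and grab the first Geometry (SceneGraphVisitorAdapter and depthFirstTraversal exist in jME3; the surrounding context is assumed):

```java
// Sketch: find the first Geometry anywhere below the loaded Spatial.
final Geometry[] found = new Geometry[1];
model.depthFirstTraversal(new SceneGraphVisitorAdapter() {
    @Override
    public void visit(Geometry geom) {
        if (found[0] == null) found[0] = geom; // keep the first one we see
    }
});
Material mat = found[0].getMaterial(); // now safe to tweak the material
```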

----EDIT:
Sorry, I only read your last post now! I'll give it a try. Thanks! =)

That works! But the normal map looks disgusting! It would be best if it worked directly from the Ogre export. If you have any other advice, please let me know. Maybe there is something wrong within Blender… I don't know. But thank you for your help! =)

Maybe someone knows a good example of how to load Blender objects that contain normal maps into jME3?! Or if someone has a working OgreXML export of a Blender scene, he could give it to me so I can test it on my own; then I'd know there is something wrong with my scene model. I have been trying for several weeks and never got it working.

https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:external:blender

I read this page again and again and tried creating and loading a scene from the very beginning (Blender -> create a new plane as a floor, then map it with a diffuse and a normal map as a UV map, then load it into jME3 via BlenderModelLoader, OgreXML, *.obj) -> It won't work.

Sometimes it works when I use the BlenderModelLoader to load the .blend file (with another Blender scene the model is sometimes shadeless black). But the BlenderModelLoader loads the textures of the scene very pixelated, and the normal map's mapping size (making the normal map smaller so it repeats and looks more detailed) of x: 15, y: 15, z: 15 reverts to x: 1, y: 1, z: 1. The normal map is stretched across the whole plane/floor. Importing an OgreXML export looks fine, but the normal map is completely missing, although I am doing everything exactly as explained here: https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:external:blender#creating_models

daniel

I usually export the diffuse/normal/specular maps as images from Blender. Then I import the mesh and create a .j3o file of the mesh in the SDK. Then I create a material in the jME SDK, load the image maps, and assign the material to the model. This way I can tweak the material in the SDK, because it will look different than in Blender.

I managed this just yesterday!
After long and unfruitful trials, it seems that the best thing to do is to use the Blender importer. Just import your .blend file into the jMonkey platform and it works…

However, please note that your normal map must be saved to an external file. In all my experiments, packed PNG files gave distorted normal maps within jME.

I think you can reach the same result if you use the OgreXML exporter, then build a new .j3m file, adding the flipped normal map manually. Again, you need the map saved as a picture on disk…
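A sketch of what such a .j3m might look like (the paths and values are made up for illustration; the Flip keyword flips the texture vertically):

```
Material MyMaterial : Common/MatDefs/Light/Lighting.j3md {
    MaterialParameters {
        DiffuseMap : Models/tunnel/diffuse.png
        NormalMap : Flip Models/tunnel/normal.png
        Shininess : 28.0
    }
}
```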

Yes, it works fine with the Blender importer! Thank you so much! =) But when I import it with the Blender importer, the textures are pixelated and not smooth. What can I do?

I made a screenshot of what I mean:

In the above example I just do this in jME3:

assetManager.registerLoader(BlenderModelLoader.class, "blend");
Spatial tunnel =  assetManager.loadModel("Models/tunnellowblend/tunnellow.blend");
TangentBinormalGenerator.generate(tunnel);
rootNode.attachChild(tunnel);

and the result is the pixelated first image. But when I do this:

assetManager.registerLoader(BlenderModelLoader.class, "blend");
Spatial tunnel =  assetManager.loadModel("Models/tunnellowblend/tunnellow.blend");
TangentBinormalGenerator.generate(tunnel);
Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setTexture("DiffuseMap", assetManager.loadTexture("Models/tunnellowblend/tunnellowdifuv.png"));
mat.setTexture("NormalMap", assetManager.loadTexture("Models/tunnellowblend/tunnellownormaluv2.png"));
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Specular", ColorRGBA.White);
mat.setColor("Diffuse", ColorRGBA.White);
mat.setFloat("Shininess", 28f);        
tunnel.setMaterial(mat);
rootNode.attachChild(tunnel);

…everything looks fine, as in the bottom picture. Is there a way to make the pixelated textures as smooth as in the bottom picture without creating and setting the material at runtime?
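One hedged guess: the blocky look is nearest-neighbour filtering on the imported textures. A sketch (the filter classes are real jME3 API; whether the Blender loader actually defaults to nearest filtering is an assumption) that keeps the imported material but switches every texture to smooth filtering:

```java
// Sketch: walk the loaded model and set bilinear/trilinear filtering
// on every texture parameter of every Geometry's material.
tunnel.depthFirstTraversal(new SceneGraphVisitorAdapter() {
    @Override
    public void visit(Geometry geom) {
        for (MatParam param : geom.getMaterial().getParams()) {
            if (param instanceof MatParamTexture) {
                Texture tex = ((MatParamTexture) param).getTextureValue();
                tex.setMagFilter(Texture.MagFilter.Bilinear);
                tex.setMinFilter(Texture.MinFilter.Trilinear);
            }
        }
    }
});
```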

Why can’t you create j3o and j3m files in the SDK and use those?

Yes, that is good when I have one smaller object with just one diffuse, specular and normal UV map. But what if I create a big level with several textures (because one UV map would be too large)? I think that could quickly become very complicated, wouldn't it? Please correct me if I'm wrong. I am very new to jME3 and game development.

Yes, that is good when I have one smaller object with just one diffuse, specular and normal UV map. But what if I create a big level with several textures (because one UV map would be too large)? I think that could quickly become very complicated, wouldn't it? Please correct me if I'm wrong. I am very new to jME3 and game development.
You should use maps/atlases for large scenes too, as much as possible. Think more along the lines of having the atlas first and then moving the texture coords so that the walls show the wall texture, the roads the road texture, etc. Otherwise you end up with many single objects, which drives down performance for no good reason.
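"Moving the texture coords" into an atlas tile is plain arithmetic; here is a minimal, self-contained sketch (the 2x2 atlas layout is an invented example):

```java
// Sketch: remap a mesh's local UVs into one tile of a texture atlas.
// Pure arithmetic; the tile indices and atlas size are hypothetical.
public class AtlasUv {
    /** Map a local UV in [0,1] into tile (col,row) of a cols x rows atlas. */
    static float[] toAtlas(float u, float v, int col, int row, int cols, int rows) {
        return new float[] { (col + u) / cols, (row + v) / rows };
    }

    public static void main(String[] args) {
        // Local corner (1,1) of tile (1,0) in a 2x2 atlas -> (1.0, 0.5)
        float[] uv = toAtlas(1f, 1f, 1, 0, 2, 2);
        System.out.println(uv[0] + " " + uv[1]);
    }
}
```

In jME3 you would write such remapped coordinates back into the mesh's TexCoord buffer, so one material (the atlas) covers many surfaces.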

Okay… do I understand it right:

  1. It is possible to use multiple UV maps/atlases on one big mesh. (So it is possible to have higher texture detail in the game.)
  2. The benefit is that I have fewer meshes in my scene, which increases performance.
  3. Loading one atlas (one image) is faster than loading many images. That increases performance too, and that's why I should put as many images as possible on one texture atlas (instead of single images) and map them with UV.
  4. Currently the best way is to convert my *.blend object into *.j3o and create a *.j3m material for every mesh (because loading a *.blend with BlenderModelLoader ends up with very pixelated textures). In this material I specify every UV map (diffuse, specular, normal) and its UV coords that are applied to this mesh.
  5. … In this *.j3m I can specify several UV maps (a few diffuse, specular and normal maps).

I hope my questions aren't too basic, but I really didn't find anything about this kind of basic knowledge. So thanks for your help!

  1. No, each texture means one object. If it's one object in Blender, it will be split.
  2. Yes.
  3. Yes, but primarily 2).
  4. It depends; in the end the added control through the .j3m is welcome anyway. You cannot be sure that material settings get through when going via multiple external formats.
  5. No, you just assign the diffuse, specular and normal map for this geometry using this material. Check out the Hello tutorials to get the jME terms and functions straight.

Okay, so jMonkey splits meshes with multiple textures into different meshes. You are right; I read it here (https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:intermediate:multi-media_asset_pipeline ):

Create 3D Models

Install a mesh editor such as Blender or 3D Studio MAX. Reuse textures and materials as much as possible. Consult the mesh editor’s documentation for specific details how to do the following tasks.

  1. Create 3D models in a mesh editor.
  2. Create simple low-polygon models. High-polygon models may look pretty in static 3D art contests, but they unnecessarily slow down dynamic games.
  3. Create materials for your models either in the 3D editor, or in the jME3 SDK. Only use Diffuse Map (minimum), Normal Map, Glow Map, and Specular Map.
  4. Every material feature not listed in the Materials Overview is unsupported and ignored by JME3.
  5. Unwrap the model in the 3D editor and generate a UV texture (i.e. one texture file that contains all the pieces of one model from different angles).
  6. Don’t use multiple separate texture files with one model, it will break the model into several meshes.
  7. Export the model mesh in one of the following formats: .blend, Wavefront .OBJ/.MTL, Ogre .mesh/.material/.scene.
  8. Bake each texture into one file when exporting. (Create a Texture Atlas.)
  9. Save exported models to subfolders of the assets/Textures directory, together with their Textures. (for now)

Point 6:
I have trouble understanding how I can create a big level with highly detailed textures, because it is not state of the art to create a 9000x9000 pixel UV map, for example. Or do I reuse parts of the UV map on my mesh, so that different parts of my mesh point to the same coordinates on my UV map?
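Reusing UV space is indeed the usual trick: let UV coordinates run past 1.0 and set the texture to repeat, instead of painting one huge map. A minimal pure-Java sketch of the coordinate side (in jME3 the texture additionally needs tex.setWrap(Texture.WrapMode.Repeat), which is a real API call, or the coords get clamped):

```java
// Sketch: tile a texture N times across a surface by scaling UVs past 1.0.
// With a repeating wrap mode, UV (15,15) samples the texture 15 times over.
public class TileUv {
    static float[] scaleUvs(float[] uvs, float repeat) {
        float[] out = new float[uvs.length];
        for (int i = 0; i < uvs.length; i++) out[i] = uvs[i] * repeat;
        return out;
    }

    public static void main(String[] args) {
        float[] quad = {0,0, 1,0, 1,1, 0,1};  // a quad's default UVs
        float[] tiled = scaleUvs(quad, 15f);  // repeat 15x, like the Blender mapping size
        System.out.println(tiled[4] + " " + tiled[5]);
    }
}
```

This is why a small, detailed tile texture can cover a huge floor: many mesh regions deliberately point at the same patch of the image.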