Creating assets in Blender3D

This section explains how to create and import models from Blender3D (2.62+, see the bottom of the page for Blender 2.49 and earlier) into jME3. Furthermore, it explains how you can create various typical game-related assets like normal maps of high-poly models and baked light maps.

Asset Management

For the managing of assets in general, be sure to read the Asset Pipeline Documentation. It contains vital information on how to manage your asset files.

Creating Models

Game-compatible models basically consist only of a mesh, UV-mapped textures and, in some cases, animations. All other material parameters or effects (like particles etc.) cannot be expected to transfer properly and probably would not translate well to live rendering anyway.

UV Mapped Textures

To successfully import a texture, the texture has to be UV-mapped to the model. Here's how to assign diffuse, normal and specular maps:

It's important to note that each texture used will create one separate geometry. So it's best to either combine the UV maps or use a premade atlas with the different texture types from the start and then map the UV coordinates of the model to the atlas instead of painting on the texture. This works best for large models like cities and space ships.
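
If an imported model still ends up with many geometries that share the same material, you can batch them after loading. This is a minimal sketch (the model path is a placeholder); GeometryBatchFactory comes from the jme3tools.optimize package.

    // GeometryBatchFactory is in the jme3tools.optimize package.
    Node city = (Node) assetManager.loadModel("Models/City/City.j3o"); // placeholder path
    // Merge geometries that share the same material into a single batch,
    // so an atlas-textured model ends up as one (or very few) geometries.
    GeometryBatchFactory.optimize(city);
    rootNode.attachChild(city);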

Animations

Animations for jME3 have to be bone animations. Simple object movement is supported by the Blender importer, but mesh deformation or other forms of animation are not supported.

To create an animation from scratch do the following:

  • Create the model
    • Make sure your model's location, rotation and scale are applied and zero / one (see "Model Checklist" below)
    • (Did you know? You can make any model from a box by only using extrusion; this creates very clean meshes)
  • Create the armature bones, don't forget to have one root bone!
    • Start by placing the cursor at zero
    • Go to the Add→Armature→Single Bone menu and create the root bone
    • Select the bone and go to edit mode (press tab)
    • Select the root bone end and press "E" to extrude the bone, then start rigging your model this way
    • Make sure your armature's location, rotation and scale are applied (see "Model Checklist" below) before continuing
  • Make the armature the parent of the model
    • Make sure you are back in object mode (press tab again)
    • First select the model object then select the armature object with shift pressed, then press Ctrl-P
    • When you do this, you can select how the bone groups will be mapped to the model vertices initially
  • Add an Armature modifier in the model's "Object Modifiers" settings and select the armature
  • Voila, your model should move when you move the bones in pose mode
  • Go to the "Dope Sheet" window and select the "Action editor"
  • Add an action by pressing the plus button
  • Create the keyframes (select the model armature and press I) along the timeline
  • Each action will be an animation available via the animation control in jME after the import (see the code sketch after this list)
  • Press the "F" button next to the action so it will be saved even if there are no references to it
    • Otherwise the animation would be deleted when the file is saved if it is not the active action on the armature
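
After the import, each Blender action shows up as an animation on the model's AnimControl. A minimal sketch of playing one from code; the model path and the action name "Walk" are placeholders:

    Spatial player = assetManager.loadModel("Models/Player/Player.j3o"); // placeholder path
    rootNode.attachChild(player);
    // The importer attaches an AnimControl that lists one animation per Blender action.
    AnimControl control = player.getControl(AnimControl.class);
    System.out.println("Available animations: " + control.getAnimationNames());
    // Create a channel and play one of the imported actions by name.
    AnimChannel channel = control.createChannel();
    channel.setAnim("Walk"); // placeholder action name
    channel.setLoopMode(LoopMode.Loop);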

Model Checklist

Sometimes you do not create the model yourself, and oftentimes models from the web are not really made for OpenGL live rendering or are not completely compatible with the bone system in jME.

To export an animated model in Blender make sure the following conditions are met:

  • The animation has to be a bone animation
  • Apply Location, Rotation and Scale to the mesh in Blender: In the 3D View editor, select the mesh in Object Mode and go to the 3D View editor's header → Object Menu → Apply → Location / Rotation / Scale.
  • Apply Location, Rotation and Scale to the armature in Blender: In the 3D View editor, select the armature in Object Mode and go to the 3D View editor's header → Object Menu → Apply → Location / Rotation / Scale.
  • Set the mesh's origin point at the bottom of the mesh (see the image below).
  • Set the armature's origin point at the bottom of the armature (see the image below).
  • The armature's origin point and the mesh's origin point must be in the same location (see the image below).
  • Use a root bone located at the armature's origin. This root bone must be in a vertical position (see the image below); it is the root bone of the armature. If you rotate the root bone, the entire armature might be rotated when you import the model into jMonkeyEngine (this is just the observed result; it is unclear whether the problem lies in the jMonkeyEngine importer or in Blender's OGRE exporter plugin).
  • Uncheck “Bone Envelopes” checkbox on the Armature modifier for the mesh (see the image below).
  • Uncheck “Envelopes” checkbox on the armature (see the image below).

You can use a SkeletonDebugger to show the skeleton in your game in order to check whether the mesh and the skeleton are loaded correctly:

    final Material soldier2Mat = assetManager.loadMaterial("Materials/soldier2/soldier2.j3m");
    final Spatial soldier2 = assetManager.loadModel("Models/soldier2/soldier2.j3o");
    TangentBinormalGenerator.generate(soldier2);
    soldier2.setMaterial(soldier2Mat);

    final Node soldier2Node = new Node("Soldier2 Node");

    soldier2Node.attachChild(soldier2);
    rootNode.attachChild(soldier2Node);

    // The AnimControl was created by the importer; this class must implement AnimEventListener.
    final AnimControl control = soldier2.getControl(AnimControl.class);
    control.addListener(this);
    final AnimChannel channel = control.createChannel();

    // Draw the skeleton in green on top of the model (depth test disabled).
    final SkeletonDebugger skeletonDebug = new SkeletonDebugger("skeleton", control.getSkeleton());
    final Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
    mat.setColor("Color", ColorRGBA.Green);
    mat.getAdditionalRenderState().setDepthTest(false);
    skeletonDebug.setMaterial(mat);
    soldier2Node.attachChild(skeletonDebug);


NormalMap baking

Models for live rendering should have a low polygon count. To increase the perceived detail of a model, normal maps are commonly used in games. This tutorial shows how to create a normal map from a high-poly version of your model that you can apply to the low-poly version of the model in your game.

Blender modeling lowPoly & highPoly

  • If you use the multiresolution modifier you only need one object. Let's look at this example:
  • Add a multiresolution modifier:
  • There are two subdivision types: Catmull-Clark and Simple.
    • Simple is better for things like walls or floors.
    • Catmull-Clark is better for objects like spheres.
  • When using Catmull-Clark with a higher "subdivide" value (more than 3), it's good to have the "preview" value above 0 and less than the subdivide level. This is because Catmull-Clark smooths the vertices, so the normal map is not as precise.
  • Here is an example of Preview 1; it is smoother than the original mesh:
  • Enable "Sculpt Mode" in blender and design the highPoly version of your model like here:
  • Now go into Render Tab, and bake a normalMap using same configuration as here:

Remember! The actual preview affects the baking output and mesh export!

Be careful: The steps above lead to terrible normal maps - use this procedure instead:

  • uncheck "[ ] Bake from Multires"
  • switch to object mode
  • make a copy of your mesh (SHIFT+D)
  • remove the Multires modifier from the copied model
  • remove any materials from the copied model
  • remove the armature modifier from the copied model
  • select the original (highres) model
  • go into pose mode, clear any pose transformations
  • the highres and lowres models should be on top of each other now
  • select the original (highres) model
  • hold SHIFT and select the copied (lowres) model
  • in the properties menu go to render
  • use Bake > Normal
  • check "[x] Selected to Active"
  • use a reasonably high value for "Margin" (4+ pixels at least for 1024x1024 maps)
  • don't forget to save the normal map image

Be careful: in the Outliner the camera symbol (Restrict Render) must be on!

Fixing the normal colors in Blender

Blender has its own standard for normal map colors. We need to fix the colors to prepare the normal map for use with the JME Lighting Material.

To do this, go to the Blender Node Editor:

  • Here is a Blender node example that fixes the normal colors:
  • Here is the color configuration:
  • Sometimes you will need to change the R and G scale and add some blur for a better effect. Do it as shown in the image below.
  • After rendering, save the file to a destination of your choice and use it with the JME Lighting Material and the low-poly version of the model (see the code sketch after this list).
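
Applying the baked and color-fixed normal map in jME3 could look roughly like this minimal sketch; the model and texture paths are placeholders:

    Spatial lowPoly = assetManager.loadModel("Models/MyModel/MyModel.j3o"); // placeholder path
    // The Lighting material needs tangents to make use of a normal map.
    TangentBinormalGenerator.generate(lowPoly);
    Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
    mat.setTexture("DiffuseMap", assetManager.loadTexture("Textures/MyModel/diffuse.png")); // placeholder path
    mat.setTexture("NormalMap", assetManager.loadTexture("Textures/MyModel/normal.png"));   // placeholder path
    lowPoly.setMaterial(mat);
    rootNode.attachChild(lowPoly);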

LightMap baking

The goal of this tutorial is to explain briefly how to bake a light map in Blender with a separate set of texture coordinates and then export a model using this map to jME3.

Blender modeling + texturing

  • Create a mesh in Blender and unwrap it to create UVs
    • 1.jpg
  • In the mesh tab you can see the UV map sets; unwrapping creates the first one.
    • You can assign whatever texture you want to it; I used Blender's built-in checker texture for the example.
  • In this list, create a new UV set and click on the camera icon so that baking is done with this set. Name it LightUvMap.
  • In the 3D view, in edit mode, select all your mesh vertices and hit 'U' → Lightmap Pack, then confirm; it will unfold the mesh for the light map.
  • Create a new image, go to the render tab, and at the very end check the "Bake" section and select Shadow. Then click Bake.
  • If all went OK, it will create a light map like this.
    • 2.jpg
  • Go to the material tab, create a new material for your model and go to the Texture tab.
  • Create 2 textures: one for the color map and one for the light map.
  • In the Mapping section, be sure to select Coordinates: UV and select the correct set of coordinates.
    • 3.jpg
  • Then the light map
    • 4.jpg

Importing the model in the SDK and creating the appropriate material

Once this is done, export your model with the OGRE exporter (or import it directly via the blend importer), and turn it into a .j3o with the SDK.

  • Create a material for it using the Lighting definition.
  • Add the color map in the diffuse map slot and the light map in the light map slot.
  • Make sure you check "SeparateTexCoord" (a code version of this material setup is sketched after this list)
    • 5.jpg
  • It should roughly result in something like this:
    • 6.jpg
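
If you prefer to set the material up in code rather than with a .j3m file in the SDK, a minimal sketch could look like this (model and texture paths are placeholders):

    Spatial scene = assetManager.loadModel("Models/LightMapTest/LightMapTest.j3o"); // placeholder path
    Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
    mat.setTexture("DiffuseMap", assetManager.loadTexture("Textures/checker.png")); // placeholder: color map (first UV set)
    mat.setTexture("LightMap", assetManager.loadTexture("Textures/lightmap.png"));  // placeholder: baked light map (second UV set)
    // Tell the material to read the light map from the second texture coordinate set.
    mat.setBoolean("SeparateTexCoord", true);
    scene.setMaterial(mat);
    rootNode.attachChild(scene);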

The blend file, the OGRE XML files and the textures can be found in the download section of the Google Code repo:

http://code.google.com/p/jmonkeyengine/downloads/detail?name=LightMap.zip&can=2&q=#makechanges

Modelling racing tracks and cars

Follow the link below to a PDF tutorial by rhymez that guides you through modelling a car, importing it into the jMonkeyEngine correctly and editing it in the vehicle editor, plus how to model a simple racing track. http://www.indiedb.com/games/street-rally-3d/downloads/modelling-in-blender-to-the-jmonkeyengine

Optimizing Models for 3D games

Follow the link below to a PDF tutorial by rhymez that shows how you can optimize your models for faster rendering. http://www.indiedb.com/games/street-rally-3d/downloads/optimizing-3d-models-for-games

SkyBox baking

There are several ways to create static images to use for a sky in your game. This will describe the concepts used in Blender and create an ugly sky :-) Check the links below for other ways and prettier skies.

A sky box is a texture-mapped cube; it can also, loosely, be called an EnvMap or a CubeMap. The camera is inside the cube, and the clever thing that jME does is to draw the sky so it is always behind whatever else is in your scene. Imagine the monkey is the camera in the picture.

But a real sky is not a box around our heads, it is more like a sphere. So if we put any old image in the sky, it will look strange and might even look like a box. This is not what we want. The trick is to distort the image so that it looks like a sphere even though it is in fact a picture pasted on a box. Luckily Blender can do that tricky distortion for us.

The screenshots are from Blender 2.63, but the equivalent operations have been in Blender for years, so with minor tweaks this should work for almost any version.

So let's get started

  • Fire up Blender and you'll see something like this.
  • The cube in the start scene is perfect for us. What we'll do is have Blender render the scene onto that cube. The resulting image is what we'll use for our sky box. So our jME sky will look like we stood inside the blender box and looked out on the scene in blender.
  • Start by selecting the box and set its material to shadeless.
  • Now we will create a texture for the box. Make sure the texture is an Environment Map and that the Viewpoint Object is set to the cube. The resolution is how large the resulting image will be. More pixels make the sky look better but come at the cost of texture memory. You'll have to trim the resolution to what works in your application.
  • Next up is the fun part: create the sky scene in Blender. You can do whatever fits your application; include models for a city landscape, set up a texture-mapped sphere in Blender with a nice photographed sky, whatever you think will make a good sky. I am not so creative, so I created this scene:
  • Now render the scene (press F12). It doesn't actually matter where the camera is in Blender but you might see something similar to this:
  • You can see that Blender has actually drawn the scene onto the cube. This is exactly what we want. Now to save the image.
  • Select the texture of the cube and select save environment map.
  • That is it for Blender. Open the saved image in some image editor (I use the Gimp from http://www.gimp.org here).

The SDK also contains an image editor; right-click the image and select "edit image" to open it.

  • You will notice that Blender has taken the 6 sides of the cube and pasted them together into one image (3x2). So now we need to cut it up again into 6 separate images. In Gimp I usually set the guides to where I want to cut and then go into Filters→Web→Slice and let Gimp cut it up for me.
  • Next up is to move the image files into your assets directory and create the sky in jME. You can do that in the Scene Composer by right-clicking the scene node, selecting Add Spatial and then selecting Skybox.

If you want to do it from code, here is an example:

public void simpleInitApp() {
 
    Texture westTex = assetManager.loadTexture("Textures/west.png");
    Texture eastTex = assetManager.loadTexture("Textures/east.png");
    Texture northTex = assetManager.loadTexture("Textures/north.png");
    Texture southTex = assetManager.loadTexture("Textures/south.png");
    Texture upTex = assetManager.loadTexture("Textures/top.png");
    Texture downTex = assetManager.loadTexture("Textures/bottom.png");
 
    final Vector3f normalScale = new Vector3f(-1, 1, 1);
    Spatial skySpatial = SkyFactory.createSky(
                        assetManager,
                        westTex,
                        eastTex,
                        northTex,
                        southTex,
                        upTex,
                        downTex,
                        normalScale);
    rootNode.attachChild(skySpatial);
}

This example uses a strange normalScale; it flips the image on the X-axis and might not be needed in your case. Hint: the texture is applied on the outside of the cube but we are inside, so what do we see?

Further reading

 
 