- What the heck does Unity3D do to make a model with two separate UV layouts, one for textures and one for shadows?
- can someone explain this to me?
To clarify: Unity3D creates a single mesh with a single material, while in jMonkey I need to create 16 meshes with 16 materials to get the same result.
- Wouldn’t that slow down rendering?
“I am posting this at Unity3D FPS Tutorial to jMonkey and Major Material Fail”
No, you should just map them correctly. Afaik light maps use a different set of coords in jME, but diffuse, normal and specular maps use the same set, so only if these differ would you be at a loss. Otherwise, just make sure the individual objects all use the same material (one referencing these atlases); then they should be rendered relatively efficiently (you can also work with the BatchNode then). Using light maps instead of dynamic lights speeds up rendering immensely.
@normen I have searched the forums and come across several topics about this; is there a particular place you suggest that I look?
I was looking mainly at this post Trouble using BatchNode
About light maps? I don’t get your actual issue I think. The models should all have the correct texture coordinates for their textures in the mesh data so you just need to load them really. If the texture doesn’t show up you can just make a material with this texture here and apply it to all models. The texture coords in their meshes will map to the right areas of the atlas. Since they all will have the same material they will batch just like that (though you dont have to do that, it was just an example why this is made so). Check the wiki/manual for “texture coordinates”, “lighting material” and the likes to get some more info.
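The approach above (one material referencing the atlas, shared by all models so they batch) can be sketched in jME3 roughly like this. This is a sketch, not code from the thread: the class name, asset paths, and part count are placeholder assumptions, and it requires jME3 on the classpath.

```java
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.scene.BatchNode;
import com.jme3.scene.Spatial;

public class AtlasMaterialExample {
    // Sketch: give every sub-mesh the same atlas-backed material so they batch.
    // Paths ("Models/Level/part0.j3o", "Textures/atlas.png") are placeholders.
    public static BatchNode buildLevel(AssetManager assetManager) {
        Material atlasMat = new Material(assetManager,
                "Common/MatDefs/Light/Lighting.j3md");
        atlasMat.setTexture("DiffuseMap",
                assetManager.loadTexture("Textures/atlas.png"));

        BatchNode level = new BatchNode("level");
        for (int i = 0; i < 16; i++) {
            Spatial part = assetManager.loadModel("Models/Level/part" + i + ".j3o");
            // The texture coords stored in each mesh already point into
            // the right region of the atlas, so one material is enough.
            part.setMaterial(atlasMat);
            level.attachChild(part);
        }
        level.batch(); // merges geometries sharing a material into fewer draw calls
        return level;
    }
}
```

Since every part ends up with the identical Material instance, BatchNode can merge them, which is the efficiency point made above.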
The mesh imported with no materials; Blender couldn’t find the correct textures, and neither did the FBX Converter.
"I will post my findings and solution for feature searches here"
I think the issue can be fixed in blender.
The OBJ importer might create one material for each sub-mesh for some reason.
Look at how many materials you have in Blender; you need only one: a texture for diffuse and one for the light map, each of them assigned to a different set of UV coordinates. Optionally, you can have a normal map with the same UV set as the diffuse map.
Would you mind uploading the blend file so I can take a look at it?
@nehon Well, I have uploaded the original .blend; this is directly after the conversion.
The original textures are in TIF format and, due to my upload size limit, you will have to acquire them from Unity3D.
Unity3D FPS Tutorial Page
mhhhh… that’s strange, several parts of the mesh have their UVs laid out in the same UV space, on top of each other…
From the look of it, I think Unity is capable of generating an atlas from several textures, so it does not matter if the UVs overlap.
Ogre, however, splits the mesh into several objects.
Also, in the tutorial files there is one material file per material in Blender… Are you sure that Unity has one mesh and one material?
In the end… 16 objects and 16 materials is not that high, and it would allow some culling on parts of the level…
No trace of another UV layout though… I don’t know how they managed to use the light map…
I would redo them in Blender: create a lightmap-pack UV layout, then bake AO and shadows into it.
Maybe go back to the start of the conversion process and find tools to inspect the model at each step?
Then you can see where in the chain the extra UV mappings (or whatever else) disappear, and look for alternate tools for that step which will preserve the information.
If you want to use two separate UVs then you can’t use OBJ as it supports only one set of UVs. You may need to use OgreXML instead.
The jME3 lighting material has a parameter called “SeparateTexCoord”, once enabled, the “LightMap” texture will use the second UV set whereas the others (such as “DiffuseMap”) will use the first UV set.
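Setting that parameter in code could look roughly like this. A sketch only, assuming jME3 on the classpath; the class name, method name, and texture paths are placeholders, not from the thread:

```java
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.scene.Spatial;

public class LightMapExample {
    // Sketch: Lighting material with a baked light map on the second UV set.
    // Texture paths are placeholders for your own assets.
    public static void applyLightMappedMaterial(AssetManager assetManager,
                                                Spatial model) {
        Material mat = new Material(assetManager,
                "Common/MatDefs/Light/Lighting.j3md");
        mat.setTexture("DiffuseMap",
                assetManager.loadTexture("Textures/colorMap.png"));
        mat.setTexture("LightMap",
                assetManager.loadTexture("Textures/lightMap.png"));
        // Make the shader sample LightMap with the mesh's second UV set,
        // while DiffuseMap keeps using the first UV set.
        mat.setBoolean("SeparateTexCoord", true);
        model.setMaterial(mat);
    }
}
```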
Sorry all, I have found the solution to the issue, as stated in the Unity3D FPS Tutorial to jMonkey thread.
FBX is one of my favourite formats for use with jME3 (since it usually exports fine with OgreMax no matter who created it).
So if by any chance you are, or know, a student, just use the student version of 3ds Max (as this is noncommercial, I assume).
The answer is: you need Maya or 3ds Max to convert FBX to a format readable by Blender and jMonkey. Thank you all.
I was wondering if there is anyone who has made a model that uses this feature in Blender?
If not, why not, if the feature is supported?
If so where can I see an example?
All textured models use uv maps…? Sinbad etc. all do. To use an atlas for multiple models they have to come from one source… Or you mean the light maps?
Prelighting or Shadow Mapping Light Maps
I found this section of the Blender 2.6 Manual, which implies that light mapping on separate UVs can be done with Blender, but I haven’t found anything like a tutorial on it.
Multiple UV Layouts
You are not limited to one UV Layout per mesh. You can have multiple UV layouts for parts of the mesh by creating new UV Textures. The first UV Texture is created for you when you select a face in UV Face Select mode. You can manually create more UV Textures by clicking the New button next to "UV Texture" on the Mesh panel (in the Buttons Window, Editing context) and unwrapping a different part of the mesh. Those faces will then go with that UV Texture, while the previously unwrapped faces will still go with the previous UV Texture. Note that if you unwrap the same face twice or more times (each time to a different UV Texture), the coloring for that face will be the alpha combination of the layers of those UV Textures.
Just export the lightmap and set it in a j3m or in code later.
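Setting it in a j3m could look along these lines; a sketch only, where the material name and texture paths are placeholders for your own assets:

```
Material LightMappedLevel : Common/MatDefs/Light/Lighting.j3md {
    MaterialParameters {
        DiffuseMap : Textures/colorMap.png
        LightMap : Textures/lightMap.png
        SeparateTexCoord : true
    }
}
```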
8O Ouch, not what I was wanting to hear, but thank you.
What’s the issue with setting it in a j3m?
Here is a quick and dirty explanation of how to do it.
Create a mesh in Blender and unwrap it to create UVs.
In the mesh tab you can see the sets of UVs; it will create the first one. You can assign whatever texture to it; I used Blender’s built-in checker for the example.
Then, in this list, create a new one and click on the camera icon so that baking is done with this set. Name it LightUvMap.
Then, in the 3D view in edit mode, select all your mesh vertices and hit ‘U’ → Lightmap Pack, then OK; it will unfold the mesh for the light map.
Then create a new image, go to the render tab and, all the way at the end, check the “Bake” section and select Shadows. Then click Bake.
If all went ok it will create a light map like this.
Then go to the material tab, create a new material for your model, and go to the texture tab.
Create 2 textures: one for the color map and one for the light map.
In the Mapping section, be sure to select Coordinates: UV and select the correct set of coordinates.
then the light map
Once this is done, export your model with the Ogre exporter and turn it into a j3o with the SDK. Create a material for it.
Use the Lighting material definition; add the colorMap in the DiffuseMap slot and the lightMap in the LightMap slot.
Make sure you check “SeparateTexCoord”.
It should be what you are looking for.
I uploaded the blend file, the Ogre XML files, and the textures in the download section of the Google Code repo, if you want to take a look at it.
Google Code Archive - Long-term storage for Google Code Project Hosting.
EDIT : made a wiki entry for this : https://wiki.jmonkeyengine.org/legacy/doku.php/jme3:advanced:baking_a_light_map_in_blender_and_importing_it_in_jme_via_ogre