Lightmaps: multitexturing and texture coordinates trouble


I'm trying to use lightmaps by loading two objects with the same geometry but different textures (one of them carrying the lightmap texture).

I load both objects and then try to copy the texture and texture-coordinate information to the first object, using a loop that processes each contained TriMesh, like this ('a' and 'b' are the TriMesh instances):


TextureState tsa = (TextureState) a.getRenderState(RenderState.RS_TEXTURE);
TextureState tsb = (TextureState) b.getRenderState(RenderState.RS_TEXTURE);
FloatBuffer fbb = b.getTextureBuffer();

tsa.setTexture(tsb.getTexture(), 1);
a.setTextureBuffer(fbb, 1); // assign the lightmap's coordinates to unit 1



Up to this point everything seems to work fine. But when the rendering takes place, I get an exception:


at java.nio.Buffer.limit(

at com.jme.renderer.lwjgl.LWJGLRenderer.predrawGeometry(

at com.jme.renderer.lwjgl.LWJGLRenderer.draw(

at com.jme.scene.TriMesh.draw(


I've checked the renderer code, and it sets the limit of the second (lightmap) buffer too high. I also tried sizing the second buffer the same as the first, but then the second texture doesn't map right. So I guess I should be using the same number of coordinates in both buffers, applying the lightmap's texture coordinates so they match the layout of the textures from the first object.
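For what it's worth, the crash surfacing in Buffer.limit is consistent with the renderer setting the limit of the texture-coordinate buffer from the vertex count (two floats per vertex for 2D coordinates) while the lightmap buffer holds fewer floats. A minimal sketch with plain java.nio (no jME involved; the vertex count is made up for illustration):

```java
import java.nio.FloatBuffer;

public class LimitDemo {
    public static void main(String[] args) {
        // Suppose the mesh has 100 vertices; a renderer would set
        // limit = vertexCount * 2 on each 2D texture-coordinate buffer.
        int vertexCount = 100;

        // A lightmap buffer that is too small for that vertex count.
        FloatBuffer texCoords = FloatBuffer.allocate(120);

        try {
            texCoords.limit(vertexCount * 2); // 200 > capacity 120
        } catch (IllegalArgumentException e) {
            // java.nio.Buffer.limit rejects a limit beyond the capacity,
            // which matches the top frame of the stack trace above.
            System.out.println("limit beyond capacity rejected");
        }
    }
}
```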

Am I right in my guess, or am I doing this wrong? I've checked the multitexturing examples; they copy the texture coordinates too, but I didn't find any example that does it differently.

Thanks in advance! :slight_smile:

Using Geometry.copyTextureCoordinates(0,1); will maintain the limits for you.
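For reference, a minimal sketch of that suggestion, assuming the jME 1.x API used elsewhere in this thread ('a' is the diffuse mesh, 'b' the identical mesh carrying the lightmap texture; untested):

```java
// Grab both texture states, as in the original snippet.
TextureState tsa = (TextureState) a.getRenderState(RenderState.RS_TEXTURE);
TextureState tsb = (TextureState) b.getRenderState(RenderState.RS_TEXTURE);

// Put the lightmap texture on unit 1 of the first mesh...
tsa.setTexture(tsb.getTexture(), 1);

// ...and let jME duplicate unit 0's coordinates into unit 1,
// which also keeps the buffer limit consistent.
a.copyTextureCoordinates(0, 1);
a.updateRenderState();
```

Note this reuses the diffuse coordinates for the lightmap, so it only fits when both channels share the same UV layout.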

I've tried that already. What I get is the lightmap texture tiled all over, but not as on the 'lightmap' object; the coordinates are different.

I've checked both of my objects (obj format). They are identical and have the same number of 'vt' entries. Maybe the problem is that the loading process generates different results for each, and that's why I end up with texture buffers of different sizes?

EDIT: I've tried loading the same object twice and using the second copy as if it were the lightmap. I got the same problem, so maybe the processing changes the trimesh order, and that's why I can't match them up correctly, since I rely on them being in the same order.

EDIT2: The issue is definitely that the processing stores the trimeshes in a different order (a Java 'feature', I guess; probably something to do with a hashtable?). I've launched the app several times without modifying anything at all. Most times it simply crashes, but a couple of times it worked, once showing the lightmap perfectly fitted, and once with a wall with the wrong mapping.

I doubt any of the loaders will randomly reorder your trimeshes. What type of "loading" are you using?

And on an offtopic note (since I doubt any of our loaders store coordinates in hash tables): while storing things in a hashtable and then iterating over it (which is typically a bad idea) can reorder your elements, this is not "random". If you store elements in the same order, the result will always be the same. This 'feature' is hardly unique to Java, either.
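The point about hashtable iteration being deterministic but not insertion-ordered can be sketched with plain java.util collections (the keys here are made up for illustration):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class OrderDemo {
    public static void main(String[] args) {
        // Two HashMaps filled with the same insertions...
        Map<String, Integer> h1 = new HashMap<>();
        Map<String, Integer> h2 = new HashMap<>();
        for (String k : new String[] {"wall", "floor", "ceiling"}) {
            h1.put(k, k.length());
            h2.put(k, k.length());
        }
        // ...iterate in the same order as each other, even though that
        // order need not be the insertion order.
        System.out.println(h1.keySet().toString().equals(h2.keySet().toString()));

        // A LinkedHashMap preserves insertion order when that matters.
        Map<String, Integer> ordered = new LinkedHashMap<>();
        for (String k : new String[] {"wall", "floor", "ceiling"}) {
            ordered.put(k, k.length());
        }
        System.out.println(ordered.keySet());
    }
}
```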

I use the ObjToJme converter (basically the same code I found in the example) to load both objects.

I’ve done another test: I set up a loop that reloads until both objects have the same buffer counts on each trimesh, so I don’t have to relaunch the application until it works. I get two different results: most of the time the textures (the lightmap, I mean) aren’t mapped as they should be and look like they’re in the wrong place; a few times they appear OK.

That’s the (almost, though not fully) right scene:

That’s the not so right scene:

This is the same code, in the same run, recreating the same scene several times (each time with different results).


I’ve uploaded a quick (and dirty) test here (source code + data):


Tora is my friend and fellow programmer; we're working on the same project. :slight_smile:

I made that test scene.

I am exporting with Gile[s], a cheap ($50) commercial tool just for lightmapping. (It has quite good radiosity; that's why I bought it.)

Giles has a curious way of organizing materials/lightmaps, so I'm going to try a different kind of export now. I also have (again, spent some $) Ultimate Unwrap, which also exports ASE. I'll export from there, since there could be compatibility issues, or it may just be each package's "way of saving".

To me, it looks like it's breaking at the material boundaries.

The scene has different materials. The wall is made of large chunks, each of a certain height (intended for different texturing areas of a kitchen).

OR it's breaking at the material boundaries, but since that's where the UV islands end, maybe the ASE format, the loader, or jME can't handle two UVs per 3D mesh vertex… that'd be bad. Though a weld of coincident vertices could probably be applied in-game…

Or… cough… I didn't build smoothing normals; that could have slipped past me. If so, the darkening on just those two tris would have an explanation, as there's no UV or material boundary there…

Yup. To me it looks like the smoothing normals simply got broken at some point, probably even at each triangle edge. I dunno. I'll be sending more sample scenes (to you, Tora ;)) so we can iron it out :slight_smile:

Thanks all

(BTW: does the jME engine force the lightmap UV coordinates to be the same as the channel 1 UV coordinates (the standard texture)? That's not usual in engines and 3D packages (I haven't actually seen any that do it), and it would be inconvenient for strong reasons. For example, texture tiling is very common in a scene (this test sample uses it; the wall and floor texturing is done with tiny tiles), but tiled UV mapping can't be used for a lightmap in any way. Forcing you to avoid tiling in the standard texture channel, just to get usable lightmap UVs in channel 2, would be terrible for almost all projects. ;) At the same time, lightmaps force a certain way of doing things: forget about using mirrored and welded UVs in channel 1, as that would wreck the lightmap channel, and if there's one thing that gets used a lot, it's UV mirroring in texture channels. (E.g., Quake models often had only one arm appearing in the UV/texture template. Why? Because the UVs were mirrored and welded to use texture space more efficiently, since both arms of the model could be textured identically.))

I hope I'm explaining myself in at least average English :wink:

Well, off to export new scenes for my friend :slight_smile:

Still working on this.

Snaga: No, we can use different texture coordinates on each channel. No problem there.

Just a wild guess… could the trimesh sorting have something to do with it? Maybe some trimeshes on the same side get different indexes in the table, that is, not the same order as in the source obj file.

I’ve loaded the same obj twice and printed the xml output from the converter. I get different xml files each time (in the same run).

I’ve uploaded the output xml files:

I used the ObjToJme converter and the BinaryToXml converter, then wrote out the resulting xml file. This ran in a loop to generate the xml several times (load obj, save xml).

Any kind of help would be really appreciated, as I’m a bit lost at this point. Maybe there’s another way to find the trimesh correspondence between the two obj files once they’ve been processed by jME?

Found a way: comparing the FloatBuffer vertex information, looking for the same values in each trimesh of the lightmap obj. That approach seems to work :slight_smile:
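A sketch of that matching idea with plain java.nio, assuming the vertex FloatBuffers have been pulled from each TriMesh beforehand (class and method names are hypothetical, and the epsilon comparison guards against tiny export/rounding differences):

```java
import java.nio.FloatBuffer;
import java.util.List;

public class TriMeshMatcher {
    // Returns the index in 'lightmapVertexBuffers' whose vertex data
    // equals 'target', or -1 if no match is found.
    static int findMatch(FloatBuffer target, List<FloatBuffer> lightmapVertexBuffers) {
        for (int i = 0; i < lightmapVertexBuffers.size(); i++) {
            if (sameVertices(target, lightmapVertexBuffers.get(i))) {
                return i;
            }
        }
        return -1;
    }

    // Two buffers match when they hold the same number of floats and
    // every pair of values agrees within a small epsilon.
    static boolean sameVertices(FloatBuffer x, FloatBuffer y) {
        if (x.limit() != y.limit()) {
            return false;
        }
        final float epsilon = 1e-5f;
        for (int i = 0; i < x.limit(); i++) {
            if (Math.abs(x.get(i) - y.get(i)) > epsilon) {
                return false;
            }
        }
        return true;
    }
}
```

Once findMatch pairs up the trimeshes, the lightmap texture and its coordinate buffer can be copied from the matched mesh regardless of loading order.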

Sorry for resurrecting an ancient thread, but I'm new to jme and was searching the forum for information about lightmapping.  Any chance you could post the working code to do this?  Thanks!

jhocking said:

Sorry for resurrecting an ancient thread, but I'm new to jme and was searching the forum for information about lightmapping.  Any chance you could post the working code to do this?  Thanks!

Hate to disappoint you, but I think these birds have flown the nest; try PM'ing them.

Good suggestion; I am PMing Tora right now. In the meantime, does anyone else have code to do this (i.e., combine two versions of a model with different texture coordinates into a single lightmapped model)?