Noob - texturing a procedural mesh questions

Hello monkeys,

I have created a procedural mesh (succession of empty tubes) by feeding the vertexes and indexes to a mesh.

Works perfectly, and I can build the succession of pipes easily, change their width, length, amount of “sides” and so on and all ends up in one mesh with no doubling of vertexes. A walk in the park for most of you but a feat for lil me :).

Now, I want to texture it. Got a couple questions if you don’t mind:

  • I guess I need to create normals/texture coords and feed them to the mesh, and then call TangentBinormalGenerator?

  • if I do that (previous point) and add a texture, will it be repeated/displayed following the texture coords?

  • I need one normal per index and not per vertex, because one vertex can be linked to multiple triangles?

  • all normals linked to a same triangle are the same?

  • texture coords are per vertex or per index? I would think per index, but I think I saw an example where it was per vertex (maybe my memory is playing tricks on me).

  • if a mesh is made of 2 quads (made of 2 triangles each) that are combined (I mean, 6 vertexes), I can “extend” a texture over the two quads by using intermediate values for the texture coords in the middle, so that it doesn’t tile but “extends”?

  • is there a way to add uv-coords or are the texture coords there for that?

  • I saw there is a ShowNormals.j3md, but the test case (helloAssets) doesn’t seem to display the normals… is there a way to see them when I launch the application?

    I’m asking because I’m having trouble finding a way to calculate the normals (I know about the cross product) and texture coords, and I’d like to not be going in the wrong direction.

    Sorry for all the questions… tried to make them yes/no answers for easiness.

    Thanx :).

(P.S. you may wish to split the vertex, give both the same normal and different texture co-ordinates. That’s also fine and will give you a smooth lighting effect but a different texture).

Do you want smooth or abrupt joins?

If you want a smooth shape then you do actually want to have a single vertex with one normal, that normal then gets interpolated between your vertices so you end up with a round effect.

For example:




(The ASCII diagram was lost here; it showed a right-angle corner, drawn with dots for spacing, with a single shared vertex whose normal, drawn as a diagonal line of ' characters, points outward at 45 degrees.)

In this case you have a right-angle corner, so the normal goes out at 45 degrees from it. The result is a smooth curving effect (it doesn’t change the physical shape, but it does change the lighting and gives the illusion of smoothness).

By contrast if you split the edge and use two vertices you can then give each a different normal:
(The second ASCII diagram was lost here; it showed the same corner with the vertex split in two, each copy getting its own perpendicular normal.)




Now you will get an abrupt sharp corner in both the lighting and the geometry.

The same applies for texture co-ordinates. If you want them to be different for different vertices then you need to split the vertex. If they can be the same (possible for many textures and objects if you do the mapping right) then you don’t.

All data like that is always associated with the vertex though. The triangle indexes are just for creating triangles and not really anything else.
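To make the cross-product part of the question concrete: a flat face normal is the normalized cross product of two edge vectors of the triangle. A minimal sketch in plain Java (the Vec3 class here is made up for illustration; it is not a jME type):

```java
public class FaceNormal {

    // Tiny stand-in vector class, just for this example.
    static class Vec3 {
        final double x, y, z;
        Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
        Vec3 subtract(Vec3 o) { return new Vec3(x - o.x, y - o.y, z - o.z); }
        Vec3 cross(Vec3 o) {
            return new Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x);
        }
        Vec3 normalize() {
            double len = Math.sqrt(x * x + y * y + z * z);
            return new Vec3(x / len, y / len, z / len);
        }
    }

    // Flat normal of triangle (a, b, c); the winding order (counter-clockwise
    // when seen from outside) decides which way the normal points.
    static Vec3 faceNormal(Vec3 a, Vec3 b, Vec3 c) {
        return b.subtract(a).cross(c.subtract(a)).normalize();
    }

    public static void main(String[] args) {
        // Triangle in the XY plane; the normal comes out along +Z.
        Vec3 n = faceNormal(new Vec3(0, 0, 0), new Vec3(1, 0, 0), new Vec3(0, 1, 0));
        System.out.println(n.x + " " + n.y + " " + n.z); // 0.0 0.0 1.0
    }
}
```

With flat shading, all three (split) copies of a triangle's vertices carry this same normal; with smooth shading you average the face normals of the triangles that share the vertex instead.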

  1. Yes, texture coords and normals for texturing, tangents for normal maps and advanced lighting
  2. Between vertices it will display the “range” of the texture that is defined by the texture coords of those vertices; if there are no coords, or they are all “0/0”, you get a mesh fully colored like the first pixel of the texture
  3. No, if you need multiple normals you need multiple vertices afaik
  4. No, depends on how your mesh looks. Imagine the normals on a sphere.
  5. see 3)
  6. That’s kind of the point of texture coordinates, if I understand you right
  7. UV Coords == Texture Coords
  8. The ShowNormals material is deprecated afaik, but you can just add little arrows (Arrow debug geometry).
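To connect answers 3 and 4 to code: for the smooth case you give each vertex the average of the normals of the triangles that use it. A self-contained sketch in plain Java (flat double arrays standing in for the mesh buffers; this is not actual jME buffer code):

```java
public class SmoothNormals {

    // positions: xyz triples, one per vertex; indices: 3 per triangle.
    // Returns one averaged, normalized normal (xyz triple) per vertex.
    static double[] vertexNormals(double[] positions, int[] indices) {
        double[] normals = new double[positions.length];
        for (int t = 0; t < indices.length; t += 3) {
            int a = indices[t] * 3, b = indices[t + 1] * 3, c = indices[t + 2] * 3;
            // Edge vectors b - a and c - a.
            double e1x = positions[b] - positions[a];
            double e1y = positions[b + 1] - positions[a + 1];
            double e1z = positions[b + 2] - positions[a + 2];
            double e2x = positions[c] - positions[a];
            double e2y = positions[c + 1] - positions[a + 1];
            double e2z = positions[c + 2] - positions[a + 2];
            // Unnormalized face normal = e1 x e2 (bigger triangles weigh more).
            double nx = e1y * e2z - e1z * e2y;
            double ny = e1z * e2x - e1x * e2z;
            double nz = e1x * e2y - e1y * e2x;
            for (int v : new int[] { a, b, c }) {
                normals[v] += nx;
                normals[v + 1] += ny;
                normals[v + 2] += nz;
            }
        }
        // Normalize the accumulated sums.
        for (int v = 0; v < normals.length; v += 3) {
            double len = Math.sqrt(normals[v] * normals[v]
                    + normals[v + 1] * normals[v + 1]
                    + normals[v + 2] * normals[v + 2]);
            if (len > 0) {
                normals[v] /= len;
                normals[v + 1] /= len;
                normals[v + 2] /= len;
            }
        }
        return normals;
    }
}
```

Shared vertices automatically get the rounded look; where you want a hard edge, you split the vertex as described above.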

Lol 7 minits :D.

Thx a lot normen :). This helps tremendously.

I was so proud of having reduced the amount of vertexes to the minimum haha… oh well.

@loopies said:
Lol 7 minits :D.

Thx a lot normen :). This helps tremendously.

I was so proud of having reduced the amount of vertexes to the minimum haha... oh well.

Np. Be sure to check whether you can map them via the indices too, I am not 100% sure about that.

Thx zarch… very interesting point you bring to my attention :). Maybe I’ll go with soft and see how it goes.

I went with doubling the vertexes only on one “side” (seam) of my tubes, using a “soft” approach to normals.

Also added a method to display the normals using those arrows. Couldn’t have corrected them without the arrows.

Thx, very happy with the results. I can happily apply transparency, normal and glow maps, and specular. I’ll deal with shadows later on… when I have something to get in the way of the light.
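For anyone following along, the seam doubling mentioned above can be sketched like this (plain Java; ringUs is a made-up helper name): a ring with `sides` segments gets `sides + 1` vertices, the extra one duplicating the first's position so it can carry u = 1 instead of wrapping back to u = 0.

```java
public class TubeUvs {

    // U coordinates for one ring of a tube with `sides` segments: sides + 1
    // entries, the last vertex sharing a position with the first but carrying
    // u = 1 instead of u = 0 (this is the "seam" vertex doubling).
    static float[] ringUs(int sides) {
        float[] us = new float[sides + 1];
        for (int i = 0; i <= sides; i++) {
            us[i] = (float) i / sides;
        }
        return us;
    }
}
```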

Just here to point to a smashing vid that helped me to understand how to create good seams in blender:


Is there a difference between:


Material mat6 = new Material(application.getAssetManager(), "Common/MatDefs/Light/Lighting.j3md");
Texture textureMat6 = application.getAssetManager().loadTexture("Textures/cross_pipe3.png");
mat6.setTexture("DiffuseMap", textureMat6);





and:

geometry.setMaterial(application.getAssetManager().loadMaterial("Materials/Crosspipe.j3m"));


With Crosspipe.j3m content being:

Material My Material : Common/MatDefs/Light/Lighting.j3md {
    MaterialParameters {
        DiffuseMap : Textures/cross_pipe3.png
    }
}
I’m asking because when I use the Crosspipe.j3m material, the UV texture is applied correctly, but some faces are wrong when using the first approach. This is applied to a Blender model, btw.

I wouldn’t expect there to be a visible difference. I don’t use j3m files myself though. Someone else might have to help you on why you are seeing a difference :)

One thing to be aware of: the second will actually be a little more efficient than the first if you use multiple geometries with the same material, since the material will be shared between those geometries (unless of course you have your own code to store and re-use your generated Material instance).


The only difference might be that the texture gets flipped in one case and not the other. I think loadTexture() flips them by default but I can’t remember for sure… and the j3m requires explicit flipping.

I could be completely wrong, though.
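Whichever layer does it, the flip itself is just a mirror of the V coordinate around 0.5 (image rows are typically stored top-down while OpenGL samples bottom-up). A sketch of the idea, not jME's actual implementation:

```java
public class FlipUv {

    // Mirrors every V (the second value of each u,v pair) around 0.5; this is
    // what "flipping" a texture's Y amounts to at the coordinate level.
    static float[] flipV(float[] uv) {
        float[] out = uv.clone();
        for (int i = 1; i < out.length; i += 2) {
            out[i] = 1f - out[i];
        }
        return out;
    }
}
```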


Thx zarch. Different groups of tubes can share (or not) materials as I am currently doing it. I guess I could share materials loaded from j3ms too.

Thx pspeed. Using the debugger and looking for the flipping you mentioned, I saw that, indeed, a flip happens when using loadTexture.

Using a TextureKey allows setting the flipping, so I did the following and it works. Thx!


TextureKey tk = new TextureKey("/Textures/cross_pipe3.png", false); // second argument is flipY

Texture textureMat6 = application.getAssetManager().loadTexture(tk);


As I understand it, application.getAssetManager().loadMaterial() will already try to return the same Material object each time you load the material. It’s only if you are creating a new Material() object, rather than loading a j3m, that you need to worry about handling the sharing yourself.

@zarch said:
As I understand it application.getAssetManager().loadMaterial - will already try to return the same Java Material object each time you load the material. It's only if you are creating a new Material() object rather than loading a j3m that you need to worry about handling the sharing yourself.

They are cloned in any case. It's the compiled shader inside that is shared when possible.
@loopies said:
Also added a method to display the normals using those arrows. Couldn't have corrected them without the arrows.

Any chance you wouldn’t mind sharing your method? I would like to try to turn it into a utility class. Probably wouldn’t be hard to add an option for tangents as well.
I am having difficulty learning custom/procedural meshes, so I can use all of the reference tools I can get.
:D Great post by the way, it’s clearing up a lot of stuff I have been wondering about.

Code to display normals (works with any Geometry, be it procedural or imported by blender):


public void createArrow(Vector3f location, Vector3f direction, ColorRGBA color) {
    Arrow arrow = new Arrow(direction);
    Geometry g = new Geometry("arrow", arrow);
    Material mat = new Material(application.getAssetManager(), "Common/MatDefs/Misc/Unshaded.j3md");
    mat.setColor("Color", color);
    g.setMaterial(mat);
    g.setLocalTranslation(location);
    application.getRootNode().attachChild(g); // or whatever node you use for debug geometry
}

public void showNormals(Geometry geometry, ColorRGBA color) {
    VertexBuffer position = geometry.getMesh().getBuffer(Type.Position);
    Vector3f[] positionVertexes = BufferUtils.getVector3Array((FloatBuffer) position.getData());
    VertexBuffer normal = geometry.getMesh().getBuffer(Type.Normal);
    Vector3f[] normalsVectors = BufferUtils.getVector3Array((FloatBuffer) normal.getData());
    for (int arrow = 0; arrow < normalsVectors.length; arrow++) {
        createArrow(positionVertexes[arrow], normalsVectors[arrow], color);
    }
}
This could definitely be made more generic and optimized, but it gives you a starting point.

Also, each normal ends up as an object, so you may want to do something about that (batching?), because you quickly get lots of them, and lots of objects is a really good way of making your computer work :D. Also, if geometries are moved, the normals are displayed at the original position of the vertexes, so you may want to apply all transformations to the arrows. I create my meshes where I want them, so I don’t need to move them and it’s not a problem for me. A utility class would be nice indeed.

Answers I got here helped me a lot… glad you got uses for them too :). Since I studied marketing, I have low 3D and math skills haha.

edit: seems I swapped arguments of the createArrow method while writing this. Corrected the order of arguments so they are correct now.


I’m currently playing with appStates and nifty and really liking them.

Kind of stuck on a problem though, so wondering if someone found a nice way of resolving it:

I have a welcomeAppState that uses nifty to display a menu panel on the left and a right panel containing the details linked to the active menu.

I have a playGameAppState responsible for “playing the game”. PlayGameAppState can enable an optionsAppState.

WelcomeAppState and optionsAppState should use the same options.xml nifty.

I haven’t found a way to reuse that options.xml in those 2 contexts.

Also, I’m looking for a way to load a different xml file for each menu of the welcome screen, in a way that doesn’t force me to repeat the menus in all screens of welcome.xml. Nor do I want a bloated welcome xml that loads all panels and hides/shows panels depending on the selected menu… and I’m hoping to use xml to load the content of those pages.

I’ve come to the following conclusions (after testing/searching):

  • there is no include xml tag/tool in nifty.
  • addXml does not merge/override panels, and while it can be used to load an xml for each different menu, it does force me to rewrite the menus in all screens
  • a custom control would not help me reuse the options.xml file, nor would it really help me avoid rewriting the menus in all screens of the welcome.xml.

    What may work but I’d prefer not doing:
  • using xslt or something to merge the xml files during a maven goal or a build or something (I don’t use maven currently, for my lil game)
  • using the nifty java api to build the screens (may be the best solution though?)

    Does anybody have an idea? I may be obsessing a lil too much here (or simply missing something), but I do think it would make my 2D interface code cleaner.

    A nifty include tag would be awesome here and resolve all my troubles.

    Not a rant against nifty… I really like it.

    NB: can move this to somewhere else if you think it would be better… kind of thinking of having a thread where people can learn from all the troubles I am bound to go through :D.
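For what it's worth, the build-time merge option mentioned above could be sketched without XSLT using the JDK's own DOM API. Everything below is hypothetical: the placeholder element convention is made up for this example, not a Nifty feature, and the file handling is reduced to strings.

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XmlInclude {

    // Replaces every <placeholder name="..."/> element (a made-up convention)
    // in the skeleton with the root element of the given fragment.
    static String merge(String skeletonXml, String placeholderName, String fragmentXml) throws Exception {
        DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document skeleton = db.parse(new ByteArrayInputStream(skeletonXml.getBytes(StandardCharsets.UTF_8)));
        Document fragment = db.parse(new ByteArrayInputStream(fragmentXml.getBytes(StandardCharsets.UTF_8)));
        NodeList placeholders = skeleton.getElementsByTagName("placeholder");
        // Iterate backwards: the NodeList is live and shrinks as nodes are replaced.
        for (int i = placeholders.getLength() - 1; i >= 0; i--) {
            Element p = (Element) placeholders.item(i);
            if (placeholderName.equals(p.getAttribute("name"))) {
                Node imported = skeleton.importNode(fragment.getDocumentElement(), true);
                p.getParentNode().replaceChild(imported, p);
            }
        }
        StringWriter out = new StringWriter();
        Transformer tf = TransformerFactory.newInstance().newTransformer();
        tf.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        tf.transform(new DOMSource(skeleton), new StreamResult(out));
        return out.toString();
    }
}
```

Run once as a build step, this would let a single options fragment be spliced into both welcome.xml and the in-game options screen.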

Is it normal that when I ask for the worldPosition of a box geometry I added to a node that is controlled by a vehicleController, it returns the position of the vehicle node instead of its child? Even though it displays the box correctly, not at the origin of the vehicleNode?
Happens in the update method of an appState, in case that has any influence.
Is there something I can (or should) do about it?

How did you create the box? Did you create it with the mesh offset or did you add it to the vehicle then move it using setLocalTranslation() and/or move() ?


Thx zarch!

I was creating my box with an offset and adding it to the vehicleNode. Your post made me try adding the box with no offset and doing a setLocalTranslation after adding it to the vehicleNode, and now I’m getting the correct location for the box.

I’m a little worried that that means I’m really not understanding something important here because I don’t see why it would make a difference, but it gives me the correct result and a path to investigate :).
Thx a lot!
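A guess at what is going on, sketched without jME types: when the offset is baked into the mesh vertices, the geometry's local translation stays at (0, 0, 0), so a translation-only world transform (world = parent + local) returns the parent's position even though the mesh renders offset; with setLocalTranslation the offset lives in the transform and shows up in the world position.

```java
public class WorldPos {

    // Translation-only scene node (rotation/scale ignored to keep it short).
    static class SpatialSim {
        double lx, ly, lz;   // local translation
        SpatialSim parent;

        // world translation = parent's world translation + local translation
        double[] worldTranslation() {
            double px = 0, py = 0, pz = 0;
            if (parent != null) {
                double[] p = parent.worldTranslation();
                px = p[0]; py = p[1]; pz = p[2];
            }
            return new double[] { px + lx, py + ly, pz + lz };
        }
    }
}
```

If the box's offset is baked into the mesh data, its local translation stays zero and the world translation just echoes the vehicle's; moving it with a local translation instead puts the offset where the world transform can see it.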

In case needed, I can provide the code, if this behavior is not intended.