I cannot find any information on normals for meshes in the docs. I am working on globe terrain generation for my game Unga Munga: Survival of the smartest. I can access the vertices and modify them, and thus generate a landscape. That part works.
But I also need to modify the normals to make it all look right. As far as I can see, JME3 does not provide a normal-update function, so I need to do it myself. But to be able to do so, I need to know how the data is structured.
So is there any information on how to match normals to the faces in the indices buffer? Or are normals related to the vertices (that would surprise me)?
This is a good resource for buffers: Custom Mesh Shapes :: jMonkeyEngine Docs
The normal structure¹ (no pun intended) for buffers is as follows:
Vertex buffer: a series of floats in groups of 3; each group of 3 is the x, y, z coordinate of a vertex.
Normal buffer: a series of floats in groups of 3; each group of 3 is the x, y, z normal of a vertex.
Texture buffer: a series of floats in groups of 2; each group of 2 is an x, y coordinate on a texture for a vertex.
Index buffer: a series of ints in groups of 3; each group of 3 defines a triangle (be careful to wind them anticlockwise). The value of each index refers to the vertex specified by the other buffers.
Normals are, I believe, related to the vertices, not the faces (this makes curved surfaces easier to “fake”).
¹ I say “normal structure” because the shader can actually do whatever it wants, but that's usually what shaders do.
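To make that layout concrete, here is a minimal sketch (plain Java, no JME types, so it stands alone) of the buffer contents for a single quad built from two triangles. The arrays are exactly what you would hand to JME's `Mesh.setBuffer` calls; the JME wiring is only shown in a comment since the exact setup is in the linked docs.

```java
public class QuadBuffers {
    // Positions: 4 vertices, 3 floats (x, y, z) per vertex
    static final float[] POSITIONS = {
        0f, 0f, 0f,   // vertex 0
        1f, 0f, 0f,   // vertex 1
        1f, 1f, 0f,   // vertex 2
        0f, 1f, 0f    // vertex 3
    };
    // Normals: one x, y, z normal per vertex (all +Z for a flat quad)
    static final float[] NORMALS = {
        0f, 0f, 1f,   0f, 0f, 1f,   0f, 0f, 1f,   0f, 0f, 1f
    };
    // Texture coordinates: 2 floats (u, v) per vertex
    static final float[] TEXCOORDS = {
        0f, 0f,   1f, 0f,   1f, 1f,   0f, 1f
    };
    // Indices: 2 triangles, anticlockwise winding when viewed from +Z
    static final int[] INDICES = {
        0, 1, 2,
        0, 2, 3
    };
    // In JME this would be wired up roughly like:
    //   mesh.setBuffer(Type.Position, 3, BufferUtils.createFloatBuffer(POSITIONS));
    //   mesh.setBuffer(Type.Normal,   3, BufferUtils.createFloatBuffer(NORMALS));
    //   mesh.setBuffer(Type.TexCoord, 2, BufferUtils.createFloatBuffer(TEXCOORDS));
    //   mesh.setBuffer(Type.Index,    3, BufferUtils.createIntBuffer(INDICES));
}
```

Note how the normal buffer has exactly as many entries as the position buffer: one normal per vertex, which is the 1:1 relationship asked about above.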
In that case, with normals being related to vertices, I am not sure how to calculate a normal. Any hints on that?
If you have flat faces, then just set all 3 vertices of a triangle to have the same normal as the face you want to set.
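For the flat-face case, the face normal itself is the normalized cross product of two edges of the triangle. A minimal sketch in plain Java (no JME types; JME's `Triangle.calculateNormal` does essentially the same thing):

```java
public final class FaceNormal {
    /**
     * Computes the unit face normal of triangle (a, b, c), assuming
     * anticlockwise winding: normalize((b - a) x (c - a)).
     * Each vertex is a 3-element {x, y, z} array.
     */
    static float[] faceNormal(float[] a, float[] b, float[] c) {
        float ux = b[0] - a[0], uy = b[1] - a[1], uz = b[2] - a[2];
        float vx = c[0] - a[0], vy = c[1] - a[1], vz = c[2] - a[2];
        // Cross product u x v
        float nx = uy * vz - uz * vy;
        float ny = uz * vx - ux * vz;
        float nz = ux * vy - uy * vx;
        float len = (float) Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new float[] { nx / len, ny / len, nz / len };
    }
}
```

For flat shading you would write this same normal into the normal buffer slot of all 3 vertices of the triangle.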
That wouldn’t work if normals are linked 1:1 to vertices. Vertices serve multiple faces in my model.
Hmm, looking at Unga Munga, it looks like it has curved surfaces. The lighting is always going to be a bit nasty with face normals: as soon as you try to light it, it will be really obvious that the “curves” are made of a series of flat faces.
It’s fine for faces to share vertexes, but only if they also have the same texture coordinate and normal. Maybe someone with more shader knowledge might be able to help you, but what I would do is reprocess the mesh to have those extra vertexes, so faces don’t share vertexes where they don’t also share normals (and texture coordinates).
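That reprocessing step ("unsharing" vertexes) can be sketched like this in plain Java: expand the indexed mesh so every index entry gets its own copy of its vertex. This only shows positions for brevity; a real version would expand the normal and texture-coordinate buffers the same way and then rebuild the index buffer as 0, 1, 2, 3, ….

```java
public final class MeshSplitter {
    /**
     * Produces one position per index entry, so each triangle carries its
     * own private copies of its 3 vertices and can therefore be given
     * per-face normals and texture coordinates.
     * positions: 3 floats per vertex; indices: 3 ints per triangle.
     */
    static float[] expandPositions(float[] positions, int[] indices) {
        float[] out = new float[indices.length * 3];
        for (int i = 0; i < indices.length; i++) {
            int v = indices[i] * 3;          // start of this vertex's x,y,z
            out[i * 3]     = positions[v];
            out[i * 3 + 1] = positions[v + 1];
            out[i * 3 + 2] = positions[v + 2];
        }
        return out;
    }
}
```

The cost is more vertices (two triangles sharing an edge go from 4 vertices to 6), which is why it is only worth doing where normals genuinely differ across the shared edge.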
If the terrain mesh is essentially quads (of triangle pairs), then you can calculate the vertex normal as the average of the face normals that connect to that vertex. Or even just from the edges that connect to that vertex (which is usually fairly easy with heightmap terrain, for example).
For non-uniform meshes the answer is trickier.
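The "from the edges" shortcut for heightmap terrain mentioned above usually means central differences over the 4 grid neighbours. A sketch under that assumption (plain Java; interior vertices only, for brevity):

```java
public final class HeightmapNormals {
    /**
     * Normal for an interior grid vertex of a heightmap, from its 4
     * neighbours via central differences.
     * h: heights, row-major, width w; cellSize: horizontal grid spacing.
     * The normal is (-dh/dx, 1, -dh/dz), normalized.
     */
    static float[] normalAt(float[] h, int w, int x, int z, float cellSize) {
        float hL = h[z * w + (x - 1)];       // left neighbour
        float hR = h[z * w + (x + 1)];       // right neighbour
        float hD = h[(z - 1) * w + x];       // "down" neighbour
        float hU = h[(z + 1) * w + x];       // "up" neighbour
        float nx = (hL - hR) / (2f * cellSize);
        float ny = 1f;
        float nz = (hD - hU) / (2f * cellSize);
        float len = (float) Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new float[] { nx / len, ny / len, nz / len };
    }
}
```

This avoids computing any face normals at all, which is why it is the cheap option for uniform grids; it does not apply directly to a sphere mesh, though.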
You can also just open Blender and use “show normals” there.
Face normal = the average of the normals of that face’s 4 vertices.
Usually, when you have 4 split vertices per face: if you set the same normal on all 4, it behaves just like a face normal; if you change some of the vertex normals, the shading will smoothly interpolate between them. (There is also a generator in JME for tangents, which are needed for normal maps.)
In general, “vertex normal” = “direction of the vertex” = “how light will affect this part of the surface”.
For example, for hair it’s good to set up normals so they are sphere-based (not face-direction based).
For terrain, you really just need to set each vertex normal to the average of the connected face normals:
As an experiment, you can even make them “face more toward up” to get different results (triplanar-related):
And when you set up different normals for the 4 vertices of a face, you can do something like “simulate a hill” for the lighting in the shader, for example:
(the pink ones are auto-calculated/interpolated for you)
It’s fun how often 3D graphics is full of questions that seem like they should be simple…
There are several possibilities for calculating normals, depending on the visual result you want:
1. facet normals (all vertices of a face [triangle] have the same normal), as for a box
2. smooth normals (normals are a function of vertex position), as for a sphere
3. combinations of (1) and (2), as for a cylinder
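Option (2) is especially cheap for a sphere: if the mesh is centred at the origin, the smooth normal at a vertex is just the vertex position normalized, no face averaging needed. A tiny sketch (hypothetical helper, plain Java):

```java
public final class SphereNormals {
    /**
     * Smooth normal for a vertex of a sphere mesh centred at the origin:
     * the vertex position, normalized to unit length.
     */
    static float[] sphereNormal(float x, float y, float z) {
        float len = (float) Math.sqrt(x * x + y * y + z * z);
        return new float[] { x / len, y / len, z / len };
    }
}
```

For a globe whose surface is displaced by terrain, this gives a perfectly smooth ball; it ignores the terrain relief, so it is only a starting point.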
If you generate meshes in Blender, then Blender will calculate normals for you.
If you generate meshes procedurally (using Java code), then there’s usually no need to “re-invent the wheel.” For instance, there are algorithms in the Heart library to calculate normals: https://github.com/stephengold/Heart/blob/master/HeartLibrary/src/main/java/jme3utilities/MyMesh.java …
Thanks for all of your responses, I am beginning to see where I could go with this.
@richtea is right in that I am basically dealing with curved faces. What I am trying to achieve is a globe terrain: I have a polysphere with over 100K evenly distributed vertices.
The mesh has an indices buffer and the vertices are shared. Every vertex is used in 6 triangles; the mesh is organized in a sort of hexagonal pattern. I have an array with metadata that tells me, per vertex, which triangles it is used in.
Reading all this, I conclude that I should calculate the normals of the six connecting faces and use their average as the vertex normal.
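That plan can be sketched like this in plain Java, assuming the metadata is an `int[][]` mapping each vertex to the triangle numbers that use it (the name `trianglesOfVertex` and that layout are my assumption, not the actual data structure):

```java
public final class GlobeNormals {
    /**
     * Per-vertex smooth normals via face-normal averaging.
     * positions: 3 floats per vertex; indices: 3 ints per triangle;
     * trianglesOfVertex[v]: triangle numbers using vertex v (hypothetical
     * layout of the per-vertex metadata described above).
     * Summing unnormalized face normals weights each face by its area,
     * which is a common, cheap choice.
     */
    static float[] vertexNormals(float[] positions, int[] indices,
                                 int[][] trianglesOfVertex) {
        float[] normals = new float[positions.length];
        for (int v = 0; v < trianglesOfVertex.length; v++) {
            float sx = 0f, sy = 0f, sz = 0f;
            for (int tri : trianglesOfVertex[v]) {
                int ia = indices[tri * 3] * 3;
                int ib = indices[tri * 3 + 1] * 3;
                int ic = indices[tri * 3 + 2] * 3;
                // Edge vectors b - a and c - a
                float ux = positions[ib] - positions[ia];
                float uy = positions[ib + 1] - positions[ia + 1];
                float uz = positions[ib + 2] - positions[ia + 2];
                float vx = positions[ic] - positions[ia];
                float vy = positions[ic + 1] - positions[ia + 1];
                float vz = positions[ic + 2] - positions[ia + 2];
                // Accumulate the (unnormalized) face normal u x v
                sx += uy * vz - uz * vy;
                sy += uz * vx - ux * vz;
                sz += ux * vy - uy * vx;
            }
            float len = (float) Math.sqrt(sx * sx + sy * sy + sz * sz);
            if (len > 0f) { sx /= len; sy /= len; sz /= len; }
            normals[v * 3]     = sx;
            normals[v * 3 + 1] = sy;
            normals[v * 3 + 2] = sz;
        }
        return normals;
    }
}
```

The resulting array plugs straight into the mesh's normal buffer, since it has the same one-normal-per-vertex layout as the position buffer.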
Sounds doable enough. A lot of work, but luckily I have a computer to do it for me.
I’ll keep you posted on the results.