Smooth Lighting Problem

I am having trouble getting smooth lighting to work in Java Monkey Engine.

My terrain is voxel-based, with an iso-surface generated via Marching Cubes over the data field.

Up until now, I have been simply calculating the triangle-face normals, and using only those when creating the mesh for each chunk. This produces the faceted look typical of this approach.

Now, I am iterating over each triangle in the chunk, and adding the triangle-face-normal to each vertex normal in the triangle. Vertices are shared between triangles. As I understand it, this should smooth out the lighting, and reduce the faceted look. I am not seeing these results.

Here is a snippet which illustrates the vertex normal processing:

[java]
for( UUID blockId : chunk.blockIds )
{
    Block block = BlockStore.getInstance().getBlock( blockId );

    FastTable<Triangle> blockTris = block.getTris();
    if( blockTris == null )
        continue;

    // Accumulate each triangle's face normal into its three (shared) vertex normals.
    for( Triangle tri : blockTris )
    {
        javax.vecmath.Vector3f triFaceNormal = (javax.vecmath.Vector3f)( tri.norm.clone() );

        tri.getVert0().norm.add( triFaceNormal );
        tri.getVert1().norm.add( triFaceNormal );
        tri.getVert2().norm.add( triFaceNormal );
    }

    // Normalize the accumulated vertex normals. Shared vertices get
    // normalized more than once here, but re-normalizing a unit vector
    // is harmless.
    for( Triangle tri : blockTris )
    {
        tri.getVert0().norm.normalize();
        tri.getVert1().norm.normalize();
        tri.getVert2().norm.normalize();
    }
}
[/java]

Then the blocks/triangles for this chunk are iterated over again, and the component values of all the vertex normals are added sequentially to the “normal buffer”, which is fed into the mesh creation.
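That flattening step might look like the following sketch. The class and parameter names here are stand-ins, not the poster's actual code; it only illustrates the layout: one (x, y, z) triple per triangle corner, in the same order as the Position buffer.

```java
// Sketch: flatten per-corner vertex normals into the flat float[]
// layout a normal buffer expects. cornerNormals holds one float[3]
// per triangle corner, already normalized.
class NormalBufferSketch {
    static float[] flattenNormals(float[][] cornerNormals) {
        float[] buf = new float[cornerNormals.length * 3];
        int i = 0;
        for (float[] n : cornerNormals) {
            buf[i++] = n[0];
            buf[i++] = n[1];
            buf[i++] = n[2];
        }
        return buf;
    }
}
```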

Here is what the mesh data looks like, using the JME ShowNormals material, which color-codes things according to normal direction. You can clearly see the facets:

And here is the code which builds the chunk mesh, after the buffers have been populated:

[java]
chunkMesh.setBuffer( VertexBuffer.Type.Index,    1, BufferUtils.createIntBuffer( uberIntBuf ) );
chunkMesh.setBuffer( VertexBuffer.Type.Position, 3, BufferUtils.createFloatBuffer( uberVertBuf ) );
chunkMesh.setBuffer( VertexBuffer.Type.TexCoord, 2, BufferUtils.createFloatBuffer( texBuffer ) );
chunkMesh.setBuffer( VertexBuffer.Type.Normal,   3, BufferUtils.createFloatBuffer( uberVertNormBuf ) );

TangentBinormalGenerator.generate( chunkMesh );

chunkMesh.updateBound();
[/java]

You can see in this screenshot that the individual triangles do have some blending, which indicates that summing a vertex's surrounding triangle-face normals into the vertex normal is having some effect.

I’m not sure what I am doing wrong.

It wasn’t clear from your post, but if you are sharing vertexes then you need to share normals also: vertex, normal, tangent, color, etc. are all 1:1:1:1:1. The index buffer then indexes into them.

One thing I can tell for sure from your picture is that normals and vertexes are not shared. So I’m not sure how you are doing that.
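To illustrate that 1:1 rule with a minimal sketch (plain arrays, not the JME API): each vertex and its normal appear once in the buffers, and the index buffer is what expresses the sharing between triangles.

```java
// Sketch: index buffer for a triangle fan around vertex 0.
// Each vertex (and its 1:1 normal) is stored once; the fan's
// triangles reuse shared indices: (0,1,2), (0,2,3), (0,3,4), ...
class IndexSharingSketch {
    static int[] fanIndices(int triCount) {
        int[] idx = new int[triCount * 3];
        for (int t = 0; t < triCount; t++) {
            idx[t * 3]     = 0;        // every triangle shares vertex 0
            idx[t * 3 + 1] = t + 1;    // shared edge with previous triangle
            idx[t * 3 + 2] = t + 2;
        }
        return idx;
    }
}
```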


Thanks for your response. At a point in the processing, before the buffers are built and set into the Mesh, I am sharing vertices. Each triangle either references an already-existing vertex object tied to a specific coordinate, or creates a new one if none exists yet. In other words, several triangles that all touch a coincident coordinate share the same vertex object for that coordinate.

Also, each vertex has its own normal, so vertex normals are inherently shared the same way.

As I understand it, in order to populate the buffers, one must iterate over the triangles, and write out the component pieces linearly.

So for example, given the vertices:

Vertex 0 = Coord (A,B,C) Normal 0 (x,y,z)
Vertex 1 = Coord (D,E,F) Normal 1 (x,y,z)
Vertex 2 = Coord (G,H,I) Normal 2 (x,y,z)
Vertex 3 = Coord (J,K,L) Normal 3 (x,y,z)
Vertex 4 = Coord (M,N,O) Normal 4 (x,y,z)

And the triangles:

TRI 0 = Vertex 0, Vertex 1, Vertex 2
TRI 1 = Vertex 0, Vertex 2, Vertex 3
TRI 2 = Vertex 0, Vertex 3, Vertex 4

The vertex buffer would look like this:

A,B,C,D,E,F,G,H,I,A,B,C,G,H,I,J,K,L,A,B,C,J,K,L,M,N,O

And the normal buffer would be:

N0x,N0y,N0z,N1x,N1y,N1z,N2x,N2y,N2z,N0x,N0y,N0z,N2x,N2y,N2z,N3x,N3y,N3z,N0x,N0y,N0z,N3x,N3y,N3z,N4x,N4y,N4z

So there is a 1:1 relationship between vertices and normals.

For the index buffer, I just create an int for each vertex, sequentially:

[java]
int totalInts = totalTris * 3;
int[] uberIntBuf = new int[ totalInts ];
for( int intIndex = 0; intIndex < totalInts; intIndex++ )
{
    uberIntBuf[ intIndex ] = intIndex;
}
[/java]

Is this adequate, or am I missing a key concept?

That is the concept. Your mesh is not doing that, though. The vertexes are not shared in the mesh. None of the code that does that part is included, so I can’t comment on what might be wrong… but from the picture I can 10000% guarantee that vertexes are not shared between adjacent triangles.

Wherever you have a hard edge, those edges are not sharing vertexes.
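One common way to guarantee that sharing is to deduplicate vertices by position when emitting the mesh, so triangles touching the same coordinate get the same index. The sketch below is illustrative only (the quantized string key and field names are not from the original code):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: build shared-vertex buffers by keying on quantized position.
// Adjacent triangles that touch the same coordinate receive the same
// index, so their face normals can be averaged into one vertex normal.
class VertexDedupSketch {
    final Map<String, Integer> indexByPos = new HashMap<>();
    final List<float[]> positions = new ArrayList<>();
    final List<Integer> indices = new ArrayList<>();

    int indexOf(float x, float y, float z) {
        // Quantize so float noise doesn't split coincident vertices.
        String key = Math.round(x * 1000) + ":" + Math.round(y * 1000)
                   + ":" + Math.round(z * 1000);
        Integer idx = indexByPos.get(key);
        if (idx == null) {
            idx = positions.size();
            positions.add(new float[] { x, y, z });
            indexByPos.put(key, idx);
        }
        return idx;
    }

    void addTriangle(float[] a, float[] b, float[] c) {
        indices.add(indexOf(a[0], a[1], a[2]));
        indices.add(indexOf(b[0], b[1], b[2]));
        indices.add(indexOf(c[0], c[1], c[2]));
    }
}
```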


You were correct. I dug down into my code, and in my Marching Cubes polygonizer, I was inadvertently making copies of the vertex data, which was then fed to the rest of the pipeline. I now have smooth shading, and I learned a valuable lesson regarding the hard-edge thing. Thank you very much!

@adrian-maggio said: You were correct. I dug down into my code, and in my Marching Cubes polygonizer, I was inadvertently making copies of the vertex data, which was then fed to the rest of the pipeline. I now have smooth shading, and I learned a valuable lesson regarding the hard-edge thing. Thank you very much!

Glad you got it working. These things can be notoriously tricky to track down. :) (says me, working on my own marching cubes at the moment. ;))

Now I have to figure out what to do at chunk boundaries. Looks like the normals do not match up (where the arrows point in the screenshots). Have you run into this yet?


@pspeed said: Glad you got it working. These things can be notoriously tricky to track down. :) (says me, working on my own marching cubes at the moment. ;))

Mythruna? Or “just for fun”? Sorry for OT.

I have seen that issue… I went the brute-force way and calculated the normals based on the mesh, without using a lookup table. At the points where my chunks touched, I got wrong results because I forgot to add the normals from the neighbour chunk to the calculation.

@adrian-maggio said: Now I have to figure out what to do at chunk boundaries. Looks like the normals do not match up (where the arrows point in the screenshots). Have you run into this yet?

You will need to generate a little extra in your chunks.

@zzuegg said: Mythruna? Or "just for fun"? Sorry for OT.

Side project. I may use it for some aspects of Mythruna depending on how it turns out (some volumetric stuff may be easier if I can’t get clever enough with particle solutions).

I posted some screen shots here: http://hub.jmonkeyengine.org/forum/topic/march-2014-monthly-wip-screenshot-thread/page/3/#post-264510

Edit: P.S.: I plan to post the code somewhere when I’m done.

So I need to generate some data past chunk boundaries, so that there is shared data for neighboring chunks to use?

@adrian-maggio said: So I need to generate some data past chunk boundaries, so that there is shared data for neighboring chunks to use?

That’s what I do. Generate size + N where N is the number of samples you need to properly do your calculations at the borders.

…for me I need size + 3… but it’s dependent on algorithm. :)

Thanks. That worked flawlessly… you know, after a few days of work. :)