Light problems with procedural terrain

Hi guys,
I have written my own class for procedural terrain that uses TerrainQuads (64x64 each) and places them next to each other. My problem is related to lighting: the quads are dark along their y-axis (thinking in 2D), but not along their x-axis. This happens with DirectionalLight, even without shadows, but NOT with AmbientLight.
I thought the reason might be that the terrain overlaps, but the quads still become dark at their northern and southern edges (though not at the eastern and western ones) when I scale them down to create seams between them. Actually, this happens to SOME but not ALL quads.
Does anyone have an idea where I could search for the reason? I would post some code if that helps but since I have no idea where the problem could be I wouldn’t know WHAT to post…

I suspect it might be the alpha map (I am using TerrainLighting.j3md), but it still occurs when the alpha map has the same value everywhere. Using an unshaded material I have no problems… EDIT: Writing the alpha maps to a file, they look right.
Sorry if my English sucks btw :stuck_out_tongue:



Thank you for every piece of help!

It’s the normals, I guess. That’s what makes “lighting” “lighting”.

If this is using JME’s terrain quads then the data needs to overlap at the edges so that the terrain seamlessly transitions from one quad to the next.

Maybe you messed up the normals of the terrain. You can easily check this by letting the light shine in different directions, like [+x, -x, +y, -y, +z, -z]. The terrain should be lit correctly from at least one of these 6 directions.
(or read the normals from the terrain quads and compare them to the light direction)
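To see why wrong normals show up exactly this way, here is a minimal plain-Java sketch (no JME types; names are illustrative) of the diffuse term a directional light contributes, max(0, dot(N, -L)). If a vertex normal N is wrong, this dot product is wrong and the surface goes dark or too bright no matter what the geometry looks like:

```java
public class LambertCheck {
    // dot product of two 3-component vectors
    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // lightDir points FROM the light toward the scene, so we negate it
    static float diffuse(float[] normal, float[] lightDir) {
        float d = dot(normal, new float[] { -lightDir[0], -lightDir[1], -lightDir[2] });
        return Math.max(0f, d);
    }

    public static void main(String[] args) {
        float[] up = { 0f, 1f, 0f };        // correct normal for flat ground
        float[] sideways = { 1f, 0f, 0f };  // a "messed up" edge normal
        float[] light = { 0f, -1f, 0f };    // light shining straight down

        System.out.println(diffuse(up, light));       // 1.0 -> fully lit
        System.out.println(diffuse(sideways, light)); // 0.0 -> black edge
    }
}
```

This is also why testing the 6 axis-aligned light directions isolates the problem: a bad normal that looks fine under one direction produces an obviously wrong dot product under another.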

Looking around my terrain some more, I saw that only the 4*7 quads in the center of the map are affected; everything further outside is okay.
I changed the direction of the light and the problem still occurs, but about 10 units higher along the y/z axis. If it’s the normals, what can I do about it? (Sorry for being such a noob…)

Have you checked ALL 6 directions? I agree with pspeed, your problem should be with the normals…

I tried all light directions (with negative y) and it occurs everywhere, but not as strongly with a negative z value.
(0.5, -1, 0.5)

(0.5, -1, -0.5)

(0, 0, 1)

(1, 0, 0)

(-0.5, -1, 0.5)

(-0.5, -1, -0.5)

If I should try another direction, tell me :wink:

Ok, I’ve never used JME terrain and I thought the tutorials covered this… but I’m pretty sure if you want a 64x64 terrain quad then you need to give it 66x66 worth of data so that the EDGES OVERLAP.

Edit: might be 65x65… as said, I’ve never used it but you must absolutely 10000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000% have overlapping edges somehow. Search the forums, I guess.
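The overlapping-edges idea can be sketched in plain Java (illustrative names, not JME API): when quads are carved out of one big heightmap, neighbouring quads must share their border row/column of height samples, otherwise the normals computed at the seam differ on each side:

```java
public class QuadSampling {
    static final int QUAD_SIZE = 65; // vertices per side (2^6 + 1)

    // Global heightmap index for local vertex i of quad qx along one axis.
    // Each quad advances by QUAD_SIZE - 1, so the LAST column of quad
    // (0, z) maps to the SAME height sample as the FIRST column of
    // quad (1, z): that shared sample is the "overlap".
    static int globalX(int qx, int i) { return qx * (QUAD_SIZE - 1) + i; }
    static int globalZ(int qz, int j) { return qz * (QUAD_SIZE - 1) + j; }

    public static void main(String[] args) {
        // last column of quad 0 == first column of quad 1
        System.out.println(globalX(0, 64) == globalX(1, 0)); // true
    }
}
```

If each quad instead got its own independent 64 or 65 samples with no shared border, the height gradients at the seam would disagree and so would the generated edge normals.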

The quads actually are 65x65 (the tutorial and documentation say it has to be 2^n+1).
I think if it were because of the quad size, it would happen all over the map, not just in the center. Maybe I’ll just let the player start at 1000x1000 or something, but that would suck :smiley:

EDIT: Maybe I did something wrong in my material creation… I don’t think so, but maybe you see something:

        Material mat = new Material(assetManager, "Common/MatDefs/Terrain/TerrainLighting.j3md");

        //Create alphamaps
        Texture alphaNormal = this.createAlphaMap(quad, 0);

        //Create diffusemaps
        Texture diffuse0 = assetManager.loadTexture("Textures/beach.png");
        Texture diffuse1 = assetManager.loadTexture("Textures/grassAndSoil.png");
        Texture diffuse2 = assetManager.loadTexture("Textures/rockBlocky.png");
        //Alpha 0
        mat.setTexture("AlphaMap", alphaNormal);
        mat.setTexture("DiffuseMap", diffuse0);
        mat.setFloat("DiffuseMap_0_scale", 8);
        mat.setTexture("DiffuseMap_1", diffuse1);
        mat.setFloat("DiffuseMap_1_scale", 8);
        mat.setTexture("DiffuseMap_2", diffuse2);
        mat.setFloat("DiffuseMap_2_scale", 8);
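For reference (if I remember the TerrainLighting.j3md convention correctly), the alpha map’s colour channels weight the diffuse layers: R selects DiffuseMap, G selects DiffuseMap_1, B selects DiffuseMap_2. A quick sanity check when generating the alpha map yourself is whether each pixel packs the weights you expect; here is a small stand-alone sketch of that packing (plain Java, hypothetical helper, not JME API):

```java
public class AlphaPixel {
    // pack per-layer weights (0..1) into one ARGB int; alpha is left
    // opaque here since only three diffuse layers are used above
    static int pack(float r, float g, float b) {
        int ri = Math.round(r * 255f);
        int gi = Math.round(g * 255f);
        int bi = Math.round(b * 255f);
        return (255 << 24) | (ri << 16) | (gi << 8) | bi;
    }

    public static void main(String[] args) {
        int px = pack(1f, 0f, 0f); // pure first layer ("beach" above)
        System.out.printf("0x%08X%n", px); // 0xFFFF0000
    }
}
```

If the channel weights don’t sum to roughly 1 per pixel, the blended terrain can come out darker or brighter than the source textures, which is easy to mistake for a lighting bug.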


As far as I remember, the JME terrain uses textures for the elevation model, and in that case you don’t actually have to care about the texture size, because the shader interpolates between texture pixels.
I also never used the JME terrain system, but let’s check the normals once more, ok?
What I wanted to say: check (1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1) and see if ONE AND THE SAME edge changes from “cannot see it” to “can see it”.
Reduce the problem to a single terrain block if that’s easier for you.
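The interpolation mentioned above is plain bilinear filtering between the four surrounding texels; a minimal stand-alone sketch (illustrative, not JME code):

```java
public class Bilerp {
    static float lerp(float a, float b, float t) { return a + (b - a) * t; }

    // h00..h11 are the four surrounding height samples,
    // (tx, tz) is the fractional position inside the cell, each in [0,1]
    static float bilerp(float h00, float h10, float h01, float h11,
                        float tx, float tz) {
        return lerp(lerp(h00, h10, tx), lerp(h01, h11, tx), tz);
    }

    public static void main(String[] args) {
        // the centre of a cell is the average of its four corners
        System.out.println(bilerp(0f, 1f, 1f, 2f, 0.5f, 0.5f)); // 1.0
    }
}
```

Because the lookup is interpolated like this, the exact texture resolution rarely matters for the elevation itself; seams come from the data (and normals) at the borders, not from the filtering.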

It’s hard to tell from your images, since they don’t show the actual orientation of the terrain. No one can tell from the images alone, but if I look at the shadows in the last 2 images from your previous post, I can see a difference: the shadow appears to have “jumped” from one terrain block to the other, if you know what I mean.

For some of those values the edges are too dark, for others they are too BRIGHT.

If you have the time maybe you want to download it and look yourself:

You can press return and type “setLightDir [float] [float] [float]” to change the direction of the light, and “fly” to toggle flying on and off. Sorry if the performance sucks, I haven’t worked on that much yet.

EDIT: Something else: flying around in the exported game, I crash relatively quickly because I run out of memory, which never happened in the SDK…

Sorry, I don’t have that much time but

sounds like you have the normals messed up. Do you write the normals buffer yourself? You can try setting them constant for each vertex (ideally, in the reverse light direction). Play around with it, I’m sure you’re able to find the bug.
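The “constant normals” debugging trick can be sketched as follows, as plain arrays so it stands alone; attaching the result to a JME mesh would go through its normal vertex buffer, which is only noted in the comment as an assumption, not shown:

```java
public class ConstantNormals {
    // Fill a normal buffer with the same vector for every vertex.
    // If the terrain then lights evenly, the mesh positions are fine
    // and the GENERATED normals are the culprit.
    // (In JME this array would be set as the mesh's Normal buffer.)
    static float[] constantNormals(int vertexCount, float x, float y, float z) {
        float[] normals = new float[vertexCount * 3]; // 3 floats per vertex
        for (int v = 0; v < vertexCount; v++) {
            normals[v * 3]     = x;
            normals[v * 3 + 1] = y;
            normals[v * 3 + 2] = z;
        }
        return normals;
    }

    public static void main(String[] args) {
        float[] n = constantNormals(65 * 65, 0f, 1f, 0f); // all pointing +y
        System.out.println(n.length); // 12675
    }
}
```

Pointing every normal opposite to the light direction, as suggested above, should make the whole terrain uniformly bright; any remaining dark seams would then come from something other than the normals.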

Thank you, I will have a look at it. Actually I do not write the normals manually and I have never worked with them, but… I’ll try :smiley: Maybe it has to do with the DistanceLodCalculator?

Nope. It had nothing to do with that.

It is definitely messed up normals on the edges. The edges have messed up normals. So the normals on the edges are messed up making the lighting on the edges messed up because the normals on the edges are messed up.

Did I mention that this is all about the normals and that the normals on the edges are messed up?

As to why it only happens on part of the map… must be that the normals are only messed up on part of the map. Or that when they are messed up elsewhere “messed up” looks ok.

Lighting only works with proper normals. Normals are EVERYTHING in lighting… well, and the light direction.

Okay, I understood :smiley:

I will look at it tomorrow, it’s almost middle of the night here and I don’t think I am able to learn something new today… :wink:

If you want to know what you’re doing, which makes everything easier for you, search the internet for “vertex normals” or “normals and light”.
I found something for you which explains the whole lighting model quite well, especially for beginners. Be aware that the JME implementation uses a different syntax, but you will figure it out if you follow the tutorials.

Normals and Light (external link)

JME tutorials

I know how normals work (I used to work with Blender a lot), I just never cared about them while programming. The first thing I will do tomorrow is try how it looks when I apply a normal map I created for the texture, and if that does not help I will dig deeper… But for now I will play a little and then sleep :stuck_out_tongue:

Normal map != vertex normals… but a normal map will require CORRECT vertex normals to even work right.

So, basically, your current plan: “So my foundation is crooked. I guess I will try adding new carpets to the floors to see if it will fix it.”

A bad idea maybe… Okay, I will work on the normals tomorrow :wink:

Okay guys, I just want to let you know that I found it (a really dumb thing):

I set up the NeighbourFinder wrong: I thought “getDownQuad” was y-1. When I tried using y+1 as the down quad and y-1 as the top quad, I had no problems. It was reproducible by swapping the left and right neighbours as well, so I think this was the problem :slight_smile:
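For anyone hitting the same thing, here is a plain-Java stand-in for that neighbour lookup (the real JME interface works on TerrainQuads; the grid-of-strings here is just to show the orientation). On the 2D quad grid, “down” is the quad at the coordinate plus one, not minus one; mixing these up feeds each quad the wrong neighbour heights, so the edge normals get stitched wrongly, exactly the seam symptom in this thread:

```java
import java.util.HashMap;
import java.util.Map;

public class QuadNeighbours {
    private final Map<String, String> quads = new HashMap<>();

    void put(int x, int y, String quad) { quads.put(x + "," + y, quad); }
    String get(int x, int y)            { return quads.get(x + "," + y); }

    // the orientation that resolved the seams in this thread:
    String topQuad(int x, int y)  { return get(x, y - 1); }
    String downQuad(int x, int y) { return get(x, y + 1); }

    public static void main(String[] args) {
        QuadNeighbours grid = new QuadNeighbours();
        grid.put(0, 0, "A");
        grid.put(0, 1, "B");
        System.out.println(grid.downQuad(0, 0)); // B
        System.out.println(grid.topQuad(0, 1));  // A
    }
}
```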