TerrainLighting.frag and Lighting.frag – different normal mapping


I noticed that TerrainLighting.frag and Lighting.frag handle normal mapping differently.

TerrainLighting.frag (have a look at n.y):



normalHeight = texture2D(m_NormalMap, texCoord * m_DiffuseMap_0_scale);
n = (normalHeight.xyz * vec3(2.0) - vec3(1.0));
n.z = sqrt(1.0 - (n.x * n.x) - (n.y * n.y));
n.y = -n.y;
normal += n * alphaBlend.r;



Lighting.frag (have a look at normal.y):


#if defined(NORMALMAP) && !defined(VERTEX_LIGHTING)
  vec4 normalHeight = texture2D(m_NormalMap, newTexCoord);
  vec3 normal = (normalHeight.xyz * vec3(2.0) - vec3(1.0));
  #ifdef LATC
    normal.z = sqrt(1.0 - (normal.x * normal.x) - (normal.y * normal.y));

  //normal.y = -normal.y;


So, normal.y = -normal.y is commented out in Lighting.frag, while n.y = -n.y is active in TerrainLighting.frag. The two shaders flip the normal map's green channel inconsistently.
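For context, the flip matters because normal maps come in two green-channel conventions (OpenGL-style "green up" vs. DirectX-style "green down"). A minimal sketch of handling both consistently; the NORMALMAP_FLIP_Y define is hypothetical and not part of either shader:

```glsl
vec3 n = normalHeight.xyz * 2.0 - 1.0;
#ifdef NORMALMAP_FLIP_Y
  n.y = -n.y;   // convert DirectX-style (green down) maps to OpenGL-style
#endif
```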

Thanks, fixed in SVN

@Momoko_Fan , why are the normals calculated for every normal map texture? You could blend the normal maps first (the same way the diffuse maps are blended) and then calculate the normal once from the blended result.

Performing 12 calculations of the same normals is very costly for performance.
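A minimal sketch of the blend-first idea, reusing the uniform names from the quoted snippets and assuming the alphaBlend weights sum to 1; this is illustrative, not the actual shader code:

```glsl
// Hypothetical sketch: blend the raw normal-map samples first (like the
// diffuse maps), then unpack and renormalize the result a single time.
vec3 packed = texture2D(m_NormalMap,   texCoord * m_DiffuseMap_0_scale).xyz * alphaBlend.r
            + texture2D(m_NormalMap_1, texCoord * m_DiffuseMap_1_scale).xyz * alphaBlend.g
            + texture2D(m_NormalMap_2, texCoord * m_DiffuseMap_2_scale).xyz * alphaBlend.b;
vec3 n = normalize(packed * 2.0 - 1.0);   // one unpack + normalization instead of one per layer
```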

I implemented some kind of texture blending in the LightBlow shader; you can have a look at it.



Okay, this is fixed too.

@Momoko_Fan , I would suggest doing something like:


n1 = texture2D(m_NormalMap, texCoord * m_DiffuseMap_0_scale).xyz;
normal.rg = mix(normal.rg, n1.rg, alphaBlend.r);
n2 = texture2D(m_NormalMap_1, texCoord * m_DiffuseMap_1_scale).xyz;
normal.rg = mix(normal.rg, n2.rg, alphaBlend.g);
n3 = texture2D(m_NormalMap_2, texCoord * m_DiffuseMap_2_scale).xyz;
normal.rg = mix(normal.rg, n3.rg, alphaBlend.b);


instead of the current additive blending:

normal += n * alphaBlend.r;
normal += n * alphaBlend.g;
normal += n * alphaBlend.b;


You will only need the red and green channels of the normal maps; the blue channel is always constant.

Also, adding channels is not good for lighting. It is better to mix them.
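Once the red and green channels have been mixed, the blue/z component only needs to be rebuilt once at the end. A sketch, assuming the mixed normal.rg is already unpacked into the [-1, 1] range:

```glsl
// Reconstruct z from the blended x/y components; the max() clamp guards
// against a negative radicand when the blended input is not unit-length.
normal.z = sqrt(max(0.0, 1.0 - dot(normal.rg, normal.rg)));
normal = normalize(normal);
```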

My version of texture blending: