Hi!
I noticed that TerrainLighting.frag and Lighting.frag have different normal-mapping code.
TerrainLighting.frag (have a look at n.y):
[java]
#ifdef NORMALMAP
normalHeight = texture2D(m_NormalMap, texCoord * m_DiffuseMap_0_scale);
n = (normalHeight.xyz * vec3(2.0) - vec3(1.0));
n.z = sqrt(1.0 - (n.x * n.x) - (n.y * n.y));
n.y = -n.y;
normal += n * alphaBlend.r;
#endif
[/java]
Lighting.frag (have a look at normal.y):
[java]
#if defined(NORMALMAP) && !defined(VERTEX_LIGHTING)
vec4 normalHeight = texture2D(m_NormalMap, newTexCoord);
vec3 normal = (normalHeight.xyz * vec3(2.0) - vec3(1.0));
#ifdef LATC
normal.z = sqrt(1.0 - (normal.x * normal.x) - (normal.y * normal.y));
#endif
//normal.y = -normal.y;
…
…
…
[/java]
So, normal.y = -normal.y is commented out in Lighting.frag, while the equivalent n.y = -n.y is active in TerrainLighting.frag.
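To make the mismatch concrete, here is a minimal Python sketch (not shader code; the texel values are made up) of the decode both shaders perform: map the texture value from [0,1] to [-1,1], reconstruct z so the normal is unit length, and optionally flip y. The same texel comes out with opposite y depending on whether the flip line is active:

```python
import math

def decode_normal(r, g, b, flip_y=False):
    """Decode a tangent-space normal from [0,1] texture values to [-1,1],
    reconstructing z from x and y as both shaders do."""
    n = [2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0]
    # Reconstruct z so the vector is unit length (blue channel is implied).
    n[2] = math.sqrt(max(0.0, 1.0 - n[0] * n[0] - n[1] * n[1]))
    if flip_y:  # active in TerrainLighting.frag, commented out in Lighting.frag
        n[1] = -n[1]
    return n

# The same texel decoded by the two paths disagrees in y:
terrain = decode_normal(0.5, 0.75, 0.5, flip_y=True)    # y = -0.5
lighting = decode_normal(0.5, 0.75, 0.5, flip_y=False)  # y = +0.5
```

With an identical normal map, the two materials would therefore light bumps as if they faced opposite directions along y.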
Thanks, fixed in SVN
@Momoko_Fan, why are normals calculated separately for every normal map texture? You could blend the normal maps first (just like the diffuse maps) and then calculate the normals once for the blended result.
Doing 12 calculations of the same normals is very costly for performance.
I implemented some kind of texture blending in the LightBlow shader. You can have a look at it.
Okay this is fixed too
@Momoko_Fan, I would suggest doing:
[java]
…
n1 = texture2D(m_NormalMap, texCoord * m_DiffuseMap_0_scale).xyz;
normal.rg = mix(normal.rg, n1.rg, alphaBlend.r);
…
n2 = texture2D(m_NormalMap_1, texCoord * m_DiffuseMap_1_scale).xyz;
normal.rg = mix(normal.rg, n2.rg, alphaBlend.g);
…
n3 = texture2D(m_NormalMap_2, texCoord * m_DiffuseMap_2_scale).xyz;
normal.rg = mix(normal.rg, n3.rg, alphaBlend.b);
…
//INSTEAD OF
…
normal += n * alphaBlend.r;
…
normal += n * alphaBlend.g;
…
normal += n * alphaBlend.b;
…
[/java]
You will only need the red and green channels of the normal maps; the blue channel is always reconstructed, so it is effectively constant.
Also, adding the channels together is not good for lighting. It would be better to mix them.
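The difference between the two blend modes can be shown with a small Python sketch (illustrative only; the xy values and weights are made up). Additive accumulation lets the weights sum past 1, while a sequential mix() keeps the result bounded by the inputs:

```python
# Hypothetical decoded xy components of two tangent-space normals.
n1 = (0.6, 0.0)
n2 = (0.0, 0.8)
w = (1.0, 0.5)  # alphaBlend.r and alphaBlend.g for the two layers

# Additive accumulation (the current shader): the weighted layers just
# pile up, so one layer can drown out the others.
add_xy = (n1[0] * w[0] + n2[0] * w[1],
          n1[1] * w[0] + n2[1] * w[1])

def mix(x, y, t):
    """GLSL-style linear interpolation."""
    return x * (1.0 - t) + y * t

# Sequential mix() (the suggestion above): each layer interpolates toward
# its own map, so the result stays in range regardless of the weights.
base = (0.0, 0.0)
m = (mix(base[0], n1[0], w[0]), mix(base[1], n1[1], w[0]))
mix_xy = (mix(m[0], n2[0], w[1]), mix(m[1], n2[1], w[1]))

# add_xy -> (0.6, 0.4), mix_xy -> (0.3, 0.4): the additive path keeps the
# full first-layer contribution even where the second layer is blended in.
```

Either way, the blended xy would still need z reconstructed and the vector normalized before lighting.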
My version of texture blending:
http://code.google.com/p/jme-glsl-shaders/source/browse/assets/Shaders/LightBlow/?r=d097a2700d84f741b779fe7663ac16cf9e12f88b