Both images are from the latest test suite. The problem is not new; I’ve seen it before. Does anyone know what the issue is?
I can investigate this further tomorrow, if wanted.
Can you switch normal mapping off and try again?
Try setting shininess to zero
@Sploreg
Shininess=0 fixed the problem. I suspect some fixes in the shader need to be done…
Yeah, it defaults to a higher value. There was a piece of short-circuiting code, where the shininess lives, that caused issues on ATI cards. That issue was fixed, but the default value is now high and needs to be reworked.
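Roughly, the idea is something like this (just a sketch of a typical Blinn-Phong specular term, not the exact jME shader code; the variable names are made up):
[java]
// Sketch only: pow(NdotH, shininess) with a shininess near zero spreads the
// highlight over the whole surface, so the term is usually gated so that
// Shininess=0 disables specular entirely.
float NdotH = max(dot(normal, halfVec), 0.0);
float specularFactor = pow(NdotH, m_Shininess) * step(0.01, m_Shininess);
[/java]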
I set the default back to 0 for the terrain shader and updated the fragment processor to match what was changed in lighting.frag. Hopefully that should help a bit.
@kwando, but you did not try it without normal maps. Possibly the problem is in normal mapping + specular. I have some thoughts about the issue.
Specular should look like this:
That’s why I asked you to test it without normal maps.
@kwando, if you switch normal maps off, there is no issue, right?
I suppose it could be because of the Z channel. “normal.z” can cause such a problem.
What about replacing the code in TerrainLighting.frag (starting at line 274) with the following and testing it with normal maps:
[java]
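// Accumulate only the X/Y of each splat layer's normal map and leave Z alone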
#ifdef NORMALMAP
n = texture2D(m_NormalMap, texCoord * m_DiffuseMap_0_scale).xyz;
normal.xy += n.xy * alphaBlend.r;
#endif
#ifdef NORMALMAP_1
n = texture2D(m_NormalMap_1, texCoord * m_DiffuseMap_1_scale).xyz;
normal.xy += n.xy * alphaBlend.g;
#endif
#ifdef NORMALMAP_2
n = texture2D(m_NormalMap_2, texCoord * m_DiffuseMap_2_scale).xyz;
normal.xy += n.xy * alphaBlend.b;
#endif
#ifdef NORMALMAP_3
n = texture2D(m_NormalMap_3, texCoord * m_DiffuseMap_3_scale).xyz;
normal.xy += n.xy * alphaBlend.a;
#endif
[/java]
I’ve noticed a similar problem in the Lighting shader. It’s because the normals are not normalized.
If I enable LATC, it looks correct.
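That would also explain the LATC case: the LATC branch rebuilds Z as sqrt(1.0 - x*x - y*y), which gives a unit-length vector by construction. A renormalize along these lines (just a sketch, using the variable names from Lighting.frag) should do the same for the non-LATC path:
[java]
// sketch: renormalize the unpacked tangent-space normal
vec3 normal = normalize(normalHeight.xyz * vec3(2.0) - vec3(1.0));
[/java]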
hm… strange.
Actually, normals in Lighting.frag are normalized in nightlies (line 214):
http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/core-data/Common/MatDefs/Light/Lighting.frag?r=9328
Isn’t LATC a texture compression format? Or am I wrong here?
@mifth I will see if it solves the problem. But Sploreg’s fix seems to be working for me.
Actually, I’m confused about why it happens now.
LATC is a DDS texture compression format, as far as I know.
Anyway, Sploreg’s fix works.
I was referring to stable:
[java]
#if defined(NORMALMAP) && !defined(VERTEX_LIGHTING)
vec4 normalHeight = texture2D(m_NormalMap, newTexCoord);
vec3 normal = (normalHeight.xyz * vec3(2.0) - vec3(1.0));
#ifdef LATC
normal.z = sqrt(1.0 - (normal.x * normal.x) - (normal.y * normal.y));
#endif
//normal.y = -normal.y;
#elif !defined(VERTEX_LIGHTING)
vec3 normal = vNormal;
#if !defined(LOW_QUALITY) && !defined(V_TANGENT)
normal = normalize(normal);
#endif
#endif
[/java]
Obviously it’s fixed in the nightly, but I wonder why the once so frequent stable updates have ceased. I’m still waiting for some fixes. It seems I’ll have to switch to nightly.
It may not be fixed in the TerrainLighting shader though; it’s a different shader from the Lighting one.
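If so, the equivalent change there would presumably be to renormalize the blended result where TerrainLighting.frag combines the per-layer normal maps, roughly like this (a sketch only, reusing the accumulation variable from the snippet earlier in the thread):
[java]
// sketch: unpack and renormalize the blended terrain normal before lighting
normal = normalize(normal * vec3(2.0) - vec3(1.0));
[/java]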
I’ll check that. Thanks for the tip.