Normal map Y flipped left/right depending on point of view issue - help :)


OK, then everything is explained.

The issue was that I misunderstood normalType.

I observed that changing it gave something like intensity, but that's not what it is.

Can't we just add something like normalIntensity to the PBR material? Similar to the last change about AmbientLight, but this time just a param for normal intensity.

I think it would help a lot, because re-baking normals every time you think they are too weak or too strong is time consuming.

In the meantime I will try to add something myself.


If you are hacking the PBR shader, you may also be able to get what you want by scaling the Z value of the normalmap and then renormalizing it… after the NORMAL_TYPE calculation.


Oh, I'm looking at the shader, and I see:

#if defined(NORMALMAP)
  vec4 normalHeight = texture2D(m_NormalMap, newTexCoord);
  //Note the -2.0 and -1.0. We invert the green channel of the normal map,
  //as it's compliant with normal maps generated with Blender.
  //for more explanation.
  vec3 normal = normalize((normalHeight.xyz * vec3(2.0, NORMAL_TYPE * 2.0, 2.0) - vec3(1.0, NORMAL_TYPE * 1.0, 1.0)));
  normal = normalize(tbnMat * normal);
  //normal = normalize(normal * inverse(tbnMat));
#else
  vec3 normal = norm;

I thought there would be some color value to scale, but I'm too much of a newbie in shaders to know what to do :smiley:

Could you help? :slight_smile:

Ahh, you mean normalHeight.z * intensity?


…yes, then normalize it again.


something like:

vec3 normal = normalize((normalize(normalHeight.xyz * intensity) * vec3(2.0, NORMAL_TYPE * 2.0, 2.0) - vec3(1.0, NORMAL_TYPE * 1.0, 1.0)));
normal = normalize(tbnMat * normal);



No. You’re scaling x, y, and z… effectively doing nothing, because when you renormalize, they will all go back to what they were.

How much do you understand about how normal vectors work with respect to lighting? Might be time to brush up.
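The no-op being described here is easy to verify outside the shader. A small illustrative sketch in Python (the `normalize` helper just mirrors GLSL's `normalize`):

```python
import math

def normalize(v):
    # Scale v to unit length, like GLSL's normalize().
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

n = (0.3, 0.4, 0.8)  # a sample tangent-space normal (not yet unit length)

# Scaling x, y, AND z by the same factor is undone by normalize():
uniform = normalize(tuple(c * 2.0 for c in n))
print(all(abs(a - b) < 1e-9 for a, b in zip(uniform, normalize(n))))  # True

# Scaling ONLY z actually changes where the normal points:
squashed = normalize((n[0], n[1], n[2] * 0.5))
print(squashed[2] < normalize(n)[2])  # True: the normal now leans more sideways
```

Squashing only z tilts the unit vector away from straight-up, which is exactly what makes the bump look stronger.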


Right, sorry, I do first and think later;

normalHeight.z = normalHeight.z * intensity;
normalHeight = normalize(normalHeight);

Just before these lines? I'm a newbie in shaders; all I've done is discard some pixels, change UV coords, or manipulate color, nothing else.


You are probably safest putting the new code between the normal calculation and multiplying it by the tbnMat.

  vec3 normal = normalize((normalHeight.xyz * vec3(2.0, NORMAL_TYPE * 2.0, 2.0) - vec3(1.0, NORMAL_TYPE * 1.0, 1.0)));
  normal.z *= NORMAL_INTENSITY;
  normal = normalize(normal);
  normal = normalize(tbnMat * normal);  

And then set up a material parameter NormalIntensity mapped to a define NORMAL_INTENSITY.
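In a copied .j3md material definition, wiring that up might look roughly like this (a sketch showing only the relevant fragments; the rest of the PBR definition would stay unchanged):

```
MaterialDef My PBR {
    MaterialParameters {
        // 1.0 leaves normals unchanged; lower values flatten them
        Float NormalIntensity
    }
    Technique {
        Defines {
            NORMAL_INTENSITY : NormalIntensity
        }
    }
}
```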


Thanks, you are awesome again :+1:

@pspeed anyway, is it possible to put it into future versions of JME wrapped in “#if defined(NORMAL_INTENSITY)”, so it doesn't slow down the shader for anyone who doesn't use this param?

I understand it would be like:

  vec3 normal = normalize((normalHeight.xyz * vec3(2.0, NORMAL_TYPE * 2.0, 2.0) - vec3(1.0, NORMAL_TYPE * 1.0, 1.0)));

  #if defined(NORMAL_INTENSITY)
    normal.z *= m_NormalIntensity;
    normal = normalize(normal);
  #endif

  normal = normalize(tbnMat * normal);

Or is this too “hacky” to make it into JME?


It’s a little hacky. I also worry that it may do strange things in some cases at triangle seams, since we are changing the direction that the normals point.


So I might get some issues too if I use values like 2 or more?

Hmm, I thought there would be some always-working solution for normal intensity ;/

Now I'm not sure how to make an “age” param, or how to manipulate the normal intensity of all my textures. I can risk it, but if you say there might be issues…


You are going to be better off sticking to values of 1 or less for intensity, i.e. scale it down but not up.

It should be OK for the cases I can think of, since the edge normals will line up… but given that the direction of the normal map might be different from one triangle to the next, I can imagine strange cases where it would make seams.


I'm curious how Blender and others did it?

Maybe I should just scale the normal map image color intensity? It would take a lot of CPU, but would it always work?

Or maybe mix the normal map with a default normal color in the shader? (The more the second texture is mixed in, the less the normal map is visible.)

In Blender I can give a value of even 200, and their “strength” param works nicely.
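The “scale the image” idea would be the same squash-and-renormalize applied offline, per pixel. A NumPy sketch (the helper name is made up; it assumes an 8-bit RGB tangent-space normal map):

```python
import numpy as np

def bake_normal_intensity(rgb, intensity):
    # rgb: (H, W, 3) uint8 normal map. Unpack [0, 255] -> [-1, 1].
    n = rgb.astype(np.float32) / 255.0 * 2.0 - 1.0
    n[..., 2] *= intensity                         # squash/stretch z only
    length = np.linalg.norm(n, axis=-1, keepdims=True)
    n /= np.maximum(length, 1e-6)                  # renormalize
    # Repack [-1, 1] -> [0, 255].
    return np.clip((n + 1.0) * 0.5 * 255.0, 0.0, 255.0).astype(np.uint8)

# A flat pixel (128, 128, 255) stays (almost) flat for any intensity:
flat = np.array([[[128, 128, 255]]], dtype=np.uint8)
print(bake_normal_intensity(flat, 0.5))
```

Note this bakes a fixed strength into the texture, so changing it still means regenerating the image; the shader parameter avoids that round trip.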


No, it’s the same issue. We are taking a vector and squashing it in Z. It “should” line up fine with other vectors squashed in Z.

…but it would make me uncomfortable to build a non-physically-correct thing into PBR that may not work right in all cases.

I’m starting to feel like maybe we need a “NotPBR” shader that’s basically a hacked-to-death version of PBR… or maybe someone can finally convert it to shader nodes and then this would just be a node.


If you say so, I believe you. I thought mixing a default normal value with the normal map colors would work,
but if you say not, then OK.

There is some trick in Materialize: they have “pre contrast” and “final contrast”.

Isn't this just contrast? I would like to do something similar to manipulate the image then.

Also, I found a link:

Isn't this the same as Blender's normal strength param?

I'm a shader noob, so I'm just asking.


They are basically doing the same thing I suggested. It’s probably fine.


I just set normalType to 1. I might be missing something, but why do I still see the same issue there? :frowning:

I need to search, because maybe it's an issue in my code somewhere, but I think I set normalType to 1 everywhere.

Value 1 is the default (the JME-proper one), right?



Oh my god…

@pspeed, could you tell me what I'm doing wrong?

  • Both models use the SAME material (normalType 1),
  • it is the SAME mesh (the head is just a copy-pasted part),
  • exported the same way.

See the screenshots; I can't believe the overall mesh has the issue while the copy-pasted head doesn't?




Back to the topic.

Based on what you said, @pspeed, I understand normalType is some kind of Y value (the green channel?).
You provided the comment:

//The type of normal map: -1.0 (DirectX), 1.0 (OpenGL)

But as I understand it, it should affect just the light direction, not the issue I had? (One side had light from above, and the other had light from below.)
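For what it's worth, a Y flip does change which side of a bump faces the light. A tiny Lambert-term sketch in Python (directions invented purely for illustration):

```python
def lambert(normal, light_dir):
    # Diffuse term: dot(N, L) clamped at zero; both inputs unit length.
    return max(0.0, sum(a * b for a, b in zip(normal, light_dir)))

n_up = (0.0, 0.5, 0.8660254)     # bump tilted toward +Y ("up" in the map)
n_down = (0.0, -0.5, 0.8660254)  # same bump with the green channel flipped

light_from_above = (0.0, 0.70710678, 0.70710678)

print(lambert(n_up, light_from_above) > lambert(n_down, light_from_above))  # True
```

With the flipped green channel, bumps that should catch light from above catch it from below instead, which is consistent with the up/down lighting flip described here.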

Anyway, here is some proof this works oddly. I hope it's me who did something wrong, but I tested a lot, and I'm not sure why the head of the same mesh, exported the same way with the same material, gives different results.

For one model I need to use -1 to get proper lighting, for the second I need 1…

And it's truly the same mesh; one is just the head copy-pasted.

I'm not sure if it's Blender that randomizes it, or JME.

(Both use the same material, which also means the same textures.)

Exported whole mesh (copy-pasted the whole mesh into a new .blend file):

Exported just the head (copy-pasted the head mesh into a new .blend file):




Please note, I think it's a Blender issue. (But I'm still not sure.)


When I generate tangents via SDK -> scene -> model -> generate tangents for both, and then use normalType -1 (I don't think -1 is the default?), both models work correctly with the same normalType.

But I don't know why -1? Anyway…

@pspeed, could you tell me, is -1 the correct default value?

If yes, then it looks like Blender is doing something odd, where it exports a different tangent Y value for each of the above models.

I think -1 is the default, because when I remove the param from the material text and reload, the result is the same as with the -1 value.

So it looks like Blender's tangent generation is random…

Anyway, I'm not quite sure why the Y value makes the two sides' lighting different, but since you agree it's a Blender issue, I'm fine with it as long as all models work the same when I generate tangents via the SDK.