[Solved] Parallax Mapping in custom shader

Hello jmonkeys,

I am currently trying to implement parallax mapping in my voxel shader, but I cannot seem to get it working.
Since I already have normal maps implemented and working, I assume there is no problem with the TBN matrix I build in the shaders. Still, to sum it up (and because there might be a bug in there that I just don't notice with the directional light / normal map setup mentioned further down):
When meshing, instead of putting normal, tangent and bitangent into vertex buffers, I put a single byte indicating which direction a face is facing: 0 for faces facing negative x, 1 for faces facing positive x, and so on.
The first assumption I made is that in the vertex shader, when this byte is "0", I can just define the TBN matrix like so:

vec3 t, b, n;
if (inNormal < 0.5) {
    t = vec3(0.0, 0.0, 1.0);
    b = vec3(0.0, 1.0, 0.0);
    n = vec3(-1.0, 0.0, 0.0);
} else if (inNormal < 1.5) {
    // ... same theory for the 5 other faces (full sketch below)
    // could later be replaced with fancy chains of step() calls to get the desired result
}
TBN = transpose(mat3(t, b, n));

to get a matrix that transforms world space to tangent space, completely independent of the order of the vertices. So if I had a quad

2--3
|  |
0--1

I could define the triangles as 0, 1, 3 and 3, 2, 0 as well as 1, 3, 2 and 2, 0, 1 and the TBN matrix should still work, right?
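To make that first assumption concrete, here is what the full branch could look like (a sketch only: indices 0 and 1 are from above, the mapping of 2..5 to the remaining faces is my guess; each triple is chosen so that n == cross(t, b), consistent with the -x case):

vec3 t, b, n;
if (inNormal < 0.5) {        // -x face
    t = vec3( 0.0, 0.0, 1.0); b = vec3(0.0, 1.0, 0.0); n = vec3(-1.0, 0.0, 0.0);
} else if (inNormal < 1.5) { // +x face
    t = vec3( 0.0, 0.0,-1.0); b = vec3(0.0, 1.0, 0.0); n = vec3( 1.0, 0.0, 0.0);
} else if (inNormal < 2.5) { // -y face
    t = vec3( 1.0, 0.0, 0.0); b = vec3(0.0, 0.0, 1.0); n = vec3( 0.0,-1.0, 0.0);
} else if (inNormal < 3.5) { // +y face
    t = vec3( 1.0, 0.0, 0.0); b = vec3(0.0, 0.0,-1.0); n = vec3( 0.0, 1.0, 0.0);
} else if (inNormal < 4.5) { // -z face
    t = vec3(-1.0, 0.0, 0.0); b = vec3(0.0, 1.0, 0.0); n = vec3( 0.0, 0.0,-1.0);
} else {                     // +z face
    t = vec3( 1.0, 0.0, 0.0); b = vec3(0.0, 1.0, 0.0); n = vec3( 0.0, 0.0, 1.0);
}
mat3 TBN = transpose(mat3(t, b, n)); // orthonormal, so transpose == inverse: world -> tangent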
The second assumption is that, as long as my model is neither scaled nor rotated, direction vectors in world space and model space are the same. Is that correct?
That would mean the TBN matrix I create can also transform direction vectors from model space to tangent space.

So now, on to the actual parallax thing:
in the vertex shader, after calculating the TBN matrix, I want to calculate viewPos and lightDirection (that's the direction the sun shines in) in tangent space, so I multiply them by the TBN matrix I just created:

    tangentViewPos = TBN * g_CameraPosition;
    tangentLightDirection = normalize(TBN * m_SunDirection);

And since I read somewhere that interpolating linearly in tangent space might mess things up, I also send the fragment's position to the fragment shader, but in world space:

    worldFragPos = vec3(g_WorldMatrix * vec4(inPosition, 1.0));  

Then, in the fragment shader, I want to get the fragment's position in tangent space so I can calculate the view direction in tangent space:

    vec3 tangentFragPos = TBN * worldFragPos;
    vec3 tangentViewDir = normalize(tangentViewPos - tangentFragPos);
    vec3 finalTexCoord = parallaxMapping(texCoord, tangentViewDir);

with parallaxMapping defined as:

vec3 parallaxMapping(vec3 texCoords, vec3 viewDir) {
    const float height_scale = 0.8;

    // red channel read as depth (1 = deepest), like in the LearnOpenGL example
    float height = (1.0 - texture(m_ParallaxArray, texCoords).r) * height_scale;
    // offset the uv against the view direction; .z stays the array layer
    return vec3(texCoords.xy - (viewDir.xy / viewDir.z * height), texCoords.z);
}
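(As an aside, the same LearnOpenGL chapter also shows a variant with "offset limiting" that simply drops the division by viewDir.z, which bounds the offset at grazing angles; adapted to the array lookup above it would be something like:)

vec3 parallaxMappingOffsetLimited(vec3 texCoords, vec3 viewDir) {
    const float height_scale = 0.8;
    // same depth convention as above: red channel, 1 = deepest
    float height = (1.0 - texture(m_ParallaxArray, texCoords).r) * height_scale;
    // no division by viewDir.z: the uv offset stays bounded at shallow view angles
    return vec3(texCoords.xy - viewDir.xy * height, texCoords.z);
}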

I followed LearnOpenGL - Parallax Mapping, so I guess the parallaxMapping(…) function itself should be working, and that's also why I'm doing it all in tangent space. But while trying to fix my problem I had a look into Lighting.vert, Lighting.frag and Instancing.glsllib, and it seems they do it all in view space. Is there any advantage of one over the other?

And to explain what the above code results in:
the textures definitely move, they do it somewhat smoothly, and they move faster when looking from a lower angle (of course they only move when moving the camera). But for some faces the texture moves with the camera (as in: when I'm facing a block and move the camera to the left, the texture moves to the left, too), and for other faces the texture moves in the opposite direction. None of these cases look like working parallax; it's more like a scrolling background in a 2D game.

As mentioned, the diffuse light component, which I get by dotting tangentLightDirection with the normal I obtain by multiplying the TBN with the values from a normal map, looks like it's working.
Although I have to say it definitely works when I assume the normal I look up in the normal map is (0, 0, 1) and then continue with multiplying it by the TBN and dotting it with tangentLightDirection.
However, when I read the actual normal from the normal map and continue with that value (multiply by TBN, dot with tangentLightDirection), some faces seem to have their normals inverted, in that a face looks lit from the right side while the sun is actually on the left.
These normal maps were created with an online tool, NormalMap-Online,
and when checking the color values in GIMP they give me something around (128, 128, 255) for areas that should be flat, so that's what I would expect.
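(For reference, the standard decode, so that a flat pixel of (128, 128, 255) comes out as roughly (0, 0, 1) in tangent space; m_NormalMap is a placeholder name here. Note that some generators flip the green channel (OpenGL vs. DirectX convention), which would make faces look lit from the wrong side:)

    // color in [0,1] -> vector in [-1,1]; flat (128, 128, 255) decodes to ~(0, 0, 1)
    vec3 mapNormal = normalize(texture(m_NormalMap, texCoord).rgb * 2.0 - 1.0);
    // mapNormal.y = -mapNormal.y; // uncomment if the map uses the flipped-green convention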

Does it make any sense? Can anyone help me out?
Thanks in advance and once more greetings from the shire,
Samwise

Had a similar problem myself while following that exact LearnOpenGL tutorial. g_CameraPosition is in world space, while the TBN matrix from that tutorial converts from model space to tangent space. So multiplying g_CameraPosition by g_WorldMatrixInverse before multiplying by the TBN fixed it for me.

Thanks for your answer.
I tried it out, but unfortunately it didn't fix it.
Can you confirm that "I could define the triangles as 0, 1, 3 and 3, 2, 0 as well as 1, 3, 2 and 2, 0, 1 and the TBN matrix should still work"? Otherwise I start to think the bug might be in the order in which I define the triangles.

If you multiply viewPos (for example) by TBN then this is not doing what you think it’s doing (I think).

Either viewPos is already in world space and you want it in world space (in which case there is nothing to do) or it is in world space and you want it in tangent space… in which case you’d need to multiply by the inverse.

…but why not just do parallax in world space?

Edit: put another way, if you had a position in tangent space and wanted it in world space THEN you’d multiply it by TBN.
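Put in code (a sketch, assuming t, b and n are the orthonormal tangent-frame axes expressed in world space):

    mat3 tangentToWorld = mat3(t, b, n);             // columns are the tangent axes in world space
    mat3 worldToTangent = transpose(tangentToWorld); // orthonormal, so transpose == inverse
    vec3 tangentViewPos = worldToTangent * worldViewPos; // world -> tangent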

As always, thank you for your answer!
In the tutorial I (somewhat) followed, they do it like this:

vec3 T   = normalize(mat3(model) * aTangent);
vec3 B   = normalize(mat3(model) * aBitangent);
vec3 N   = normalize(mat3(model) * aNormal);
mat3 TBN = transpose(mat3(T, B, N));

vs_out.TangentLightPos = TBN * lightPos;
vs_out.TangentViewPos  = TBN * viewPos;
vs_out.TangentFragPos  = TBN * vs_out.FragPos;

And since I define the T, B and N vectors the way I do, and since my model is neither non-uniformly scaled nor rotated, I thought my TBN vectors don't need to be multiplied by the model matrix (the world matrix, I guess); they are also normalized already, so I went straight for:

TBN = transpose(mat3(t, b, n));

however, when I do

TBN = inverse(transpose(mat3(t, b, n)));
tangentViewPos = TBN * ((g_WorldMatrixInverse * vec4(g_CameraPosition, 1.0)).xyz);
tangentLightDirection = normalize(TBN * m_SunDirection);

the parallax seems to almost work, but the normals are inverted: when the sun shines from the top, the bottom face of the block appears brighter.
So when I define tangentLightDirection like this:

tangentLightDirection = normalize(TBN * ((g_WorldMatrixInverse * vec4(m_SunDirection  , 1.0)).xyz));

This seems fixed (well, at least it looks better), which makes me wonder why it makes a difference when I first transform sunDirection to model space and then to tangent space, while I actually thought direction vectors are the same in model space and world space if my model is not scaled / rotated.
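(Writing this down, I notice one possible reason: I pass the direction with w = 1.0, so g_WorldMatrixInverse also applies its translation part to it. A pure direction would need w = 0.0, something like:)

    // w = 0.0 makes the matrix ignore its translation part, as a direction transform should;
    // with w = 1.0 the vector is treated as a point and picks up the inverse translation
    tangentLightDirection = normalize(TBN * (g_WorldMatrixInverse * vec4(m_SunDirection, 0.0)).xyz);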
Could you by any chance please 'confirm' or 'bust' the two assumptions I made (marked in bold)? Because I base a lot on them.
In the meantime I'll assume that direction vectors in model space are not the same as in world space, and see if I can do it all in world space as you suggested.
EDIT: however, I'll keep assuming I can get the T, B and N vectors the way I do, and that I can define the vertices in any order I want without changing the TBN matrix (it obviously won't, but I mean it also shouldn't).

Thanks again

I wonder what the transpose is for.

I don’t really have the mental bandwidth to reverse engineer what’s going on. For my own code (in the IsoSurfaceDemo triplanar mapping), I found it 100x easier to do everything in world space… especially when trying to triplanar map multiple materials.

Also, not looking at the rest of the code, I assume it remembers that in TBN space, z is ‘up’, ie: the height map is manipulating z. I just remember that back when I did it that was a constant source of bugs.

Well, in the tutorial they used the transpose because the transpose of a matrix whose columns are mutually perpendicular unit vectors is the same as its inverse, but way cheaper to calculate (as you probably know); I just thought they used it because OpenGL uses column-major matrices and the TBN would otherwise have to be defined row-wise.
That also means inverse(transpose(mat3(t, b, n))) is the same as mat3(t, b, n), I guess (since for an orthonormal M, transpose(M) == inverse(M), so inverse(transpose(M)) == M).

In the meantime I managed to make it work in world space (although I still transform viewDir to tangent space, because I don't see how I would offset texcoords in world space, or how that would make sense, since the texcoords for the texture lookups need to be in tangent space).
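(Roughly, the fragment shader now looks like this; a sketch, with t, b and n as before and the parallaxMapping from above:)

    // everything stays in world space except the view direction,
    // which is taken to tangent space only for the texcoord offset
    vec3 worldViewDir   = normalize(g_CameraPosition - worldFragPos);
    vec3 tangentViewDir = normalize(transpose(mat3(t, b, n)) * worldViewDir); // world -> tangent
    vec3 finalTexCoord  = parallaxMapping(texCoord, tangentViewDir);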

I guess the bug was in how I defined the T, B and N vectors; I just had to invert some of them to make it work. I don't really get why, since it now looks inconsistent to me, but it works. I guess it's because of how I mesh my chunks, which is why it was important to know whether the order in which I define the vertices makes a difference.
I'll now try to implement the steep parallax mapping that Parallax.glsllib does, since it looks like it is smarter depending on the camera position.
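(In case anyone finds this later, a minimal sketch of steep parallax mapping along the lines of the LearnOpenGL chapter, adapted to the array-texture lookup used above; this is not the actual Parallax.glsllib code:)

vec3 steepParallaxMapping(vec3 texCoords, vec3 viewDir) {
    const float height_scale = 0.8;
    const float numLayers = 32.0;

    float layerDepth = 1.0 / numLayers; // depth covered per layer
    vec2 deltaTexCoords = (viewDir.xy / viewDir.z * height_scale) / numLayers; // uv step per layer

    vec2 currentTexCoords = texCoords.xy;
    float currentLayerDepth = 0.0;
    // same convention as above: red channel inverted, 1 = deepest
    float currentDepthMapValue = 1.0 - texture(m_ParallaxArray, vec3(currentTexCoords, texCoords.z)).r;

    // march along the view ray until the layer depth passes the stored depth
    while (currentLayerDepth < currentDepthMapValue) {
        currentTexCoords -= deltaTexCoords;
        currentDepthMapValue = 1.0 - texture(m_ParallaxArray, vec3(currentTexCoords, texCoords.z)).r;
        currentLayerDepth += layerDepth;
        // (depending on GLSL version, texture() in a non-uniform loop may need textureLod)
    }
    return vec3(currentTexCoords, texCoords.z);
}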

Anyway, I consider it solved so thanks again for helping me out!

If you ever want to look at how I did it and are willing to try and read past the triplanar mapping part, here are the shaders:

I don’t remember how “pretty” they are but they work.