TangentBinormalGenerator, could someone explain to me what it does?

Hi,

I'm trying to learn how to use shaders at the moment. In my program I generate the buffers (vertices, texCoords, normals, indices) for a flat area made up of some quads. That works, but when I applied the lighting material to it (as in the material tutorial), this happened:

Only when I added

TangentBinormalGenerator.generate(mesh)

everything looked OK (a flat gray per-pixel-lit plane).

But I have no clue why. I thought correct normals would be sufficient to calculate the lighting? How does the TangentBinormalGenerator work?

Best,
Stefan


Do you use normal maps or parallax maps? They require tangents.

You can probably read more about what that means if you do a web search for “tangent space”.

Otherwise, your mesh was messed up before and somehow running it through the generator must have fixed it… but it shouldn’t have.


To be honest, I never understood why you need tangents for normal mapping either; at least in most computer graphics papers this isn't necessary, and the varying normal plus the offset from the normal map should suffice.

Yes, I used a normal map in the example. That explains why I needed the TBGenerator.

Thanks for the hint! I found this tutorial which explains tangents and bitangents some more.

Normal maps are almost 100% of the time in tangent space (a frame of reference localized to each vertex), but they could be created in any space you want. When you do lighting calculations, you need to do them in the same space; otherwise your light/camera vectors won't be in the correct space.

A normal map gives you the Z axis, but to form a basis you need 3 axes (the other two being the tangent and binormal). The generator creates tangent vectors for you, and the binormal is normally created in the shader (cross product between the two). Then you can create a tangent-space matrix which you can use to transform the light vectors.
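To make that concrete, here is a minimal plain-Java sketch (illustration only, not jME's or the shader's actual code): derive the binormal via cross product, then use the tangent/binormal/normal vectors as the columns of a TBN matrix to bring a tangent-space normal-map sample back into model space.

```java
// Illustrative sketch: build a TBN basis and transform a normal-map sample.
// All names here are hypothetical, not part of the jME API.
public class TbnDemo {
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    // Multiply v by the matrix whose columns are tangent, binormal, normal.
    static double[] tbnTransform(double[] t, double[] b, double[] n, double[] v) {
        return new double[] {
            t[0] * v[0] + b[0] * v[1] + n[0] * v[2],
            t[1] * v[0] + b[1] * v[1] + n[1] * v[2],
            t[2] * v[0] + b[2] * v[1] + n[2] * v[2]
        };
    }

    public static void main(String[] args) {
        double[] normal   = {0, 0, 1};              // vertex normal (model space)
        double[] tangent  = {1, 0, 0};              // follows the texture U direction
        double[] binormal = cross(normal, tangent); // {0, 1, 0}, done in the shader normally

        // A "flat" normal-map texel (0.5, 0.5, 1.0) decodes to (0, 0, 1):
        double[] mapNormal  = {0, 0, 1};
        double[] modelSpace = tbnTransform(tangent, binormal, normal, mapNormal);
        System.out.printf("%.1f %.1f %.1f%n",
                modelSpace[0], modelSpace[1], modelSpace[2]);
        // prints 1.0 0.0 0.0? No -- prints 0.0 0.0 1.0: the unperturbed
        // map normal lands exactly on the vertex normal, as expected.
    }
}
```

In a real shader you would do the same multiplication per fragment (or transform the light vector into tangent space instead, which is the cheaper direction).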

@wezrule said: Normal maps are almost 100% of the time, in tangent space (a frame of reference localized for each vertex), but they could be created in any space you want. When you do light calculations, you need to do them in the same space. Your light/camera vectors won't be in the correct space.

A normal map gives you the Z axis, but to form a basis, you need 3 axes (the other 2 being tangent and binormal). The generator creates tangent vectors for you, and the bi-normal is normally created in the shader (cross product between the 2). Then you can create a tangent space matrix which you can use to transform the light vectors

This. Could have been a response to empire, also.

Regular normal maps assume that the baseline of texture space coincides with the tangent vector. Without tangent space, you don't know what direction to project the normal vector from the normal map. You can't just use world or model space because the triangle could be rotated, etc…

Without special normal maps I don’t know how you’d do it without tangents, really.

Thanks everybody! I understand why they are necessary now.

@Empire Phoenix said: To be honest I never did understand why you need tangents for normal mapping as well, at least to most computer graphics papers this is not necessary, and the varying normal + the offset from the normal map should suffice.
That's like saying "we don't need the Z axis in a 3D world". Normals in a normal map are usually in tangent space because the normal in the map will still be correct if the model is deformed (as in bone animation). The normals in a model's vertex buffer are in model space; each one is the Z axis of its tangent space. Now we want to express the tangent-space normal we got from the normal map in view space (because that's where we compute lighting). But we only have one axis… so there is no way to build a transformation matrix.

The only way is to generate tangents (and binormals, but those are computed later in the shader) so that you have a real coordinate system.
The tangent and binormal follow the U and V coordinates of your normal map, so you can generate them from the normals and texture coordinates… with some crazy maths…

If you've seen a normal mapping implementation without tangents, that's probably because the normals in the map were not in tangent space (some implementations use model space, but then the normals are wrong if the mesh deforms; some even use world space, but then the object has to be static…).


Any reason for it not to be automatic?

if (geom.getMesh().getBuffer(VertexBuffer.Type.Normal) != null) {
    TangentBinormalGenerator.generate(geom);
}

If it's for performance, a boolean could even be set (on Mesh) to prevent it from happening, in case there is any reason not to compute it. Is there any?

Several reasons for the general case:

  1. it’s only needed if you plan to use normal mapping or parallax mapping
  2. it may already have tangents
  3. it may be time consuming, so…
  4. tangent generation should be done as a preprocessing step and baked into the model
  5. it turns into hidden magic something you should definitely be aware of (“Why did my mesh triple in size?”)
  6. For certain meshes (I’m looking at you block world games), it’s easier to just set the tangents when the normals are set.

It seems the SDK will run it for you automatically when you set a material, but as I recall you can also run it manually. Just load the model in the SDK, generate tangents, save it. Done.
