IsoSurface Demo - Dev blog experiment... (released)

If more than one of you is actually reading these things then let me know.

The recent update saw the porting of the trilinear mapping material.

Instead of porting this one directly, I ported it in stages because the old one had some bugs that I was trying to track down.

Step one was to copy Lighting.j3md and associated files to TrilinearLighting.*.

The next step was just to get a basic texture working. Unlike most meshes, this mesh derives its texture coordinates from 3D space. So the first thing was to turn the texCoord into a vec3, calculate the world position, and pass it to the fragment shader in that varying. In a previous post, I mentioned that we would want the actual worldOffset of the camera, and this is where we need it. Terrain is generated in 0,0-based chunks and then translated relative to the camera. So in order to know the real world position of any point we need to know what offset to add (i.e., the x,z camera location).
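Roughly, the vertex shader ends up doing something like the sketch below. m_WorldOffset is an assumed name for the camera's x,z offset; the rest follows JME's standard naming.


    attribute vec3 inPosition;
    uniform mat4 g_WorldViewProjectionMatrix;
    uniform mat4 g_WorldMatrix;
    uniform vec2 m_WorldOffset;   // assumed name: camera's x,z world offset
    varying vec3 texCoord;

    void main() {
        // Chunks are built around 0,0 and translated relative to the camera,
        // so the true world position is the world-space vertex position plus
        // the camera's x,z offset.
        vec4 wPos = g_WorldMatrix * vec4(inPosition, 1.0);
        texCoord = wPos.xyz + vec3(m_WorldOffset.x, 0.0, m_WorldOffset.y);

        gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
    }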

After that, I modified the .frag only slightly to deal with the vec3 texture coordinate. Basically, I just arbitrarily use xz to look up colors.
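In other words, something along these lines (the tiling scale here is just a placeholder):


    // Treat the world-space x,z as a regular 2D texture coordinate
    vec4 diffuseColor = texture2D(m_DiffuseMap, texCoord.xz * 0.25);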

This produces a very bland terrain. The texture is clearly repeating all over the place and it stretches around the vertical edges:

So, enter trilinear mapping.

Trilinear mapping takes the world surface normal and then mixes three different textures for the different x,y,z axes. So, the top/bottom has a texture, the north/south has a texture, and so on. It’s important that the world normals are right, though, and it’s very hard to debug that from a fully textured terrain. So first I just set the surface color to the normal to make sure it made sense:
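The debug output was something like this, remapping the -1..1 normal into the 0..1 color range so the negative components are still visible:


    // Visualize the world normal as a color
    gl_FragColor = vec4(worldNormal * 0.5 + 0.5, 1.0);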

Once I’d confirmed that the normals are correct, then it’s just a matter of calculating the blend:


vec3 blend = abs(worldNormal);
blend /= blend.x + blend.y + blend.z;

…then grabbing three colors from the different textures and mixing them together based on that.


    vec4 xColor = getColor(m_DiffuseMapX, m_DiffuseMapX, m_NormalMapX, texCoord.zy, lowMix, normalX);    
    vec4 yColor = getColor(m_DiffuseMapY, m_DiffuseMapY, m_NormalMapY, texCoord.xz, lowMix, normalY);    
    vec4 zColor = getColor(m_DiffuseMapZ, m_DiffuseMapZ, m_NormalMapZ, texCoord.xy, lowMix, normalZ);    
    vec4 diffuseColor = xColor * blend.x
                        + yColor * blend.y
                        + zColor * blend.z;

getColor() is a function I ported directly. It takes two textures, a hi res and a low res, and samples them with different fractal noise and mixes the results together based on the lowMix value. lowMix is set based on the distance from the camera but is never lower than 0.5.
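I won't reproduce getColor() here, but conceptually the lowMix part is something like the following sketch. wPosition stands in for the world position varying and the range uniforms are made-up names, not the real ones:


    // Blend toward the low-res sample with distance, never dropping below 0.5
    float dist = length(wPosition - g_CameraPosition);
    float lowMix = max(0.5, smoothstep(m_LowResStart, m_LowResEnd, dist));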

In this picture, I just have dirt for the x and z axes and a sort of stone gray texture for the y axis. Eventually I will replace this with grass but this is what the trilinear mapping looks like so far:

Because of the fractal noise introduced into the texture lookups, the tiling virtually disappears. Furthermore, as we move in closer to a particular surface we automatically get increased detail.

Before I went any further, I needed to get bump mapping working. This was the biggest issue with the old shader as it caused a strange brightness to affect the terrain from certain view directions. Basically, I was totally messing up the tangent space matrix. I tried for a while to get a proper tangent basis set up in the vertex shader to let JME’s normal bump mapping work as much as possible, but I just couldn’t get it to work. After much debugging, I eventually got an accurate matrix, but interpolating the various vectors left odd bright patches on some of the dramatic corners. I will spare you the hundred tangent-mapped, etc. screenshots I took trying to track this down.

Ultimately, I bypassed the tangent matrix altogether. The lighting direction is now calculated in view space just like JME does without normal maps. I then calculate the world-based tangents in the frag shader… well, essentially, it’s unnecessary to calculate tangents there at all because we sample the textures in three different axis directions. I just arbitrarily chose a tangent basis for each one (which is what the GPU Gems article does, too, by the way). So I calculate the final fragment normal in world space and then rotate it into view space in the fragment shader. If I ever switch to a world-based lighting direction instead of a view-space one then I can remove this extra matrix multiply in the frag shader. I will do it… but later.
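That extra world-to-view rotation is basically one matrix multiply at the end of the frag shader; a sketch, assuming g_ViewMatrix is bound and viewLightDir is whatever varying carries the view-space light direction:


    // Rotate the final world-space normal into view space and light it there
    vec3 viewNormal = normalize((g_ViewMatrix * vec4(worldNormal, 0.0)).xyz);
    float diffuse = max(dot(viewNormal, normalize(viewLightDir)), 0.0);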

Anyway, here is normal mapping with a test normal map I created that makes it easy to see when things are right or wrong:


(It’s a cool texture for testing bumps.)

So at this point, all that was left was to use a different texture for the top and then add the noise-based bordering around it to give it a nice edge. Why do you need that? Well, here is what straight trilinear mapping looks like with a grass texture:

It’s "pretty’ but “realistic” isn’t a word I’d ever use to describe it.

But, at least that part was pretty easy to accomplish. Basically, just sample another texture and use one or the other based on whether the normal points up or down. To avoid conditional branching, I do this:


    // Top will be 1.0 if the normal points up, 0.0 otherwise
    float top = step(0.0, worldNormal.y);

    // Select the top or the bottom texture based on the sign of y
    yColor = topColor * top + yColor * (1.0 - top);

The part where I give it a border is the most complicated part of this shader. There is a little bit of magic math involved but mostly it makes sense. Here are the basic steps it performs (a rough sketch in shader code follows the list):

  1. Set a threshold value to the worldNormal.y plus a noise lookup offset.
  2. If the threshold is greater than 0.707 (indicating that the normal is facing up more than 45 degrees) then we additionally bias the up mix by 10x the noise offset from above. This was found through trial and error.
  3. Calculate some overlapping edge curves using smoothstep. The first edge factor is smooth stepped from 0 to 1 for threshold values 0.5 to 0.72 (from 30 degrees to over 45 degrees). The second edge factor is smooth stepped from 0.72 to 0.75. These values are multiplied together to get a smooth, two-sided (lopsided) curve.
  4. The top color is mixed with a very darkened version of itself based on this smooth curve.
  5. The y blend is biased a little more based on the outcome of the edge curve.
  6. Finally, the x axis and z axis textures are darkened a bit as they get closer to the edge.
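
As promised, a rough sketch of those steps in shader code. The thresholds come straight from the list above, but the noise lookup, the darkening factor, and the bias amounts are stand-ins, not the real values:


    // Up-ness of the normal plus a fractal noise offset (step 1); step 2's
    // extra 10x bias for mostly-up surfaces is omitted here for brevity
    float threshold = worldNormal.y + noiseOffset;

    // Two overlapping smoothstep curves multiplied into one lopsided bump
    // that peaks around the 45 degree mark (step 3)
    float edge = smoothstep(0.5, 0.72, threshold)
                 * (1.0 - smoothstep(0.72, 0.75, threshold));

    // Mix the top color with a heavily darkened copy of itself (step 4);
    // steps 5 and 6 then nudge blend.y and darken the x/z colors similarly
    topColor = mix(topColor, topColor * 0.25, edge);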

The final results look like this:

And now we have grass that only grows on the top and has a well defined edge based on surface normal and some fractal noise.

The trilinear shader files are located here:
https://code.google.com/p/simsilica-tools/source/browse/trunk/#trunk%2FIsoSurface%2Fassets%2FMatDefs

I simply commented out the portions of Lighting.* that I’m not using so they are a bit messy in that respect but I tried to document the parts I added as well as I was able.
