[Solved] Z-fighting: The Quest

It’s been a constant battle against z-fighting while working on jmeplanet. When adding an ocean, I simply create a sphere that intersects with the planet’s terrain. This causes major z-fighting as I pull the camera away. The solution came in two parts.

  1. I modified the terrain’s frag shader to discard anything that is below sea level.
  2. At a certain threshold the ocean node is switched to Bucket.Sky so it is always rendered behind the planet’s terrain.
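Step 1 looks something like this as a fragment-shader sketch. `m_SeaLevel` and `vHeight` are illustrative names, not necessarily what jmeplanet actually uses:

```glsl
// Sketch of the step-1 discard. m_SeaLevel and vHeight are illustrative
// names; SeaLevel would be set from Java with material.setFloat("SeaLevel", ...).
uniform float m_SeaLevel;
varying float vHeight; // terrain altitude, computed in the vertex shader

void main() {
    if (vHeight < m_SeaLevel) {
        discard; // submerged terrain never writes color or depth
    }
    // ... regular terrain shading continues here ...
}
```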

This seemed to work great until I added a moon. Now I have this:

The moon is behind the planet, but because the ocean is set to Bucket.Sky it looks like it is in front.

I’m looking for suggestions on how I can tell the ocean node to always render behind the planet’s terrain node, but render with normal depth within the rest of the scene.

Or maybe the way I’m dealing with z-fighting is just dumb, period, and there is a better way?


You can directly control the order of rendering if you register your own GeometryComparator for the bucket you are interested in. I can tell you that you do NOT want to put the ocean in the sky bucket, though.

Can’t you adjust the shader to render flat blue for the ocean and just have the ocean drawn as part of the same geometry as the land? (Or you could even influence the shade of the blue based on the depth that would be there.)

Sky bucket = behind everything.

@madjack said:
Sky bucket = behind everything.

And drawn last. Which is the fun part to wrap one's head around. :)

Yeah. I almost wrote: “It’s the last thing being put into place, or the first, depending on how you look at it.” But I decided on a less confusing answer / fact. :wink:


Interesting, but render buckets seem to only affect the whole scene. I need to selectively render the ocean behind its own terrain, but not behind other more distant nodes (like the moon).

What is happening now is:

  1. Render planet ocean
  2. Render moon terrain
  3. Render planet terrain

Conceptually, what I need to happen is:

  1. Render moon terrain
  2. Render planet ocean
  3. Render planet terrain

It gets more complex if the moon happens to have an ocean:

  1. Render moon ocean
  2. Render moon terrain
  3. Render planet ocean
  4. Render planet terrain

This is assuming the moon is actually behind the planet. If the moon was in front, everything would need to be switched.

Basically I need to render the two layers on a node-by-node basis… not a whole scene… Maybe I should render to a texture and then use billboards? Seems a little hackish.


Originally I did do this, but I want the user to be able to dive through the ocean and see the sea floor. Also, I want to use a different material for the ocean than I do for the terrain.
@aaronperkins said:
Originally I did do this, but I want the user to be able to dive through the ocean and see the sea floor. Also, I want to use a different material for the ocean than I do for the terrain.

Maybe you should rethink your strategy here and draw the ocean (as you explained above) only when you get really close to the planet. I think you're getting ahead of yourself here. I'm not even sure drawing the ocean as an ocean is really the way to go at this point.

What I would do is draw the ocean with a simple material when viewed from space, then as you get closer you can switch to different materials.

First, do not put the ocean in the sky bucket. That will always get drawn at infinity, essentially.

Second, whatever bucket you stuck it in, add a GeometryComparator that sorts things in the order you want. You can do this by assigning them layer numbers as user data, or whatever other strategy you want to use.

I had to do my own sorting in Mythruna since my geometry is large and intertwined and Z sorting was always wrong. I sort by material but you can sort by whatever criteria that you want.
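The layer-number idea can be sketched in plain Java. In jME3 the real hook is a `GeometryComparator` (essentially a `Comparator` over geometries) registered on a bucket; here `Item` and its `layer` field are stand-ins for a geometry carrying a layer integer as user data, so the names are illustrative, not jME3 API:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class LayerSortSketch {

    /** Stand-in for a geometry carrying a "layer" integer as user data. */
    static final class Item {
        final String name;
        final int layer; // lower layers are drawn first (farther "back")
        Item(String name, int layer) { this.name = name; this.layer = layer; }
    }

    /** The ordering a custom comparator would impose: ascending by layer. */
    static List<String> renderOrder(List<Item> queue) {
        List<Item> sorted = new ArrayList<>(queue);
        sorted.sort(Comparator.comparingInt(i -> i.layer));
        List<String> names = new ArrayList<>();
        for (Item i : sorted) names.add(i.name);
        return names;
    }

    public static void main(String[] args) {
        // The case from the thread: the ocean must draw after the moon
        // but before (behind) its own planet's terrain.
        List<Item> queue = List.of(
                new Item("planetTerrain", 2),
                new Item("moonTerrain", 0),
                new Item("planetOcean", 1));
        System.out.println(renderOrder(queue));
        // [moonTerrain, planetOcean, planetTerrain]
    }
}
```

Note the layer numbers would have to be assigned per frame based on which body is nearer the camera, which is exactly the moon-in-front/behind flip described above.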


The complexity of the material doesn’t really come into play here. It’s the fact that the ocean is a separate mesh from the terrain, and horrible z-fighting happens where the ocean intersects the terrain as the camera moves away. It’s a separate mesh because jME3 doesn’t allow multiple materials per geometry.


I’ll play around with your suggestion. Which bucket would you suggest overriding? I’m trying to make this library general purpose for others to use. Seems a little hackish to override an existing bucket. A bucket someone else might be using for their particular game. Is there a way to add your own bucket without modifying the jme3 source? Thanks!

You aren’t changing the bucket… just how it sorts things inside itself.

Actually, if the water was put in the transparent bucket then it should always be drawn after the opaque stuff. So you could just do that.

Maybe you could try polygonOffset: keep the water in the opaque bucket, let the depth buffer do its job, and the offset may prevent z-fighting at far distances.

By the time they get close to the water, the z-fighting will no longer be an issue, so you can then switch to two separate materials/layers/etc. Trying to keep the same number of objects/detail/etc. from space to land is never going to work anyway…

I seem to have found a solution that works great.

I implemented a logarithmic depth buffer in the vertex shader. Basically it more evenly distributes the depth buffer’s precision allowing very far and near objects to be rendered without z-fighting. Works great!
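As a sanity check on why this works, the remapping (the formula quoted later in the thread) can be evaluated outside GLSL. A minimal sketch in plain Java, assuming C is a tunable constant (often 1.0) and far is the frustum far plane: after the perspective divide, the resulting depth stays inside [-1, 1] across many orders of magnitude of distance, instead of cramming nearly all precision next to the near plane.

```java
public class LogDepthSketch {

    /**
     * Logarithmic remapping of clip-space depth:
     *   z = (2 * log(C*w + 1) / log(C*far + 1) - 1) * w
     * where w is the clip-space w coordinate (eye-space distance for a
     * standard projection), C tunes precision near the camera, and far
     * is the frustum far-plane distance.
     */
    static double logDepth(double w, double c, double far) {
        return (2.0 * Math.log(c * w + 1.0) / Math.log(c * far + 1.0) - 1.0) * w;
    }

    public static void main(String[] args) {
        double far = 1.0e7; // planetary scale
        double c = 1.0;
        // After the perspective divide, depth = z / w must stay in [-1, 1].
        for (double w : new double[] {0.1, 1.0, 1000.0, 1.0e6, far}) {
            System.out.printf("w=%.1e  depth=%.6f%n", w, logDepth(w, c, far) / w);
        }
        // depth rises monotonically and reaches 1.0 exactly at w = far
    }
}
```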

Here is the article:


The only drawback so far is that the vertex shader change must be applied to all materials in the scene that may interact with the planet, or their depth won’t be drawn correctly.

Thanks to all that helped!

Another way is to have two viewports. One for close objects and another for farther objects, and to enable depth clearing on the first viewport.

Many users here with space games had success with this approach.

@Momoko_Fan said:
Another way is to have two viewports. One for close objects and another for farther objects, and to enable depth clearing on the first viewport.
Many users here with space games had success with this approach.

This may fail also, because z-fighting starts even when rendering the same planet from low orbit: far polygons start to flicker.

KayTrace is right, the scales involved in rendering a planet mean that z-fighting starts pretty early when pulling the camera away.

The logarithmic depth buffer is rather magical. It just works. I wish it were a standard option in jME3 so I didn’t have to add it to every material manually.

Can someone explain how to implement this in JME3 for a novice monkey?

Hi there,

I’m trying to implement the logarithmic depth buffer in my terrain program to minimise z-buffer conflicts. I’ve copied TerrainLighting.j3md, .vert, and .frag into my project.

I assume that the logarithmic code goes into the vertex shader, but being a novice I’m not sure of the format or where I should put it.

This is the code for the vertex shader. The code to insert is:

 z = (2*log(C*w + 1) / log(C*Far + 1) - 1) * w

could someone tell me where it should go ?


uniform mat4 g_WorldViewProjectionMatrix;
uniform mat4 g_WorldViewMatrix;
uniform mat3 g_NormalMatrix;
uniform mat4 g_ViewMatrix;

uniform vec4 g_LightColor;
uniform vec4 g_LightPosition;
uniform vec4 g_AmbientLightColor;

uniform float m_Shininess;

attribute vec3 inPosition;
attribute vec3 inNormal;
attribute vec2 inTexCoord;
attribute vec4 inTangent;

varying vec3 vNormal;
varying vec2 texCoord;
varying vec3 vPosition;
varying vec3 vnPosition;
varying vec3 vViewDir;
varying vec3 vnViewDir;
varying vec4 vLightDir;
varying vec4 vnLightDir;

varying vec3 lightVec;

varying vec4 AmbientSum;
varying vec4 DiffuseSum;
varying vec4 SpecularSum;

varying vec4 wVertex;
varying vec3 wNormal;

// JME3 lights in world space
void lightComputeDir(in vec3 worldPos, in vec4 color, in vec4 position, out vec4 lightDir){
    float posLight = step(0.5, color.w);
    vec3 tempVec = position.xyz * sign(posLight - 0.5) - (worldPos * posLight);
    lightVec.xyz = tempVec;  
    float dist = length(tempVec);
    lightDir.w = clamp(1.0 - position.w * dist * posLight, 0.0, 1.0);
    lightDir.xyz = tempVec / vec3(dist);
}

void main(){
    vec4 pos = vec4(inPosition, 1.0);
    gl_Position = g_WorldViewProjectionMatrix * pos;
    #ifdef TERRAIN_GRID
    texCoord = inTexCoord * 2.0;
    #else
    texCoord = inTexCoord;
    #endif

    vec3 wvPosition = (g_WorldViewMatrix * pos).xyz;
    vec3 wvNormal  = normalize(g_NormalMatrix * inNormal);
    vec3 viewDir = normalize(-wvPosition);

    vec4 wvLightPos = (g_ViewMatrix * vec4(g_LightPosition.xyz,clamp(g_LightColor.w,0.0,1.0)));
    wvLightPos.w = g_LightPosition.w;
    vec4 lightColor = g_LightColor;

    // specific to normal maps:
    #if defined(NORMALMAP) || defined(NORMALMAP_1) || defined(NORMALMAP_2) || defined(NORMALMAP_3) || defined(NORMALMAP_4) || defined(NORMALMAP_5) || defined(NORMALMAP_6) || defined(NORMALMAP_7) || defined(NORMALMAP_8) || defined(NORMALMAP_9) || defined(NORMALMAP_10) || defined(NORMALMAP_11)
      vec3 wvTangent = normalize(g_NormalMatrix * inTangent.xyz);
      vec3 wvBinormal = cross(wvNormal, wvTangent);

      mat3 tbnMat = mat3(wvTangent, wvBinormal * -inTangent.w,wvNormal);

      vPosition = wvPosition * tbnMat;
      vViewDir  = viewDir * tbnMat;
      lightComputeDir(wvPosition, lightColor, wvLightPos, vLightDir);
      vLightDir.xyz = (vLightDir.xyz * tbnMat).xyz;
    #else
    // general to all lighting
    vNormal = wvNormal;

    vPosition = wvPosition;
    vViewDir = viewDir;

    lightComputeDir(wvPosition, lightColor, wvLightPos, vLightDir);
    #endif

      //computing spot direction in view space and unpacking spotlight cos
  // spotVec=(g_ViewMatrix *vec4(g_LightDirection.xyz,0.0) );
  // spotVec.w=floor(g_LightDirection.w)*0.001;
  // lightVec.w = fract(g_LightDirection.w);

    AmbientSum  = vec4(0.2, 0.2, 0.2, 1.0) * g_AmbientLightColor; // Default: ambient color is dark gray
    DiffuseSum  = lightColor;
    SpecularSum = lightColor;

    wVertex = vec4(inPosition,0.0);
    wNormal = inNormal;
}


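One common placement (a sketch, not jmeplanet's exact code): the remapping goes at the end of main(), immediately after gl_Position is computed, operating on gl_Position.z and gl_Position.w. Neither C nor the far-plane distance is a built-in jME3 uniform here, so `m_LogDepthC` and `m_Far` are illustrative material parameters you would declare in the .j3md and set from Java:

```glsl
// Illustrative uniforms -- declare matching parameters in the .j3md
// and set them from Java, e.g.:
//   material.setFloat("LogDepthC", 1.0f);
//   material.setFloat("Far", cam.getFrustumFar());
uniform float m_LogDepthC;
uniform float m_Far;

void main() {
    // ... all of the existing vertex shader code above ...
    gl_Position = g_WorldViewProjectionMatrix * pos;

    // Logarithmic depth: replace the projected z with the remapped value.
    float w = gl_Position.w;
    gl_Position.z = (2.0 * log(m_LogDepthC * w + 1.0)
                         / log(m_LogDepthC * m_Far + 1.0) - 1.0) * w;
}
```

Since the depth written here no longer matches a standard projection, every material that can overlap the planet needs the same remapping, which is the drawback mentioned earlier in the thread.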
You can look at the work I did for jmeplanet.