[SOLVED] How can I create world-space depth fog in my game?

Like everyone, I am writing a bloxel game. Depth fog, to obscure the world loading in, can significantly improve the aesthetics of my game, so I attempted to add it. jME’s built-in fog filter is insufficient because it creates fog that starts fading in at the camera and never becomes fully opaque. I want fog that starts fading in near the edge of the view distance and becomes completely opaque at the view distance. After much digging through the tiny amount of documentation that exists on jME’s filters, along with the filters in the GitHub repo, I managed to create a functional filter that renders fog mostly as I want:

#import "Common/ShaderLib/GLSLCompat.glsllib"
#import "Common/ShaderLib/MultiSample.glsllib"

uniform COLORTEXTURE m_Texture;
uniform DEPTHTEXTURE m_DepthTexture;
varying vec2 texCoord;

uniform vec4 m_FogColor;
uniform float m_FogStartProportion;
uniform float m_ViewDistance;
uniform vec2 g_FrustumNearFar;

const float LOG2 = 1.442695;

void main() {
    // Find linear depth (based on learnopengl.com's depth buffer tutorial)
    float zBuffer = getDepth(m_DepthTexture, texCoord).r;
    zBuffer = zBuffer * 2.0 - 1.0;
    float near = g_FrustumNearFar.x;
    float far = g_FrustumNearFar.y;
    float linearDepth = (2.0 * near * far) / (far + near - zBuffer * (far - near));

    // Apply fog
    float currentDistanceFraction = linearDepth / m_ViewDistance;
    vec4 unfoggedColor = getColor(m_Texture, texCoord);
    if(currentDistanceFraction < m_FogStartProportion){
        gl_FragColor = unfoggedColor;
    }else if(currentDistanceFraction > 1.0){
        gl_FragColor = m_FogColor;
    }else{
        float fogDensity = (currentDistanceFraction - m_FogStartProportion) / (1.0 - m_FogStartProportion);
        gl_FragColor = mix(unfoggedColor, m_FogColor, fogDensity);
    }
}
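
As a sanity check on the linearization step, here is a quick numeric sketch (plain Python, not part of the filter; the function names are mine) that pushes an eye-space distance through a standard OpenGL perspective depth mapping and then undoes it with the same formula the shader uses:

```python
def depth_buffer_value(d, near, far):
    # Project an eye-space distance d (along -z) through a standard
    # OpenGL perspective projection, then map NDC z from [-1,1] to [0,1].
    ndc = ((far + near) * d - 2.0 * far * near) / ((far - near) * d)
    return ndc * 0.5 + 0.5

def linearize(b, near, far):
    # Same math as the shader: undo the non-linear depth mapping.
    ndc = b * 2.0 - 1.0
    return (2.0 * near * far) / (far + near - ndc * (far - near))

near, far = 0.1, 1000.0
for d in (0.5, 10.0, 250.0, 999.0):
    b = depth_buffer_value(d, near, far)
    assert abs(linearize(b, near, far) - d) < 1e-6 * d
```

The round trip recovers the original eye-space distance, so the linearization itself is correct; the artifact described below comes from what that value *means*, not from the formula.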

With this filter I get a very strange artifact. The fog seems to lie in a plane parallel to the camera’s plane instead of in a sphere around the camera. This artifact is very noticeable when the camera moves left or right:

The presence of this artifact suggests that the linearDepth variable that my shader computes is not actually the world-space distance between the camera and the fragment, but rather some sort of orthographic, plane-parallel depth. As such, it seems that I need to somehow get the true world-space coordinates of the camera and the fragment and apply the distance formula, but I am at a loss for how to do this. I found some mutterings about how to get the world-space coordinate of a fragment, but nothing regarding the camera (I would prefer to use something built into jME to keep my code clean, without passing the camera position all over, as this is implemented via a FilterPostProcessor filter).

Is there some easier way to fix this issue than to find the world space coordinates? If not, how can I find the world space coordinates of the camera and fragment?

You will probably be sooooo much happier if you just add fog to the real shaders instead of trying to do post-proc fog. But that’s just my advice.

A few lines in the .vert to add a camera local Z varying and then a few lines in the .frag to calculate fog however you like… using whatever fog color you like. Probably 5-6 lines of code in total after forking the shaders (if you haven’t already).

This has the added benefit of letting you still have a real sky with fog. Also being able to have some materials (like lights) that don’t use the fog… and then using a black fog color at night to give cool night time distance effects.

It’s a real shame that Lighting.j3md was never converted to shader nodes because this would be super simple then.

Edit: the other cool thing is that you can base fog on camera distance instead of just depth… that way you won’t see farther near the edges of the screen. (Ever turn your head sideways to see farther in Minecraft? Yeah, then you know what I’m talking about. :))

I meant to address the technique that you’re describing in my question, but I forgot. I want to avoid forking materials as much as possible, as the jME materials are good for 95% of what is needed, and I am using jME instead of LWJGL to avoid mucking around with GLSL as much as possible. For this reason, a filter that can apply to all materials, period, seems like the best option, as it is very much set-it-and-forget-it (i.e. I don’t need to rewrite the fog each time I use a new material).

Edit: the other cool thing is that you can base fog on camera distance instead of just depth… that way you won’t see farther near the edges of the screen. (Ever turn your head sideways to see farther in Minecraft? Yeah, then you know what I’m talking about. :))

That same effect is what is happening in my game (as is visible in the video), and I want to write a filter that fixes it somehow.

Try this to convert your depth value, where depthN is the depth value read directly from the texture (don’t do depthN = depthN*2-1):

float a = g_FrustumNearFar.y / (g_FrustumNearFar.y - g_FrustumNearFar.x);
float b = g_FrustumNearFar.y * g_FrustumNearFar.x / (g_FrustumNearFar.x - g_FrustumNearFar.y);
float dist = b / (depthN - a);

Disclaimer: I just copied this from my jme shader, so there’s a good chance it might work, unless I customized something else :smiley: .

I just tried it. Unfortunately, I still get the exact same effect.

I think reverse engineering distance from just depth and screen coordinates might be tricky in a shader. I haven’t thought deeply about it, though.

Ok, I just checked, the two methods for computing linear depth are identical. So at least, the problem is not there.

Then the culprit could be g_FrustumNearFar: it might use the post-processor’s camera, which might be a completely different camera from your scene’s.

Thus, remove the use of g_FrustumNearFar, replace it with m_FrustumNearFar and set it yourself.

As I was trying to figure out how the various variables are passed into shaders in jME I saw m_FrustumNearFar in a few places, but it wasn’t clear where it came from. It seems strange to manually set it via material.setVector2, but I can’t think of an alternative. I was also under the impression that m_FrustumNearFar was equal to g_FrustumNearFar, however this now appears to not be the case. Can you clear any of this up?

In jME, shader uniforms prefixed with g_ are set automatically by jME, and uniforms prefixed with m_ are set by the user.
If you don’t want to get confused, you can name it m_MyFrustumNearFar etc.

In that case, what is the purpose of using m_FrustumNearFar if it will contain identical information to g_FrustumNearFar? When I was trying to figure out filters, I replaced g_FrustumNearFar with hard-coded values equal to my frustum’s, and it made no difference, so I concluded that they are the same values.

It will not contain the same data, since it (most probably) contains the FrustumNearFar of the post-processor’s camera.
You need the value from your scene’s camera.

Your 3D scene has a perspective camera. Thus, all g_ variables that depend on the camera will be set according to it.

Now, assuming this is the issue: a post-processor filter renders a full-screen quad. How does it do that? With a new orthographic camera. Thus, shaders that implement post-processor effects will get their g_ variables filled from the post-processor’s camera.

PS: let me know if it fixes the issue.

Any approach that doesn’t use trig functions is going to be giving you parallel depth. Basically, it’s the dot product with the view vector. Anything involving near/far planes is only changing the “units” basically… from view space to world space.

It’s not going to fix the parallel problem. For that you have to reverse engineer the hypotenuse of the triangle formed by x,y,depth.
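
Concretely, the “parallel depth” problem looks like this (a tiny pure-Python illustration; the numbers are mine):

```python
import math

# A fragment at view-space position (x, y, -z) has a linearized depth
# of just z -- the distance to the camera *plane* -- while the true
# camera distance is the hypotenuse |(x, y, z)|.
x, y, z = 3.0, 4.0, 12.0
parallel_depth = z                            # what linearized depth gives
true_distance = math.sqrt(x*x + y*y + z*z)    # what spherical fog needs
assert true_distance > parallel_depth         # 13.0 vs 12.0
```

The further a fragment is from the screen center, the bigger the gap between the two values, which is exactly why the fog reads as a flat wall.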


Makes sense. After all, the depth value represents the distance from the near plane to the fragment, not from the camera to the fragment.

So… add a projection matrix inverse to your shader, e.g.:

Matrix4f mat = cam.getProjectionMatrix();
mat = mat.invert();
material.setMatrix4("ProjMatInv", mat);

Inside the shader, compute the distance to the camera:

float depthN = getDepth(m_DepthTexture,texCoord).r;
vec4 clip = vec4(texCoord*2.0-1.0, depthN*2.0-1.0, 1.0);
vec4 view = m_ProjMatInv*clip;
float dist = length(view.xyz/view.w);
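
To convince yourself the reconstruction works, here is a pure-Python numeric sketch of the same math. It assumes a standard OpenGL-style perspective matrix (which, as far as I know, is what jME’s perspective camera produces; the helper names are mine) and uses its closed-form inverse instead of a general matrix invert:

```python
import math

def make_proj(fovy, aspect, near, far):
    # The four non-trivial entries of a standard GL perspective matrix.
    f = 1.0 / math.tan(fovy / 2.0)
    A = f / aspect
    B = f
    C = (far + near) / (near - far)
    D = 2.0 * far * near / (near - far)
    return A, B, C, D

def project(A, B, C, D, x, y, z):
    # clip = P * (x, y, z, 1), where w_clip = -z; return NDC.
    cx, cy, cz, cw = A * x, B * y, C * z + D, -z
    return cx / cw, cy / cw, cz / cw

def unproject(A, B, C, D, nx, ny, nz):
    # view = inverse(P) * (nx, ny, nz, 1), then perspective divide --
    # the same thing the shader does with m_ProjMatInv.
    vx, vy = nx / A, ny / B
    vz = -1.0
    vw = nz / D + C / D
    return vx / vw, vy / vw, vz / vw

A, B, C, D = make_proj(math.radians(60.0), 16 / 9, 0.1, 1000.0)
x, y, z = 3.0, -2.0, -50.0                    # some point in view space
nx, ny, nz = project(A, B, C, D, x, y, z)
rx, ry, rz = unproject(A, B, C, D, nx, ny, nz)
dist = math.sqrt(rx * rx + ry * ry + rz * rz)
assert abs(dist - math.sqrt(x * x + y * y + z * z)) < 1e-6
```

The recovered distance is the true camera-to-fragment distance, not the plane-parallel depth, so fog based on it forms a sphere around the camera.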

Thank you! This is exactly what I was looking for :smiley:


Hi @john01dav,
Would you mind sharing your solution?

My solution is the one that The_Leo posted here.


Can you maybe share the rest of the shader code? The vert and frag shader and the java class?

Please man?
