[SOLVED] Gradient fog filter

Hi

I am trying to convert the GradientFogPass from @RiccardoBlb's JME pipeline repo into a JME Filter that I can add to the FilterPostProcessor.

I am a noob at filters and shaders. Can someone please take a glance and see if I am doing it right?

Original files:

Pipeline pass:

MatDef:

Frag Shader:

Vertex Shader:

Converted by me:
I have used the default JME fog filter as my reference.

Filter:

import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.math.Vector2f;
import com.jme3.post.Filter;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.texture.Texture;

public class GradientFogFilter extends Filter {

    private final Vector2f frustumNearFar = new Vector2f();

    @Override
    protected void initFilter(AssetManager manager, RenderManager renderManager, ViewPort vp, int w, int h) {
        /*
        Camera cam = renderManager.getCurrentCamera();
        frustumNearFar.set(cam.getFrustumNear(),cam.getFrustumFar());
        
        Seems camera is null at this point!?
        */

        material = new Material(manager, "MatDefs/Post/GradientFog.j3md");
        Texture texture = manager.loadTexture("Textures/defaultGradient.bmp");
        material.setTexture("FogGradient", texture);
        material.setVector2("FrustumNearFar", frustumNearFar);
    }

    @Override
    protected Material getMaterial() {
        return material;
    }

    @Override
    protected boolean isRequiresDepthTexture() {
        return true;
    }

    public void setFrustumNearFar(Vector2f frustumNearFar) {
        this.frustumNearFar.set(frustumNearFar);
    }

    public Vector2f getFrustumNearFar() {
        return frustumNearFar;
    }
}

MatDef:

MaterialDef GradientFog {
    MaterialParameters {
        Int NumSamples
        Int NumSamplesDepth
        Texture2D Texture
        Texture2D DepthTexture
        Texture2D FogGradient
        Vector2 FrustumNearFar
    }

    Technique {
        VertexShader GLSL150:   Common/MatDefs/Post/Post.vert
        FragmentShader GLSL150: Shaders/Post/GradientFog.frag

        WorldParameters {
        }

        Defines {
            RESOLVE_MS : NumSamples
            RESOLVE_DEPTH_MS : NumSamplesDepth
        }
    }
}

Frag Shader:

#import "Common/ShaderLib/GLSLCompat.glsllib"
#import "Common/ShaderLib/MultiSample.glsllib"
#extension GL_ARB_explicit_attrib_location : enable

uniform COLORTEXTURE m_Texture;
uniform DEPTHTEXTURE m_DepthTexture;
varying vec2 texCoord;

uniform vec2 m_FrustumNearFar;
uniform sampler2D m_FogGradient;

float linearizeDepth(in float depth){
    float f = m_FrustumNearFar.y;
    float n = m_FrustumNearFar.x;
    float d = depth * 2.0 - 1.0;
    return (2.0 * n * f) / (f + n - d * (f - n));
}

float linearize01Depth(in float depth){
    float d = linearizeDepth(depth);
    float f = m_FrustumNearFar.y;
    float n = m_FrustumNearFar.x;
    return (d - n) / (f - n);
}

vec4 sampleWithFog(in sampler2D sceneTx, in sampler2D depthTx, in sampler2D gradientTx){
    float depth = linearize01Depth(texture(depthTx, texCoord).r);
    vec4 fogGradient = texture(gradientTx, vec2(depth, 0));
    vec4 color = texture(sceneTx, texCoord);
    color.rgb = mix(color.rgb, fogGradient.rgb, fogGradient.a);
    return color;
}

void main(){
    gl_FragColor = sampleWithFog(m_Texture, m_DepthTexture, m_FogGradient);
}
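
As a side note, the linearization math in the fragment shader can be sanity-checked outside GLSL. Below is a plain-Java port for illustration; the class name is made up, and `n`/`f` stand in for `m_FrustumNearFar.x`/`.y`:

```java
// Plain-Java port of the shader's linearizeDepth / linearize01Depth,
// handy for checking the math without running the shader.
class DepthMath {

    // Depth-buffer value in [0,1] -> eye-space distance.
    static float linearizeDepth(float depth, float n, float f) {
        float d = depth * 2f - 1f; // to NDC [-1, 1]
        return (2f * n * f) / (f + n - d * (f - n));
    }

    // Eye-space distance remapped to [0,1] between near and far.
    static float linearize01Depth(float depth, float n, float f) {
        return (linearizeDepth(depth, n, f) - n) / (f - n);
    }
}
```

With n=1 and f=1000, a buffer value of 0 linearizes to 1 (the near plane) and a value of 1 linearizes to 1000 (the far plane), so `linearize01Depth` spans exactly 0..1, which is why it can be used directly as the x coordinate into the gradient texture.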

Vertex Shader:

#import "Common/ShaderLib/GLSLCompat.glsllib"
attribute vec4 inPosition;
attribute vec2 inTexCoord;

varying vec2 texCoord;

void main() {     
    vec2 pos = inPosition.xy * 2.0 - 1.0;
    gl_Position = vec4(pos, 0.0, 1.0);    
    texCoord = inTexCoord;
}

I seem to be able to compile and run the shader without issue, but I am not sure whether the rendering result is correct. (I do not have the original repo forked, so I cannot run the original one to compare the results.)

I am mostly concerned about these lines, and why there are supposed to be multiple scene and depth textures instead of just one scene texture and one depth texture:

Thanks in advance

Hey Ali,
Your implementation seems fine; those lines are there because the pipeline passes are supposed to support multiple scenes at once.
Can you show a screenshot of the result?

Thanks, Ricc!

Here is a screenshot of a demo scene

I want the fog to start at 100 meters and continue to the camera far frustum. So I set

fogFilter.setFrustumNearFar(new Vector2f(100, cam.getFrustumFar()));

But the result does not look as I expected:

As you can see, the fog still starts from the camera position.

Hmm, by the way, without knowing much about the pipeline, shouldn’t each scene have its own set of filters/render passes?

The implementation seems right, but the fog should be tweaked only by changing the gradient image.
If you want your fog to start at 100m, your gradient should start as transparent
(x=0 is frustum near, x=width is frustum far).

E.g. if your frustum far is 1000 and the gradient image is 512px wide, the gradient should be transparent from x=0 to x=51px and then start from there.

Changing the frustum near as you did doesn’t work, since the texture coordinates are clamped, so everything below frustum near is automatically x=0 in the gradient.
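
To illustrate the suggestion above, such a gradient strip can be generated in plain Java with the standard library. The class and method names here are made up, and the numbers follow the example (fog from 100m, frustum far 1000, 512px-wide gradient):

```java
import java.awt.image.BufferedImage;

// Sketch: build a 512x1 fog-gradient strip whose alpha stays 0 up to the
// fog-start distance (x = 51px for 100m out of a 1000 frustum far), then
// ramps linearly to fully opaque at the far plane.
class FogGradientBuilder {

    static BufferedImage build(int width, float fogStart, float frustumFar) {
        BufferedImage img = new BufferedImage(width, 1, BufferedImage.TYPE_INT_ARGB);
        int startPx = Math.round(width * fogStart / frustumFar); // 512 * 100/1000 -> 51
        for (int x = 0; x < width; x++) {
            // alpha 0 before the fog start, then a linear ramp to opaque at far
            float a = (x <= startPx)
                    ? 0f
                    : (x - startPx) / (float) (width - 1 - startPx);
            int alpha = Math.round(a * 255f);
            int gray = 200; // light-gray fog color, purely an aesthetic choice
            img.setRGB(x, 0, (alpha << 24) | (gray << 16) | (gray << 8) | gray);
        }
        return img;
    }
}
```

The resulting image could then be saved (e.g. via `ImageIO.write`) and loaded as the FogGradient texture.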

It depends on the pass.

Thanks, adding transparency at the start solved it. 🙂

By the way, does the order of the fog filter matter inside the FilterPostProcessor? Should I add it before the ToneMap filter, or should it be the last one?

This is my current filter stack:

// shadow
// SSAO
// Water
// light scattering
// Bloom
// FXAA
// DoF
// ToneMap filter
// contrast filter

I think that would depend on whether you want the tone map to apply to the fogged color or not. Realistically? No… fog should go last. Artistically? Personal preference, and it depends on your tone mapping, I guess.

ToneMap after FXAA and DoF already feels weird to me.

Ok

I guess I had read somewhere that nehon said tonemap should go at the end of the filter stack.

Edit:
Oops… I was wrong! I just rechecked it, and it was the TranslucentBucketFilter he said to add last.

Thanks for the hint

It’s like having Photoshop adjustment layers and filters. The order is largely an artistic choice, but you probably want to mess with coloring before you do atmospheric filtering, which you do before blur/AA.

…but it depends on the kind of color adjustment you are doing. Whether you expect to be adjusting the color of the “final product” or adjusting the raw colors before adding atmospherics (like SSAO, fog, scattering, etc.). Like, are you expecting to tonemap the foggy blurry image or the base colors… just depends.

Re: Translucent filter… it needs to be after water. One of its only reasons for existing is to render things after water. The rest seems to depend on what you want those objects skipping.