Rendering frame buffer to screen with both depth and blending

I am working on elevators in my top-down-view game. I want to keep the camera locked on the opaque player/elevator while the current level translates out and fades away, then the next level translates in and fades in.

Like so: [elevator animation]

The green area representing the level has all the complexity and many geometries of the world. Setting alphas individually on every object in the level will not look right, because, for example, a partially transparent wall will reveal the objects behind it. This can be fixed by rendering the level normally to a frame buffer, then rendering the frame buffer to the screen with the appropriate alpha instead.
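In jME terms I picture that off-screen pass roughly like this. This is only a sketch using the 3.2-style FrameBuffer API; `cam`, `renderManager` and `levelNode` stand in for whatever my scene already has:

```java
// Sketch: off-screen pass holding the level's color AND depth.
// (jME 3.2-style API; newer versions use FrameBufferTarget instead
// of setColorTexture/setDepthTexture.)
int w = cam.getWidth(), h = cam.getHeight();
Texture2D levelColor = new Texture2D(w, h, Image.Format.RGBA8);
Texture2D levelDepth = new Texture2D(w, h, Image.Format.Depth);

FrameBuffer levelFb = new FrameBuffer(w, h, 1);
levelFb.setColorTexture(levelColor);
levelFb.setDepthTexture(levelDepth);

// Pre-view that re-renders only the level into the buffer each frame,
// so the moving level stays in the right perspective.
ViewPort levelView = renderManager.createPreView("LevelPass", cam);
levelView.setClearFlags(true, true, true);
levelView.setOutputFrameBuffer(levelFb);
levelView.attachScene(levelNode); // hypothetical node holding the level
```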

My problem is combining the frame-buffer level render with the player and the elevator, which may be both in front of and behind parts of the level as it moves:

If I render the fading part of the level as a full-screen quad, I lose the frame buffer's depth data: everything renders at the depth of the quad.

If I blit the frame buffer instead, the depth is copied, but blitting cannot blend the colors or apply alpha.

Is there some way to apply the frame buffer's depth to a full-screen quad? As far as I know, OpenGL's depth test runs outside the fragment shader, so sampling the depth map in my shader and doing my own depth comparison feels hacky. Is that the way forward? Or am I down the wrong path entirely?

As a hint: you may want to look into the FilterPostProcessor. That (effectively) renders a full-screen quad with access to the existing frame buffer and depth buffer. You'd need a custom filter shader and a custom filter class, but presuming you already have your overlay image it should work fine.
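The wiring side is only a few lines. A minimal sketch, assuming a hypothetical LevelFadeFilter class (like the one sketched later in the thread):

```java
// Attach a custom fade filter to the main view port (jME3).
// LevelFadeFilter is hypothetical; see the sketch further down.
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
LevelFadeFilter fade = new LevelFadeFilter();
fpp.addFilter(fade);
viewPort.addProcessor(fpp);

// Then drive the fade from game logic each frame:
fade.setFadeAlpha(levelFadeAmount); // e.g. 1 = fully visible, 0 = gone
```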

Presumably, you are somehow re-rendering your overlay level as it moves up, or the perspective will be wrong, right?


A rough idea would be to write a post-processing filter that does one additional render in Filter's postQueue method, drawing only the opaque geometry (including the skybox) to a color texture and a depth texture. The filter material's fragment shader then blends between the actual scene and the opaque-only scene using an alpha uniform, comparing the two depth textures to make sure things end up in the right order visually.
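Something along these lines, loosely modeled on how stock filters like CartoonEdgeFilter do an extra pass. The material definition MatDefs/LevelFade.j3md and the OpaqueTexture/OpaqueDepthTexture/FadeAlpha parameter names are placeholders you would define yourself; its fragment shader would sample the scene textures that FilterPostProcessor binds ("Texture" and "DepthTexture"), compare depths against the opaque pass, and mix by the alpha:

```java
import com.jme3.asset.AssetManager;
import com.jme3.material.Material;
import com.jme3.post.Filter;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.Renderer;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.Image;

public class LevelFadeFilter extends Filter {

    private Pass opaquePass;
    private RenderManager renderManager;
    private ViewPort viewPort;
    private float fadeAlpha = 1f;

    @Override
    protected void initFilter(AssetManager manager, RenderManager renderManager,
                              ViewPort vp, int w, int h) {
        this.renderManager = renderManager;
        this.viewPort = vp;

        // Extra pass with its own color texture and depth texture
        // (the final 'true' asks for a readable depth texture).
        opaquePass = new Pass();
        opaquePass.init(renderManager.getRenderer(), w, h,
                Image.Format.RGBA8, Image.Format.Depth, 1, true);

        // Hypothetical material definition: blends scene vs. opaque pass.
        material = new Material(manager, "MatDefs/LevelFade.j3md");
        material.setTexture("OpaqueTexture", opaquePass.getRenderedTexture());
        material.setTexture("OpaqueDepthTexture", opaquePass.getDepthTexture());
    }

    @Override
    protected boolean isRequiresDepthTexture() {
        // Makes FilterPostProcessor bind the scene depth as "DepthTexture".
        return true;
    }

    @Override
    protected void postQueue(RenderQueue queue) {
        // Re-render the view port's queues into the extra pass. Limiting
        // this to opaque geometry only (player, elevator, sky) is up to
        // you, e.g. via a forced technique or a dedicated scene node.
        Renderer r = renderManager.getRenderer();
        r.setFrameBuffer(opaquePass.getRenderFrameBuffer());
        r.clearBuffers(true, true, true);
        renderManager.renderViewPortQueues(viewPort, false);
        r.setFrameBuffer(viewPort.getOutputFrameBuffer());
    }

    @Override
    protected Material getMaterial() {
        material.setFloat("FadeAlpha", fadeAlpha);
        return material;
    }

    public void setFadeAlpha(float fadeAlpha) {
        this.fadeAlpha = fadeAlpha;
    }
}
```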

Aha, OK, I can see that FilterPostProcessor passes a DepthTexture parameter to the filter material, which is along the lines of what I was thinking might be needed. Should work, thanks.
