Render geometry behind everything else

I want to have a particular geometry that I am rendering manually to always render behind everything else (including skyboxes). I thought rendering the geometry first (using the code below) and disabling depth testing+writing would do the trick, but that results in the geometry being rendered in front. What should I do?

geometry.getMaterial().render(geometry, geometry.getLocalLightList(), renderManager);

Rendering it first would work… but I don’t think you are actually rendering it first. I think material.render() is just queuing it up. It’s also super weird to call it directly.

If you can explain what you are actually trying to do, then it might help frame an answer better. (Because it already sounds weird to render something behind the skybox… you might as well not render it at all then.)


I’ll admit, it’s all very weird.

The geometry is a billboarded quad that covers the whole screen, and is the output for deferred shading.

I’m having trouble explaining this abstractly, so here is an example:

This scene consists of a hovertank model (opaque bucket), a sphere (transparent bucket), and a skybox (sky bucket). The GBuffer and deferred shading passes render the opaque bucket to a quad stretching over the entire screen. Then other passes render the sky bucket, then the transparent bucket.

My problem is that only the tank is visible, because the deferred shading quad is rendered over everything else in the scene. Here is the same scene, but the deferred shading quad is discarded (the background is green):

Basically, there is a hovertank-shaped hole in the scene, due to the depth texture, with the background peeking through it. If the deferred quad were instead rendered behind everything, the tank would be completely visible through that hole.

(Unless I’m entirely mistaken, this is the way Johnkkk’s deferred rendering works. And I’m working with his code, so…)

Did you paint that area green for posting the screen shot or is it actually rendered that way?

Because if it’s actually rendered that way, then whatever is painting it green should probably not be doing that.

It’s rendered that way. The green is the viewport’s background.

What do you mean? The green area is where nothing got rendered.

Behind in terms of timing, or behind in terms of depth sorting?

What classes are an option? (Where do you want to execute the call?) Filters/AppStates/Viewport?


Behind in terms of depth sorting.

I’m making that particular render call as part of the viewport rendering process (RenderManager#renderViewPort) during the deferred shading pass. It’s the same way Johnkkk was doing it, and I’m more or less copying his logic to a cleaner API.

start viewport render
    preframe call
    construct render queue
    postQueue call
    execute framegraph passes
        gBuffer pass
        deferred shading pass (HERE)
        sky pass
        transparent pass
        gui pass
        postprocessing pass
        translucent pass
    framegraph cleanup
end viewport render

Note: My pass objects in the code are called “modules” so they won’t be confused with Johnkkk’s passes. I plan to rename them after cleaning things up.


Set the viewport not to clear the background… watch the ship appear.

Oops, I provided the link to the wrong branch, fixed now.

Assuming you have the depth texture from the gbuffer pass bound and not cleared: disable depth write and set the depth test to the reversed function. GreaterOrEqual, I think it is.

Render the quad at depth 1 using RenderManager.renderGeometry().
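To illustrate the depth logic behind that suggestion, here is a plain-Java simulation (no jME types; the depth values are made up). With depth write off and the test reversed, a quad at depth 1.0 passes everywhere the gbuffer holds a depth value, while a sky fragment drawn afterwards with the usual LessOrEqual test only shows through the “hole” where the stored depth is still 1.0:

```java
// Plain-Java simulation of the reversed depth test trick.
public class ReversedDepthSketch {

    // GreaterOrEqual: the fragment passes if its depth >= the stored depth.
    static boolean greaterOrEqual(float fragmentDepth, float storedDepth) {
        return fragmentDepth >= storedDepth;
    }

    // LessOrEqual: the usual depth test used by the later passes (sky, etc.).
    static boolean lessOrEqual(float fragmentDepth, float storedDepth) {
        return fragmentDepth <= storedDepth;
    }

    public static void main(String[] args) {
        // GBuffer depth: the tank covers the first two pixels; the last two
        // are untouched background at the far-plane value of 1.0.
        float[] depthBuffer = {0.5f, 0.5f, 1.0f, 1.0f};
        float quadDepth = 1.0f; // deferred quad rendered at depth 1

        for (int i = 0; i < depthBuffer.length; i++) {
            // Depth write is disabled, so depthBuffer[i] stays intact
            // for the sky and transparent passes that run afterwards.
            boolean quadDrawn = greaterOrEqual(quadDepth, depthBuffer[i]);
            boolean skyDrawn = lessOrEqual(1.0f, depthBuffer[i]); // sky at the far plane
            System.out.println("pixel " + i + ": quad=" + quadDrawn + " sky=" + skyDrawn);
        }
    }
}
```

The quad passes on every pixel, but because it never writes depth, the sky pass still loses the depth test wherever the tank was drawn, which is exactly the “render behind everything” effect being asked for.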


Everything is now visible, but for deferred objects only the emissions are displayed, for some odd reason. The tank is still drawn in front of everything else, regardless of actual depth.

Here is the DeferredShadingModule where this render occurs:

For reference, here is DeferredShadingPass (written by Johnkkk), which works:

If the tank is never rendered behind the sphere, then it must be one of the following:
a) depth has been cleared before the transparent pass
b) depth map has been altered by the lighting pass
c) depth map is not bound
d) forced renderstates have been used and not cleared up

I don’t know how much of the framegraph stuff is tested. AFAIK it was all a work in progress and “will be fixed later”.
Besides that, I think if you want to implement a new technique, you do yourself no favor by using a verbose/complex experimental lib as a base.

At least the support for those features is going to be zero.


I wrote a simple filter to display the depth texture. It shows the depth values are in order and within the 0 to 1 range, so I believe that eliminates A, B, and C. (To do that, I had to set the alpha of the deferred lighting quad to 0.01, because that quad was being rendered over the post-processing quad.) I guess that leaves issue D, which I haven’t tackled yet.

What really bugs me is the lengths we’re going to just to make this work, when Johnkkk’s implementation didn’t do anything all that special. It would be really nice to know why his works and ours doesn’t.

I discovered that the issue with only the emissive sections showing up was due to rendering via RenderManager#renderGeometry. When I reverted to using the material’s render method, the problem “went away.” Though, that seriously makes me question the robustness of the deferred lighting shader.

I agree, which is why I wrote my own framegraph API from the bottom up instead of using Johnkkk’s. I’m still using his logic and shaders, but at this point, I’m this :pinching_hand: close to rewriting those, too. Also, I’m not adding anything brand new right now beyond what features Johnkkk already had working.

You mean ViewPort#setClearColor=false, right? I tried that, but it didn’t have any impact.

Purely from looking at the code snippets you posted, it seems the shader needs the light list supplied. So I guess this is not for rendering back the result, but for applying the light calculations. I don’t know what kind of data/location requirements the shader has.


The light list is supplied by the GBufferModule through the framegraph using a sort of “parameter connection system.” Every frame, before execution, the deferred lighting pass pulls the necessary resources, including the light list, from this system. I know it’s working correctly because the tank is being lit (I mean, now that I fixed a different bug).

Edit: Oh, I see what you mean. This:


doesn’t provide the light list, so only the emissions will show up.

In the screenshot the tank does not seem to be lit. If it works with material.render and does not work with RenderManager.renderGeometry, that would hint at the lights, but in the last 15(?) years of jME dev I never needed to call material.render myself, so it smells fishy (without any concrete reason).


I agree, it doesn’t seem proper to directly use the material to render, but I don’t think it is that important right now. I’ve tested with both methods and neither has any impact on the main issue (that objects are drawn in the wrong depth order). For now, I’ve split RenderManager#renderGeometry into two methods, so that I can use the light list from the GBuffer pass for rendering but still render properly via the render manager.
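As a minimal sketch of what such a split could look like (these are stand-in classes and hypothetical signatures, not the real jME Geometry/LightList types or the actual change made here):

```java
// Hypothetical sketch of splitting renderGeometry so a caller can supply
// its own light list (e.g. one gathered by the gbuffer pass).
public class RenderSplitSketch {

    static class LightList {
        final int count;
        LightList(int count) { this.count = count; }
    }

    static class Geometry {
        final LightList localLights;
        Geometry(LightList localLights) { this.localLights = localLights; }
    }

    // Original behavior: resolve the light list from the geometry itself.
    static String renderGeometry(Geometry geom) {
        return renderGeometry(geom, geom.localLights);
    }

    // New overload: render with an explicitly supplied light list.
    static String renderGeometry(Geometry geom, LightList lights) {
        return "rendered with " + lights.count + " lights";
    }

    public static void main(String[] args) {
        Geometry quad = new Geometry(new LightList(0)); // quad has no local lights
        LightList fromGBuffer = new LightList(3);       // lights gathered elsewhere
        System.out.println(renderGeometry(quad));            // emissive-only situation
        System.out.println(renderGeometry(quad, fromGBuffer)); // fully lit
    }
}
```

The shape of the API is the point: the engine-facing method keeps its old behavior by delegating, while the framegraph can hand in the light list it collected itself.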

Can you explain more in detail what exactly is wrong with the order? Maybe a picture of the wrong ordering would help too.

PS: if you want to go down the rabbit hole, a GL debugger saves you plenty of time guessing what’s wrong and writing debug-output shaders.

I recently had an issue that was very difficult to debug without RenderDoc. It was a little tricky to figure out how to properly launch the JVM for use with it, but I found that launching the jME application with the following Gradle config and attaching RenderDoc to the running application worked beautifully:

task renderDoc(type: Exec) {
    def javaExecTask = tasks.named("run").get()
    def javaHome = javaExecTask.javaLauncher.get().metadata.installationPath.asFile.absolutePath

    commandLine = [
            "--working-dir", ".",
            "-classpath", sourceSets.main.runtimeClasspath.asPath,