How do I grab the depth buffer?

I have a typical 3d scene (jme3), but want to experiment with blending the near part of the scene gradually into another scene.

So I guess I’ll need to render the two scenes to offscreen buffers, get hold of the depth buffer of the first scene, and create an alpha channel from it to layer onto the second scene.

It’s similar to how fog works, I suppose. I’ve been looking at the fog demo files and at the FogFilter and FilterPostProcessor classes, but I’m not sure how it all works.



Any help or simple example code appreciated, thanks.

To get the depth buffer you just have to render the scene to a frame buffer and attach a depth texture to it, like this:



FrameBuffer frameBuffer = new FrameBuffer(w, h, 0);
frameBuffer.setDepthBuffer(Format.Depth); // Format is com.jme3.texture.Image.Format

Texture2D sceneTexture = new Texture2D(w, h, Format.RGBA8);
frameBuffer.setColorTexture(sceneTexture); // the color output goes into this texture

Texture2D depthTexture = new Texture2D(w, h, Format.Depth);
frameBuffer.setDepthTexture(depthTexture); // the depth output goes into this texture





Then you can attach the depthTexture to any material (mat.setTexture("MyTextureKey", depthTexture)).

The texture will be sent to the shader, and then you can do whatever you want with it.
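For example, something like this (just a sketch; the matdef path and parameter name here are made up, and the key has to match a Texture2D param declared in the j3md’s MaterialParameters block):

Material mat = new Material(assetManager, "MatDefs/MyDepthMat.j3md"); // hypothetical matdef
mat.setTexture("m_DepthTexture", depthTexture); // key must match the param name in the j3md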


That’s brilliant nehon, thanks. I’ll get back if I have any problems…

Hi

I’m having a little trouble displaying the results.

I’ve been mapping the sceneTexture and depthTexture onto an orthogonal Picture attached to the guiNode, and also onto a full screen quad, but I just get very garbled results.

If I can get the results onto a basic full screen quad, I should be able to take it from there.

Thanks for any help.

Look at how it’s done in the FilterPostProcessor. Don’t attach the quad to the scene graph, because it’s going to be rendered again during the render of the scene.



Look at the renderProcessing method of the FilterPostProcessor; it renders the texture to a full screen quad:

http://code.google.com/p/jmonkeyengine/source/browse/branches/jme3/src/core/com/jme3/post/FilterPostProcessor.java
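The core of it is something like this (a sketch along the lines of that class, not a drop-in copy; w, h, cam and sceneTexture are the ones from earlier, and in real code you’d create the Picture once and reuse it each frame):

// inside your processor's postFrame: draw the scene texture on a
// fullscreen quad, straight to the screen, without touching the scene graph
Picture fsQuad = new Picture("fullscreen quad"); // com.jme3.ui.Picture
fsQuad.setWidth(w);
fsQuad.setHeight(h);
fsQuad.setTexture(assetManager, sceneTexture, false);
fsQuad.updateGeometricState();

Renderer r = renderManager.getRenderer();
renderManager.setCamera(cam, true); // true = ortho projection
r.setFrameBuffer(null);             // null = render to the screen
r.clearBuffers(true, true, true);
renderManager.renderGeometry(fsQuad);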

thanks nehon, will look into this…

Okay - the only thing I can’t get is how to render the scene into the framebuffer you mentioned



FrameBuffer frameBuffer = new FrameBuffer(w, h, 0);



is it something like

renderer.setFrameBuffer(frameBuffer);



but then nothing shows up…

No, you need to tell the viewport to render the main scene into your frame buffer, like this:



viewPort.setOutputFrameBuffer(frameBuffer);





Do this when you init your processor (I guess you have a processor).

Then in the postFrame method, the texture you attached to your frameBuffer will contain the rendered scene.
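A skeleton would look roughly like this (sketch only; the class name is made up, and your jME3 version may require stubbing out a couple more SceneProcessor methods):

public class DepthGrabProcessor implements SceneProcessor {

    private FrameBuffer frameBuffer;
    private Texture2D sceneTexture, depthTexture;
    private boolean initialized = false;

    public void initialize(RenderManager rm, ViewPort vp) {
        int w = vp.getCamera().getWidth();
        int h = vp.getCamera().getHeight();
        frameBuffer = new FrameBuffer(w, h, 1); // 1 sample = no multisampling
        sceneTexture = new Texture2D(w, h, Format.RGBA8);
        depthTexture = new Texture2D(w, h, Format.Depth);
        frameBuffer.setColorTexture(sceneTexture);
        frameBuffer.setDepthTexture(depthTexture);
        vp.setOutputFrameBuffer(frameBuffer); // redirect the scene into our buffer
        initialized = true;
    }

    public void postFrame(FrameBuffer out) {
        // at this point sceneTexture and depthTexture hold the rendered frame;
        // draw them to the screen here (e.g. on the fullscreen quad from above)
    }

    public boolean isInitialized() { return initialized; }
    public void reshape(ViewPort vp, int w, int h) { }
    public void preFrame(float tpf) { }
    public void postQueue(RenderQueue rq) { }
    public void cleanup() { }
}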

Ok, thanks for that. I’ll see if I can get it working.

Hi, got it all working, thanks for the help :)



Just one little thing: when I use fog in the scene it blanks out my depth buffer. Just wondering what the cause of this is…



Here’s my general setup of processors etc.:



viewPort.addProcessor(fogpp);
viewPort.addProcessor(myProcessor);
myProcessor.postFrame(frameBuffer);
viewPort.setOutputFrameBuffer(frameBuffer);

mhh…

If you change the order, what happens?

tried swapping it all around in different combos, but still an all gray depth buffer.

The problem is that there are 2 processors that render the scene to a frame buffer… I don’t think it can work.



Ok, reading back your first post, I realize maybe I misguided you…

What you should have done is a Filter, not a processor…



The bad news is that it would have been a lot simpler…

The good news is that you learned a lot :D



So… what you did is render the scene and depth to a frame buffer, render an offscreen scene, and mix them with a shader on a full screen quad.

Except for the offscreen part, that’s what filters do.



If you create your own filter, override the requiresDepthTexture method and return true (look at how it’s done in the light scattering filter, for example).

Doing this, the depth texture will be added as m_DepthTexture to the material of the quad.



Doing your own filter will allow you to stack filters.
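For reference, a skeleton along these lines (sketch only; the class name and matdef path are made up, and the exact Filter method names and visibility vary a little between jME3 versions, so check the light scattering filter source for your version):

public class ZBlendFilter extends Filter {

    public ZBlendFilter() {
        super("ZBlendFilter");
    }

    @Override
    protected void initFilter(AssetManager manager, RenderManager renderManager,
                              ViewPort vp, int w, int h) {
        material = new Material(manager, "MatDefs/MyFilter.j3md"); // hypothetical matdef
    }

    @Override
    protected Material getMaterial() {
        return material;
    }

    @Override
    protected boolean isRequiresDepthTexture() {
        // makes the FilterPostProcessor render depth and bind it to the
        // quad's material as m_DepthTexture
        return true;
    }
}

Then you just stack it like any other filter:

FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(new ZBlendFilter());
viewPort.addProcessor(fpp);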

yes I was thinking I’d have to do my own filter, and yes I’m learning a lot!

thanks again,

no doubt I’ll be back soon…


Well, what I’ve done is grab the m_DepthTexture from the fog filter that’s already there:



Material fg = fog.getMaterial();
Texture t = fg.getTextureParam("m_DepthTexture").getTextureValue();



all seems to work fine for me…

good!!

Well, I’m just writing my own shader now, but it doesn’t seem to load up. I get this error:

SEVERE: Uncaught exception thrown in Thread[LWJGL Renderer Thread,5,main]
java.util.NoSuchElementException
    at java.util.Scanner.throwFor(Scanner.java:838)
    at java.util.Scanner.next(Scanner.java:1347)
    at com.jme3.material.plugins.J3MLoader.loadFromScanner(J3MLoader.java:509)



I have the .j3md, .vert and .frag files in the MatDefs folder of my project, and all my file paths etc. seem to be set up correctly, but I don’t know what the problem is.

There must be an error in the path to the files. Could you post your j3md, and the material init in the filter?

Okay, here’s the material definition. It’s just a solid color shader at the minute, for testing…



Material zb = new Material(assetManager, "MatDefs/zblend.j3md");
zb.setColor("m_Color", ColorRGBA.Red);



// zblend.j3md

MaterialDef zblend {



MaterialParameters {

Vector4 m_Color



}



Technique {

VertexShader GLSL100: MatDefs/zblend.vert

FragmentShader GLSL100: MatDefs/zblend.frag



WorldParameters {

WorldViewProjectionMatrix

}

}

}





// zblend.vert
uniform mat4 g_WorldViewProjectionMatrix;
attribute vec3 inPosition;

void main(){
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
}



// zblend.frag
uniform vec4 m_Color;

void main(){
    gl_FragColor = m_Color;
}

I’m halfway there.

When I was creating the files, I was creating just ‘empty files’ and calling them .j3md, .vert and .frag files.

It seems I have to create an ‘empty material definition’ file instead, and it’s working better now.

I don’t understand the difference though; on the surface everything I was doing seemed correct.