I have a filter with multiple passes that I apply to my scene and I’d like to access the framebuffer holding the result of the filter. Is there an easy way to do this?
So far I’ve tried:
- Using the framebuffer in the postFrame() method of the filter class… whose documentation explicitly states this is not what I want
- Creating a viewport, attaching the Filter, then attaching a SceneProcessor after that. In the scene processor’s postFrame() method I use the provided FrameBuffer out parameter
- Doing the same as attempt 2, but instead of using the out parameter, I grab the viewport’s getOutputFrameBuffer().
All of these attempts resulted in the framebuffer being filled with the scene from before the filter passes are applied.
I have also tried accessing viewport.getOutputFrameBuffer() in the main update loop after each of these methods.
Each time the framebuffer was just filled with the clear colour of the viewport.
Any help would be much appreciated!
You could look at the source for the ScreenshotAppState:
I don’t know if it will help, but I know sometimes these things are written without capturing the post effects and then get fixed to capture them. You could also look at the video capture state in the same package. I don’t know if they do it differently or not.
The thing is, if your filter is the last in the filter chain, it’s never rendered to a framebuffer; it’s rendered to the screen, unless your viewport is set up to render to a framebuffer instead of the screen.
The way to do what you want is to create your own FrameBuffer and set it on the viewport (viewport.setOutputFrameBuffer(yourFrameBuffer);), then add the FilterPostProcessor to this viewport.
The filters will be rendered to this frame buffer instead of the screen output.
Then you can do whatever you want with this frameBuffer
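A minimal sketch of this suggestion (the 1280×720 size, the RGBA8 format, and the names colorTarget/myFilter are my own assumptions, not from this thread; viewPort, assetManager, and myFilter are assumed to be in scope as in a SimpleApplication):

```java
import com.jme3.post.FilterPostProcessor;
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture2D;

// Create an offscreen target and make the viewport render into it
// instead of the screen.
Texture2D colorTarget = new Texture2D(1280, 720, Format.RGBA8);
FrameBuffer offscreen = new FrameBuffer(1280, 720, 1);
offscreen.setColorTexture(colorTarget);
offscreen.setDepthBuffer(Format.Depth);
viewPort.setOutputFrameBuffer(offscreen);

// Add the filters after redirecting the output; the last filter in the
// chain now renders into 'offscreen' rather than to the screen.
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(myFilter); // your multi-pass filter
viewPort.addProcessor(fpp);

// 'colorTarget' now holds the filtered result each frame and can be
// sampled by other materials or read back.
```
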
I tried your suggestion pspeed, but unfortunately the results were the same as I stated above. Looking at the source code it’s pretty much the same thing I am doing already.
I tried as you suggest, but it’s not working out how I expect. I’m not sure exactly what I am doing wrong, but here is the process I am using.
- I set up a pre-viewport which renders my root node into an MRT framebuffer
- I set up another viewport, which takes the MRT textures and applies the filters, and set its output to another framebuffer
- I use a scene processor to capture the output from the second viewport’s buffer using .getOutputFrameBuffer()
Result: the framebuffer contains just the clear colour of the viewport in step 2. Seemingly, the filters are not being applied and rendered into the getOutputFrameBuffer() framebuffer.
Note: if instead of applying the filter to the viewport in step 2 I apply it to the main viewport, the on-screen rendering is what I expect.
Here is my source code:
// FIRST VIEW PORT, RENDERS SCENE AND FILLS MRT BUFFERS
firstPassViewPort = renderManager.createPreView("firstPassViewPort", cam.clone());
// set other viewport field of view
float h = firstPassViewPort.getCamera().getFrustumTop();
float w = firstPassViewPort.getCamera().getFrustumRight();
float aspect = w / h;
float near = firstPassViewPort.getCamera().getFrustumNear();
h = FastMath.tan(CameraManager.FIELD_OF_VIEW * FastMath.DEG_TO_RAD * .5f) * near;
w = h * aspect;
firstPassViewPort.setClearFlags(true, true, true);
waterParticleFilter = new CFWaterParticleFilter(assetManager);
Texture2D realDepthData = new Texture2D(1280, 720, Format.Depth);
FrameBuffer firstPassViewportOutputFrameBuffer = new FrameBuffer(1280, 720, 1);
// WATER VIEW PORT: TAKES MRT VALUES FROM FIRST VIEWPORT AND PUTS THEM THROUGH FILTER
waterPassViewPort = renderManager.createMainView("waterPassViewPort", cam.clone());
waterPassViewPort.setClearFlags(true, true, true);
Texture2D realDepthData2 = new Texture2D(1280, 720, Format.Depth);
FrameBuffer waterPassViewPortOutputFrameBuffer = new FrameBuffer(1280, 720, 1);
Texture2D colorTex = new Texture2D(1280, 720, Format.RGB8);
CFFilterPostProcessor fpp = new CFFilterPostProcessor(assetManager);
waterPassViewPort.addProcessor(fpp); // NOTE:: CHANGE THIS TO VIEWPORT.ADD TO SEE THE OUTPUT OF THE FILTER
viewPort.detachScene(rootNode); // root node is handled by the first pass viewport
// our main viewport is now doing nothing. It is relying on the first pass viewport
// to render everything, and then the water view port to apply the filtering
// Create screen shot machine
videoRecordingManager = new VideoRecordingManager(stateManager, this);
I’m not too sure why the output from waterPassViewPort.getOutputFrameBuffer() is just filled with the ColorRGBA.red clear colour.
I tried your suggestion pspeed, but unfortunately the results were the same as I stated above. Looking at the source code it's pretty much the same thing I am doing already.
Didn't work how?
I'm not sure what else to say. ScreenshotAppState _definitely_ captures post-processing effects because I save screen shots that way all the time and they definitely have all post effects in them. So I must have misunderstood something and really nehon is the expert in this area.
pspeed - What’s happening is it’s capturing the scene after it’s initially rendered, but before any filters are applied to it. From what nehon said about the last filter being rendered to the screen, this makes sense. Since I only have 1 filter in my scene, it’s the last one and therefore it’s never actually getting rendered to any frame buffer, it’s going straight to the screen.
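If the root of the problem really is that the last filter goes straight to the screen, one possible workaround (my suggestion, not something proposed in this thread) is to make sure the filter you care about is not last in the chain, e.g. by appending a do-nothing pass-through filter after it. jME’s FadeFilter at its default fade value of 1 should leave the image unchanged:

```java
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.FadeFilter;

// assetManager, viewPort and waterParticleFilter as in the code above
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(waterParticleFilter); // the filter whose output we want to capture

// A trailing pass-through filter: FadeFilter defaults to a fade value
// of 1 (fully visible), so it does not change the image, but its
// presence means waterParticleFilter is no longer last, so its result
// is rendered into a framebuffer instead of directly to the screen.
fpp.addFilter(new FadeFilter());

viewPort.addProcessor(fpp);
```
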
What’s confusing me now though, is in my current setup I’ve set up 3 viewports and still the results aren’t as I would hope:
FirstPassViewPort - Renders the rootnode to a MRT framebuffer I set up
WaterPassViewPort - Has a filter applied to it, which processes the MRT textures, and outputs the result (this is what I want to capture)
DefaultViewPort - The default viewport created in an app. The rootnode is detached. So what I am getting on screen is nothing… well, except the clear colour (in my case a blue colour)
Using this setup here are the results of screen captures:
- The scene without any filter applied; this is expected, since the filter is on the water viewport
- Just the clear colour; this is unexpected, since the filter is on this viewport I would expect it to be the filtered image
- Just the clear colour; this is expected, because this viewport doesn’t do anything
If I change the filter to be on the default viewport, I get this output (captured using print screen):
This is the image I expect the waterviewport to have, and I am not sure why it doesn’t have it.
I have four post-processing filters that are various levels of enabled or disabled at different times. The last one in the chain is a filter that just renders a radial fade over the entire scene: http://i.imgur.com/iAVrT.png
If I attach the screen shot app state and take a screen shot with that last post-processing filter enabled, then I get a picture of it. Note the darkened screen in this screen shot:
So there must be something about my setup that’s different or there’s something specific to your setup that’s making it fail. I assume you tried taking a screen shot with the screenshot state and are referring to that when saying that it isn’t working.
…or this is all irrelevant and even if screen shots are taken correctly it’s a separate issue. I don’t know.
pspeed, you have made my life complete, it’s working now. This thing was driving me up the wall!!
The problem was that I was using the ScreenshotAppState wrong. I used it like this, adding it directly to the viewport as a processor:
ScreenshotAppState ssas = new ScreenshotAppState();
viewPort.addProcessor(ssas);
When used like that, it goes into SceneProcessor mode and ignores the filters.
However, when used correctly, like this:
ScreenshotAppState ssas = new ScreenshotAppState();
stateManager.attach(ssas);
the filters are all applied.
I never would have found that if you hadn’t constantly insisted something was whacked out on my end hahaha. Thanks very much!
Also, pretty slick menu
For those who have actually played the game, it’s an evil evil tease. :evil: When they right click on an object they get a menu without the “Physics” item.