Is it possible to pass a rendered frame to a shader and have the output of the shader returned to a frame buffer without rendering it to a full screen quad in JME?
I’d like to use shaders for image manipulation… but this has nothing to do with the current scene… so there is no need to render anything.
Eh… found it… nm!
Would be cool if you could explain what you found for others finding this thread
@normen said:
Would be cool if you could explain what you found for others finding this thread :)
Yeah... it really would be. But, it didn't work as expected :( I'm still at a loss... you have any suggestions??
I tried setting a Pass spitting the output out to a frame buffer and yadda yadda... before I remembered that the entire scene was just being re-rendered even though I wasn't using it >.<
@t0neg0d said:
Yeah... it really would be. But, it didn't work as expected :( I'm still at a loss... you have any suggestions??
Not really, since the data is in the GPU it has to be rendered somehow to get out of there. It's normally meant to be sent to a VGA or DVI port :)
@normen said:
Not really, since the data is in the GPU it has to be rendered somehow to get out of there. It's normally meant to be sent to a VGA or DVI port :)
Seeeeeew... 2d graphic manipulation of the scene buffer's output is the only option I take it?
@t0neg0d said:
Seeeeeew... 2d graphic manipulation of the scene buffer's output is the only option I take it?
Yeah, you might be able to get the depth buffer out too, but I don't know the details, I'm afraid. I'd start with the render-to-memory stuff to check out where it gets its data from.
What do you want to do with the resultant data?
@Momoko_Fan said:
What do you want to do with the resultant data?
The more I think about it... the more ideas come to mind. But originally, it was for the lens flare. I wanted to split the process into first generating a light map from the rendered scene (I could probably build the halo, ghosts and flare at this time as well) and then passing that into the filter as a separate texture. There are a couple of parts of the process that start using repetitive texture lookups that wouldn't be necessary if the light map were already a texture. There are probably other (more than likely better) ways of doing this... but I got sidetracked trying to figure out how to do it (even if the "why to do it" is wrong).
EDIT: I should probably mention that the lightmap filters out every color under a certain threshold from the textured scene.
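In case the thresholding idea isn't clear, here's a plain-Java sketch of it (not jME API, just the math): keep pixels whose luminance is at or above the threshold and black out the rest. The luminance weights and method names are my own illustration, not from the actual shader:

```java
// Plain-Java sketch of the light-map thresholding idea (hypothetical, not jME API):
// keep bright pixels, zero out everything under the threshold.
public class ThresholdSketch {
    // rgb components in [0,1]; returns the pixel if bright enough, else black
    static float[] threshold(float r, float g, float b, float cutoff) {
        // Rec. 709 luminance weights
        float luma = 0.2126f * r + 0.7152f * g + 0.0722f * b;
        return luma >= cutoff ? new float[]{r, g, b} : new float[]{0f, 0f, 0f};
    }

    public static void main(String[] args) {
        float[] bright = threshold(0.9f, 0.9f, 0.8f, 0.85f); // luma ~0.89, kept
        float[] dim = threshold(0.3f, 0.2f, 0.1f, 0.85f);    // luma ~0.21, filtered out
        System.out.println(bright[0] + " " + dim[0]);
    }
}
```

The real filter presumably does the equivalent per-fragment in GLSL; this just shows what "filters out every color under a certain threshold" means numerically.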
@t0neg0d said:
The more I think about it... the more ideas come to mind. But originally, it was for the lens flare. I wanted to split the process into first generating a light map from the rendered scene (I could probably build the halo, ghosts and flare at this time as well) and then passing that into the filter as a separate texture. There are a couple of parts of the process that start using repetitive texture lookups that wouldn't be necessary if the light map were already a texture. There are probably other (more than likely better) ways of doing this... but I got sidetracked trying to figure out how to do it (even if the "why to do it" is wrong).
EDIT: I should probably mention that the lightmap filters out every color under a certain threshold from the textured scene.
I thought that mechanically this is kind of what bloom did. Render the scene regular, render the scene as a glow map, combine the two in post processing. Or are you trying to render the flares not in post-proc?
@pspeed said:
I thought that mechanically this is kind of what bloom did. Render the scene regular, render the scene as a glow map, combine the two in post processing. Or are you trying to render the flares not in post-proc?
I'm trying to avoid rendering anything in the scene more than once but still use a shader to manipulate what's in the scene buffer. Kind of like using two post filters back to back... except the output of the first one wouldn't be rendered to the scene buffer.
Thus far, I've only managed to a) overwrite the rendered scene or b) render the entire scene twice unnecessarily.
To be honest... the lens flare isn't why I am interested in figuring out how to do this, but I really can't take it much further without it. I've seen a couple of articles now that discuss splitting up processes that have nothing to do with geometries being rendered, but none of them explain how. I'm guessing this must be a fairly common thing using GLSL and C++, or you'd think someone would mention how to do it.
@t0neg0d said:
I'm trying to avoid rendering anything in the scene more than once but still use a shader to manipulate what's in the scene buffer. Kind of like using two post filters back to back... except the output of the first one wouldn't be rendered to the scene buffer.
Err, you mean like the FilterPostProcessor?
@normen said:
Err, you mean like the FilterPostProcessor?
Yep... But I picked that as a random example. I don't care if it is that or a Pass or ?? Just any way of getting the output that isn't written to the scene buffer as well. I don't mind rendering to a quad or whatever. I just couldn't figure out a way to bypass this.
I didn't see a way of setting the output buffer for the filter... I probably missed it.
The FilterPostProcessor allows multiple shaders to work on the same rendering, using only one quad.
@normen said:
The FilterPostProcessor allows multiple shaders to work on the same rendering, using only one quad.
How? where? why? when? lol... Can you give me a hint? ;)
@t0neg0d said:
How? where? why? when? lol... Can you give me a hint? ;)
I'd start out with looking at one of the filters that work with it ;) So basically all those you can add and edit in the SDK. They all use a FilterPostProcessor that's saved as an fpp file. Can't give you much more detail as that's where I'd have to start too, and I'm in bed already ;)
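To illustrate what the FilterPostProcessor buys you conceptually: each filter's pass transforms the previous pass's output on the same quad, and only the final result reaches the screen. Here's a plain-Java analogy of that chaining (my own sketch, no jME types involved):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Plain-Java analogy of chaining post filters: each "pass" transforms the
// previous pass's output; intermediate results never hit the screen buffer.
public class FilterChainSketch {
    static float[] runChain(float[] sceneBuffer, List<UnaryOperator<float[]>> passes) {
        float[] current = sceneBuffer; // scene rendered once, up front
        for (UnaryOperator<float[]> pass : passes) {
            current = pass.apply(current); // off-screen intermediate
        }
        return current; // only this reaches the output
    }

    public static void main(String[] args) {
        List<UnaryOperator<float[]>> passes = new ArrayList<>();
        // pass 1: a "light map" style pass that zeroes values under a threshold
        passes.add(buf -> {
            float[] out = new float[buf.length];
            for (int i = 0; i < buf.length; i++) out[i] = buf[i] >= 0.85f ? buf[i] : 0f;
            return out;
        });
        // pass 2: a "combine" style pass that brightens what survived
        passes.add(buf -> {
            float[] out = new float[buf.length];
            for (int i = 0; i < buf.length; i++) out[i] = Math.min(1f, buf[i] * 1.5f);
            return out;
        });
        float[] result = runChain(new float[]{0.9f, 0.3f}, passes);
        System.out.println(result[0] + " " + result[1]);
    }
}
```

The scene is only "rendered" once; everything after that is buffer-to-buffer, which is the behavior being asked for in this thread.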
Argh… Ok. I finally figured out why my previous attempts to use Passes weren't working /sigh.
[java]lightMap = new Pass() {
    @Override
    public boolean requiresDepthAsTexture() {
        return false;
    }
    @Override
    public boolean requiresSceneAsTexture() { // I wasn't doing this >.<
        return true;
    }
};[/java]
Anyways… in case someone else is trying to manipulate the rendered scene as a texture… here is an example:
[java]
@Override
public void initFilter(AssetManager manager, RenderManager renderManager, ViewPort vp, int w, int h) {
    am = manager;
    rm = renderManager;
    this.vp = vp;
    this.w = w;
    this.h = h;
    // This needs to be initialized...
    postRenderPasses = new ArrayList<Pass>();
    matLightMap = new Material(am, "MatDefs/LensFlareLightMap.j3md");
    matLightMap.setFloat("Threshold", 0.85f);
    Texture tex_LensDirt = manager.loadTexture("Textures/lensdirt.png");
    tex_LensDirt.setMinFilter(MinFilter.NearestNoMipMaps);
    tex_LensDirt.setMagFilter(MagFilter.Nearest);
    tex_LensDirt.setWrap(WrapMode.Repeat);
    // If you need the rendered scene as a texture, do the following...
    lightMap = new Pass() {
        @Override
        public boolean requiresDepthAsTexture() {
            return false;
        }
        @Override
        public boolean requiresSceneAsTexture() { // This is important!
            return true;
        }
    };
    lightMap.init(rm.getRenderer(), w, h, Format.RGBA8, Format.Depth, 1, matLightMap);
    lightMap.getRenderedTexture().setMinFilter(Texture.MinFilter.BilinearNearestMipMap);
    lightMap.getRenderedTexture().setMagFilter(Texture.MagFilter.Bilinear);
    postRenderPasses.add(lightMap);
    material = new Material(manager, "MatDefs/LensFlare.j3md");
    material.setInt("Samples", 6);
    material.setTexture("LensDirt", tex_LensDirt);
    material.setFloat("Flare", .18f);
    material.setFloat("Halo", .42f);
    // Add the output of the previous pass as a texture
    material.setTexture("LightMap", lightMap.getRenderedTexture());
}[/java]