FrameBuffer Help

Hello,
I was trying to get a framebuffer to render a quad to a texture but the texture appears to be completely black for some reason…
I am new to this, so is there anything obviously wrong that I am doing?
Code:
[java]
public void simpleInitApp() {

	Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
	mat.setColor("Color", new ColorRGBA(1,0,0,1));
	drawQuad(mat);
	ByteBuffer data = BufferUtils.createByteBuffer( (int) ((int)Math.ceil(Format.RGBA16F.getBitsPerPixel() / 8.0) * 256 * 64));
	Image img = new Image();
	img.setHeight(64);
	img.setWidth(256);
	img.setData(data);
	img.setFormat(Format.RGBA16F);
	Texture2D tex = new Texture2D(img);
	FrameBuffer fb = new FrameBuffer(256,64,1);
	fb.addColorTexture(tex);
	viewPort.setOutputFrameBuffer(fb); 
	drawQuad(mat);
}

public void drawQuad(Material mat){
    
	 Vector2f dl = new Vector2f(-1.0f, -1.0f);
	 Vector2f dr = new Vector2f(1.0f, -1.0f);
	 Vector2f ul = new Vector2f(-1.0f, 1.0f);
	 Vector2f ur = new Vector2f(1.0f, 1.0f);
	 
	 Vector2f[] vertices = new Vector2f[]{dl,dr,ul,ur};
	 
	 int[] index = new int[]{0,1,2, 1,3,2};
	 
	 Mesh m = new Mesh();
	 
	 m.setBuffer(Type.Position, 2, BufferUtils.createFloatBuffer(vertices));
	 m.setBuffer(Type.Index, 2, BufferUtils.createIntBuffer(index));
	 
	 Geometry geo = new Geometry("Mesh", m);
	 geo.setMaterial(mat);
	 
	 rootNode.attachChild(geo);
}

}
[/java]

Thanks for any help!


What are you doing with the output of the frame buffer? Are you trying to render it to a file? To the screen?

If your end goal is to render to the screen, you need to create a separate offscreen viewport to render into the texture. As it stands you switch the main viewport’s framebuffer, and that results in a black screen.
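
If it helps, here is a rough sketch of that offscreen-viewport setup, along the lines of the TestRenderToTexture example (offCamera, offView, offScene and the sizes are just placeholder names/values for illustration):

[java]
// pre-view that gets rendered before the main viewport
Camera offCamera = new Camera(256, 64);
ViewPort offView = renderManager.createPreView("Offscreen View", offCamera);
offView.setClearFlags(true, true, true);

// render into a texture instead of the screen
Texture2D offTex = new Texture2D(256, 64, Format.RGBA8);
FrameBuffer offBuffer = new FrameBuffer(256, 64, 1);
offBuffer.setDepthBuffer(Format.Depth);
offBuffer.setColorTexture(offTex);
offView.setOutputFrameBuffer(offBuffer);

// attach the scene holding the quad; the main viewport keeps its own framebuffer.
// Note: a scene attached to a manually created viewport must be updated by you
// (offScene.updateLogicalState / updateGeometricState in simpleUpdate).
Node offScene = new Node("Offscreen Scene");
offView.attachScene(offScene);
[/java]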

I am currently not doing anything with it… I am just using ImageRaster to print all of the pixels…

@joshngu9 said: I am currently not doing anything with it... I am just using ImageRaster to print all of the pixels...

I think she meant “What is your ultimate intent?” Maybe we can point you to something similar or closer to what you want that already works. I assume you’ve looked at the render to texture examples and stuff but I’m also not sure what you are trying to accomplish… so maybe those are irrelevant.

<cite>@pspeed said:</cite> I think she meant "What is your ultimate intent?" Maybe we can point you to something similar or closer to what you want that already works. I assume you've looked at the render to texture examples and stuff but I'm also not sure what you are trying to accomplish... so maybe those are irrelevant.
Ah, well ultimately I will be using these textures as lookup tables so they will be passed to shaders.

Take a look at the ImagePainter plugin and you can see how that works. I don’t do anything with FrameBuffers, I just modify the Image and add it to a Material.
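
For reference, CPU-side editing with ImageRaster looks roughly like this (a sketch only; the image size, the lookupTex name and the use of the Unshaded material’s ColorMap are just assumptions for illustration):

[java]
// build a small image and paint its pixels on the CPU
Image img = new Image(Format.RGBA8, 256, 64,
        BufferUtils.createByteBuffer(256 * 64 * 4));
ImageRaster raster = ImageRaster.create(img);
for (int y = 0; y < 64; y++) {
    for (int x = 0; x < 256; x++) {
        raster.setPixel(x, y, ColorRGBA.Red);
    }
}

// hand the image to a material as an ordinary texture
Texture2D lookupTex = new Texture2D(img);
mat.setTexture("ColorMap", lookupTex);
[/java]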

<cite>@zarch said:</cite> Take a look at the ImagePainter plugin and you can see how that works. I don't do anything with FrameBuffers, I just modify the Image and add it to a Material.
These textures are actually for atmospheric scattering and since the calculations are pretty intense it is much faster to do them in a pixel shader...

Ahh, I was confused by your reference to ImageRaster.

ImageRaster looks at the Image on the CPU; I’m not sure changes made on the GPU will be picked up by it.

Try putting the Image as the texture on a cube or something and see what it looks like.
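
Something like this, as a quick sanity check (assuming tex is the Texture2D from the first post):

[java]
// slap the render-target texture onto a box in the main scene
Geometry debugGeo = new Geometry("fb debug", new Box(1, 1, 1));
Material debugMat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
debugMat.setTexture("ColorMap", tex);
debugGeo.setMaterial(debugMat);
rootNode.attachChild(debugGeo);
[/java]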

<cite>@zarch said:</cite> Ahh, I was confused by your reference to ImageRaster.

ImageRaster looks at the Image on the CPU; I’m not sure changes made on the GPU will be picked up by it.

Try putting the Image as the texture on a cube or something and see what it looks like.


Hmm, after trying that it appears that the texture is still empty (black)…

Any other suggestions? I’d really like to figure this out… it would open so many doors.

Sorry about the English, by the way.

I’m still not sure what you are ultimately trying to achieve. If you are trying to render something to a texture, have you looked at the examples for rendering to a texture? If you are trying to do something else then please explain in detail what that is, because I still don’t have a clue. It’s also clear from the other answers that others are unsure as well.

<cite>@pspeed said:</cite> I'm still not sure what you are ultimately trying to achieve. If you are trying to render something to a texture, have you looked at the examples for rendering to a texture? If you are trying to do something else then please explain in detail what that is, because I still don't have a clue. It's also clear from the other answers that others are unsure as well.

I am trying to convert a large portion of Eric Bruneton’s precomputed atmospheric scattering from C++/OpenGL to jMonkeyEngine. The main way it works is by first generating lookup textures using frame buffers.
So, for example, I am trying to take this code:
[java]

glFramebufferTextureEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, transmittanceTexture, 0);
glViewport(0, 0, TRANSMITTANCE_W, TRANSMITTANCE_H);
glUseProgram(transmittanceProg);
drawQuad();
[/java]

and write its jMonkeyEngine equivalent. To start out I wanted to get a feel for frame buffers (which is the code seen in the first post). So I created a viewport, created a new node (not seen above), initialized a frame buffer, created a texture and set it as the frame buffer’s output, set the viewport’s frame buffer to the one I created, and then drew a quad with a certain material. To test that this was working I was writing out each pixel, but now I am setting a triangle’s texture to the one the quad should be drawn to… and still it appears not to be working. Sorry for any lack of clarity.

And I have looked into the example code.

I really appreciate the help.

Does the render to texture example work for you?
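
Also, one note on the pixel check: the Image’s CPU-side buffer is not updated by GPU rendering, so to print the pixels you have to read the frame buffer back explicitly. A rough sketch (fb is the FrameBuffer from the first post; this has to run after the offscreen viewport has actually been rendered, e.g. from a SceneProcessor’s postFrame, not directly in simpleInitApp):

[java]
// read the framebuffer contents back into main memory (8-bit RGBA readback)
ByteBuffer outBuf = BufferUtils.createByteBuffer(256 * 64 * 4);
renderManager.getRenderer().readFrameBuffer(fb, outBuf);

// wrap it in an Image so ImageRaster can inspect the pixels
Image readback = new Image(Format.RGBA8, 256, 64, outBuf);
ImageRaster raster = ImageRaster.create(readback);
System.out.println(raster.getPixel(0, 0));
[/java]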

Atmospheric Scattering will more than likely end up in a Filter… you can pre-render whatever you need in the postQueue method; just remember to grab a reference to the RenderManager in the initFilter method.

However, I’m a little lost on what you are pre-rendering. Are the images already created? If they are (which I am assuming is the case), you just need to pass these into the composite shader (or the shader last used in your Filter).

From the look of the GL calls above, they are using the rendered scene (from a frame buffer) and passing in the transmittanceTexture for use in the light scattering process.

EDIT: Hmmm… actually, it looks like the transmittanceTexture is a pre-rendered portion of the scene… I am guessing? Unless you can be a bit more specific about what it is you need to render for the final output, it is going to be almost impossible to point you in the right direction. All I do know is that this will end up being a Filter, so you may want to take a look at a few of the JME stock Filters to see how they set up and use FrameBuffers for both Passes and forced rendering.

EDIT 2: A quick and dirty example of forced rendering:
[java]
@Override
protected void postQueue(RenderQueue renderQueue) {
    Renderer r = rm.getRenderer();
    // Set the frame buffer to something other than the scene output fb
    r.setFrameBuffer(ghostBuffer);
    // Clear buffers
    r.clearBuffers(true, true, true);
    // Set a forced material (prolly don’t need this)
    rm.setForcedMaterial(ghostMat);
    // In this case I am rendering a geometrylist
    rm.renderGeometryList(geoms);
    // Clear the forced material
    rm.setForcedMaterial(null);
    // Reset the frame buffer back to the original
    r.setFrameBuffer(vp.getOutputFrameBuffer());
}
[/java]

This is later used in the composite fragment shader.
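
And to round it out, a rough sketch of a matching initFilter (the material definition path and the "GhostMap" parameter are made-up placeholders for this sketch, not real jME assets):

[java]
@Override
protected void initFilter(AssetManager manager, RenderManager renderManager,
        ViewPort viewPort, int w, int h) {
    // keep references for use in postQueue (see the example above)
    rm = renderManager;
    vp = viewPort;

    // offscreen target that postQueue renders into
    ghostTexture = new Texture2D(w, h, Format.RGBA8);
    ghostBuffer = new FrameBuffer(w, h, 1);
    ghostBuffer.setColorTexture(ghostTexture);

    // composite material; the pre-rendered texture is just a sampler uniform on it
    material = new Material(manager, "MatDefs/MyComposite.j3md");
    material.setTexture("GhostMap", ghostTexture);
}
[/java]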

<cite>@t0neg0d said:</cite> Atmospheric Scattering will more than likely end up in a Filter… you can pre-render whatever you need in the postQueue method; just remember to grab a reference to the RenderManager in the initFilter method.

Hi,
The images are not already created. A full-screen quad is created and the colors of this quad are set in the pixel shader using the atmospheric scattering calculations… it is then rendered to a texture. The final output of the texture is simply a table of values that are then looked up when applying the scattering. For example, the transmittance texture is created this way in the shader.

For more reference, here are the actual shaders that render it:

[java]
// vertex shader:
void main() {
    gl_Position = gl_Vertex;
}

// fragment shader:

float opticalDepth(float H, float r, float mu) {
    float result = 0.0;
    float dx = limit(r, mu) / float(TRANSMITTANCE_INTEGRAL_SAMPLES);
    float xi = 0.0;
    float yi = exp(-(r - Rg) / H);
    for (int i = 1; i <= TRANSMITTANCE_INTEGRAL_SAMPLES; ++i) {
        float xj = float(i) * dx;
        float yj = exp(-(sqrt(r * r + xj * xj + 2.0 * xj * r * mu) - Rg) / H);
        result += (yi + yj) / 2.0 * dx;
        xi = xj;
        yi = yj;
    }
    return mu < -sqrt(1.0 - (Rg / r) * (Rg / r)) ? 1e9 : result;
}

void main() {
    float r, muS;
    getTransmittanceRMu(r, muS);
    vec3 depth = betaR * opticalDepth(HR, r, muS) + betaMEx * opticalDepth(HM, r, muS);
    gl_FragColor = vec4(exp(-depth), 0.0); // Eq (5)
}

[/java]

I will look into the source for the filters, thanks for the help.

One thing I see that may be an issue is that your quad appears to be in normalized device coordinates yet is being drawn in world coordinates, so it is not scaled to the viewport… I’m not sure how you would use NDCs in jMonkeyEngine correctly though…

@joshngu9 what you’re trying to do looks a lot like a filter. Why don’t you use one?

<cite>@nehon said:</cite> @joshngu9 what you're trying to do looks a lot like a filter. Why don't you use one?
I was trying to keep to the original code as much as possible. I have been going off of this code: http://code.google.com/p/jmonkeyengine/source/browse/trunk/engine/src/test/jme3test/post/TestRenderToTexture.java which doesn't appear to use a filter.
<cite>@nehon said:</cite> @joshngu9 what you're trying to do looks a lot like a filter. Why don't you use one?

Sorry to bring this back up, but I had one more question: how would I go about creating a quad that fills the entire viewport? Ideally, is there a way to do this without having to adjust the camera position to ensure that the quad is completely in view? Thanks.

Really… you don’t want to use a filter? You’re kind of reinventing the wheel here…
Anyway, look at the renderProcess method in the FilterPostProcessor class.
Then look at post.vert, which projects the quad to full screen…
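
For a quick non-filter test you can also use Picture: it is a pre-made quad in the GUI (ortho) bucket, sized in pixels, so it covers exactly the area you give it regardless of the 3D camera. A rough sketch (mat is assumed to be the material whose fragment shader fills in the lookup table; for an offscreen pass you would size it to the framebuffer and attach it to the offscreen viewport instead of guiNode):

[java]
// full-screen quad in the GUI bucket; no camera fiddling needed
Picture fsQuad = new Picture("fullscreen quad");
fsQuad.setWidth(cam.getWidth());
fsQuad.setHeight(cam.getHeight());
fsQuad.setMaterial(mat);
guiNode.attachChild(fsQuad);
[/java]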