Offscreen Antialiasing

Hello,

I’m rendering a scene to an offscreen buffer and saving it to an image file, but I’m having problems setting up anti-aliasing for it.

This is the code I use to create the buffer and viewport:

[java]
renderBuffer = new FrameBuffer(width, height, samples);
renderBuffer.setDepthBuffer(Image.Format.Depth);
renderBuffer.setColorBuffer(Image.Format.RGBA8);

renderViewPort = new ViewPort("SpriteRenderer", renderCam);
renderViewPort.setClearFlags(true, true, true);
// renderViewPort.setBackgroundColor(new ColorRGBA(0, 0, 0, 0));
renderViewPort.setBackgroundColor(ColorRGBA.Green); // for testing only; the end result should have a transparent background
renderViewPort.setOutputFrameBuffer(renderBuffer);
[/java]

I’ve tried several values for the samples parameter of the FrameBuffer, but as soon as I set it higher than 1, I end up with an empty buffer (not even the scene’s background color, just transparency) in my scene processor’s postFrame method. It renders fine with samples set to 1 or 0, but obviously without the smooth edges.
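
For what it’s worth, a quick caps check like the following (just a sketch of my own, not part of my actual code; Caps is com.jme3.renderer.Caps) should at least confirm whether multisampled framebuffers are supported at all:

[java]
// Hypothetical diagnostic: fall back to a single sample if the
// hardware/driver doesn't report support for multisampled framebuffers.
if (!renderer.getCaps().contains(Caps.FrameBufferMultisample)) {
    samples = 1;
}
[/java]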

I’ve also tried the FXAAFilter (with default parameters) instead, but it doesn’t play well with a transparent background color, which is essential for what I’m trying to do (rendering a 2D sprite from a 3D model). And even with FXAA I didn’t get any smoother edges on the rendered sprite.
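
For reference, this is roughly how I attached it (standard FilterPostProcessor usage; assetManager comes from my application):

[java]
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
fpp.addFilter(new FXAAFilter()); // default parameters
renderViewPort.addProcessor(fpp);
[/java]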

For completeness’ sake, here’s the code I use to save the image (taken and adapted from ScreenshotAppState):

[java]
@Override
public void postFrame(FrameBuffer frameBuffer) {
	try {
		FileOutputStream out = new FileOutputStream(new File("test.png"));
		ByteBuffer outBuffer = BufferUtils.createByteBuffer(width * height * 4);

		// remember the camera's viewport so it can be restored after the read-back
		int viewX = (int) (renderCam.getViewPortLeft() * renderCam.getWidth());
		int viewY = (int) (renderCam.getViewPortBottom() * renderCam.getHeight());
		int viewWidth = (int) ((renderCam.getViewPortRight() - renderCam.getViewPortLeft()) * renderCam.getWidth());
		int viewHeight = (int) ((renderCam.getViewPortTop() - renderCam.getViewPortBottom()) * renderCam.getHeight());

		// read the whole buffer, then restore the original viewport
		renderer.setViewPort(0, 0, width, height);
		renderer.readFrameBuffer(frameBuffer, outBuffer);
		renderer.setViewPort(viewX, viewY, viewWidth, viewHeight);

		JmeSystem.writeImageFile(out, "png", outBuffer, width, height);

		out.close();
	} catch (java.io.IOException e) {
		e.printStackTrace();
	}
}

[/java]

Does anyone know what I’m doing wrong?

I am also interested in an answer to this problem.

The problem is not the rendering; the problem is reading back from the framebuffer.
From what I remember, you can’t read directly from a multisampled framebuffer before OpenGL 3.1; once it’s been rendered, you need to copy it to a single-sampled framebuffer and then read from the latter.

See how it’s done in the FilterPostProcessor.
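
A minimal sketch of that pattern (the buffer names are mine; the calls are the same ones the FilterPostProcessor uses):

[java]
// Multisampled buffer the viewport renders into; it can't be read back directly.
FrameBuffer msBuffer = new FrameBuffer(width, height, samples);
msBuffer.setDepthBuffer(Image.Format.Depth);
msBuffer.setColorBuffer(Image.Format.RGBA8);
viewPort.setOutputFrameBuffer(msBuffer);

// Single-sampled buffer used purely as a resolve target.
FrameBuffer resolved = new FrameBuffer(width, height, 1);
resolved.setColorBuffer(Image.Format.RGBA8);

// After rendering: resolve the samples into the plain buffer, then read that one.
renderer.copyFrameBuffer(msBuffer, resolved);
renderer.readFrameBuffer(resolved, outBuffer);
[/java]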

Okay, thanks, that seems to have solved the problem.

It’s a little weird that it didn’t work before, as I’m running on OpenGL 4.4.0 according to the logs and the render caps do contain OpenGL31, but I simply forced the buffer copy now, which works perfectly fine for my use case.

…except that the buffer stays empty until renderer.renderViewPort (and thus postFrame) is called for the second time. Is there some magic happening in the background that I don’t understand and will have to work around, or did I do something wrong?

In addition to the code above, I now also create a single-sampled framebuffer; my output buffer is still the multisampled one, and in postFrame I make a call to copyFrameBuffer.

[java]
// … creating the multisampled buffer during init; I tried multiple values for samples: 2, 4, 8, 16 …
renderBufferMS = new FrameBuffer(width, height, samples);
renderBufferMS.setDepthBuffer(Image.Format.Depth);
renderBufferMS.setColorBuffer(Image.Format.RGBA8);

// … creating the single-sampled buffer during init …
renderBuffer = new FrameBuffer(width, height, 1);
renderBuffer.setDepthBuffer(Image.Format.Depth);
renderBuffer.setColorBuffer(Image.Format.RGBA8);

// … setting up viewport …
renderViewPort.setClearFlags(true, true, true);
renderViewPort.setBackgroundColor(new ColorRGBA(0, 0, 0, 0));
renderViewPort.setOutputFrameBuffer(renderBufferMS);

// … postFrame() …
renderer.copyFrameBuffer(renderBufferMS, renderBuffer);

// … then reading the buffer into a ByteBuffer to save as a PNG afterwards, also in postFrame() …
renderer.readFrameBuffer(renderBuffer, outBuffer);
[/java]

This only happens when my renderBufferMS is indeed a multisampled buffer; with samples set to 1, the buffer gets filled on the first call.

By the way, this is the line that had to be doubled for renderBuffer not to end up empty, for anyone who runs into the same problem or would like to investigate:

[java]renderer.copyFrameBuffer(renderBufferMS, renderBuffer);[/java]

I looked at the code behind it but have no idea why it doesn’t work on the first call; at this point, though, I really don’t care anymore whether I have one call or two.
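
For clarity, the relevant part of my postFrame() now effectively looks like this (the duplicated call is the workaround):

[java]
// The copy has to be issued twice; with a single call, renderBuffer
// stays empty on the first frame for reasons I don't understand.
renderer.copyFrameBuffer(renderBufferMS, renderBuffer);
renderer.copyFrameBuffer(renderBufferMS, renderBuffer);
renderer.readFrameBuffer(renderBuffer, outBuffer);
[/java]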

I just faced the same issue. The workaround I used is not to double the copyFrameBuffer() line at the point where I actually need to copy the buffer; instead, I added the following line in simpleInitApp(), right after initializing my single-sampled buffer:

[java]renderer.copyFrameBuffer(null, screenFrameBuffer, true, true);[/java]

It does the trick, but why do we have to do it?