Render To Texture : image data is null

hello,

I want to save a texture (render to texture) to a PNG, but the image data is always null, although I can use texture.getImage() to set it as a NiftyGUI image element, and it displays correctly:

// texture.getImage().getData(0) is null at this point, but it still works
NiftyImage img = new NiftyImage(nifty.getRenderEngine(), new RenderImageJme(texture));

I generate the texture like this:

        Camera offCamera = new Camera(width, height);
        offView = renderManager.createPreView("Offscreen View", offCamera);
        offView.setClearFlags(true, true, true);
        offView.setBackgroundColor(ColorRGBA.DarkGray);

        // create offscreen framebuffer
        FrameBuffer offBuffer = new FrameBuffer(width, height, 1);

        // setup framebuffer's cam
        offCamera.setFrustumPerspective(45f, 1f, 1f, 1000f);
        offCamera.setLocation(new Vector3f(5f, 5f, 5f));
        offCamera.lookAt(new Vector3f(0f, 0.5f, 0f), Vector3f.UNIT_Y);

        // setup framebuffer's texture
        offTex = new Texture2D(width, height, Format.RGBA8);
        offTex.setMinFilter(Texture.MinFilter.Trilinear);
        offTex.setMagFilter(Texture.MagFilter.Bilinear);

        // setup framebuffer to use texture
        offBuffer.setDepthBuffer(Format.Depth);
        offBuffer.setColorTexture(offTex);

        // set viewport to render to offscreen framebuffer
        offView.setOutputFrameBuffer(offBuffer);

        offScreenNode = new Node("OffscreenNode");
        offView.attachScene(offScreenNode);

        offScreenNode.updateLogicalState(0.1f);
        offScreenNode.updateGeometricState();

        //offTex.getImage().setUpdateNeeded();
        offTex.getImage().getData(0); // <-- returns null
        

Am I missing something?

Thanks

If your texture comes from a frame buffer, the data is on the GPU and null on the CPU.
You need to call the renderer.readFrameBuffer method to copy the data into your ByteBuffer.

oh, ok …

I do this, but I get a black image file:

renderManager.getRenderer().setViewPort(0, 0, offBuffer.getWidth(), offBuffer.getHeight());
ByteBuffer outBuf = BufferUtils.createByteBuffer(offBuffer.getWidth() * offBuffer.getHeight() * 4);
renderManager.getRenderer().readFrameBuffer(offBuffer, outBuf);

How do you visualize the resulting image?

In Windows Explorer.
I checked the center pixel value of outBuf; it is 0.
So glReadPixels reads back black pixels into the ByteBuffer… wtf…
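As a side note, checking a single pixel in a tightly packed RGBA readback buffer boils down to computing its byte offset. A minimal plain-Java sketch (the class name `PixelProbe` and the layout assumptions are my own illustration, not engine API):

```java
import java.nio.ByteBuffer;

public class PixelProbe {
    /** Reads the RGBA bytes of pixel (x, y) from a tightly packed RGBA buffer. */
    static int[] readPixel(ByteBuffer buf, int width, int x, int y) {
        int offset = (y * width + x) * 4; // 4 bytes per RGBA pixel
        return new int[] {
            buf.get(offset) & 0xFF,      // R
            buf.get(offset + 1) & 0xFF,  // G
            buf.get(offset + 2) & 0xFF,  // B
            buf.get(offset + 3) & 0xFF   // A
        };
    }

    public static void main(String[] args) {
        int w = 4, h = 4;
        ByteBuffer buf = ByteBuffer.allocate(w * h * 4); // zero-filled, i.e. all black
        // paint the center pixel (2, 2) opaque red
        int off = (2 * w + 2) * 4;
        buf.put(off, (byte) 255).put(off + 3, (byte) 255);
        int[] px = readPixel(buf, w, 2, 2);
        System.out.println(px[0] + "," + px[1] + "," + px[2] + "," + px[3]); // prints 255,0,0,255
    }
}
```

An all-zero center pixel therefore really does mean the buffer came back black, not that the read offset was wrong.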

If I give a null buffer to readFrameBuffer, it correctly reads the main game screen and the texture is saved correctly.

So somehow the offBuffer is not taken into account when I provide it.

It seems that preparing the offscreen renderer and saving the texture straight after does not work.
But if I save the texture before releasing the offscreen renderer (later on), then the framebuffer contains the actual data and the texture is saved correctly.

I don't know why.

Now you mention it, I had a similar issue that I never really sorted out.
I worked around it by reading my frame buffer after the app had done a few iterations of the render cycle (I waited about 10 frames). That's a nasty workaround, but I didn't want to get sidetracked.
Maybe you hit the same kind of issue.
@Momoko_Fan any idea on this?
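For what it's worth, that "wait N frames" workaround can be sketched as a tiny counter that fires a callback after N update ticks, called once per render cycle from the app's update loop (plain Java; the class name `DelayedReadback` and the frame count are my own illustration, not jME API):

```java
public class DelayedReadback {
    private final int framesToWait;
    private final Runnable readback;
    private int frames;
    private boolean done;

    public DelayedReadback(int framesToWait, Runnable readback) {
        this.framesToWait = framesToWait;
        this.readback = readback;
    }

    /** Call once per render cycle, e.g. from simpleUpdate(); fires the readback once. */
    public void update() {
        if (!done && ++frames >= framesToWait) {
            readback.run(); // by now the viewport should have been rendered at least once
            done = true;
        }
    }

    public static void main(String[] args) {
        DelayedReadback d = new DelayedReadback(10, () -> System.out.println("read frame buffer now"));
        for (int i = 0; i < 12; i++) d.update(); // fires exactly once, on the 10th call
    }
}
```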

yep that sounds like it

Not sure what the problem is. When using external viewports, the data is only available after the viewport has been rendered.

Well, if you check the code above (it comes from the render-to-texture example):
is there a way to tell when the viewport has been rendered, so I can be sure?

Yeah, but where is this in relation to actually rendering the viewport?

Well, one sets up an offscreen framebuffer, camera, scene… and then it gets magically rendered somehow, at some point, in the background.

So I need to know when the framebuffer has been rendered to, so I can save the picture.

It works with ScreenshotAppState (because the game has been running for some time), so it should work with an offscreen framebuffer too.

http://javadoc.jmonkeyengine.org/com/jme3/app/state/AppState.html#postRender()

Edit: or if you need it sooner than that, I guess you can register your own scene processor with the viewport or something.
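A minimal sketch of the postRender route (assuming jME3's AbstractAppState; `ReadbackAppState`, the one-shot `done` flag, and passing in the framebuffer via the constructor are my own illustration, and the PNG-writing step is elided):

```java
import java.nio.ByteBuffer;

import com.jme3.app.Application;
import com.jme3.app.state.AbstractAppState;
import com.jme3.app.state.AppStateManager;
import com.jme3.renderer.RenderManager;
import com.jme3.texture.FrameBuffer;
import com.jme3.util.BufferUtils;

public class ReadbackAppState extends AbstractAppState {
    private final FrameBuffer offBuffer;
    private RenderManager renderManager;
    private boolean done;

    public ReadbackAppState(FrameBuffer offBuffer) {
        this.offBuffer = offBuffer;
    }

    @Override
    public void initialize(AppStateManager stateManager, Application app) {
        super.initialize(stateManager, app);
        renderManager = app.getRenderManager();
    }

    // runs after the frame (including pre-views) has been rendered
    @Override
    public void postRender() {
        if (done) return;
        done = true;
        ByteBuffer outBuf = BufferUtils.createByteBuffer(
                offBuffer.getWidth() * offBuffer.getHeight() * 4);
        renderManager.getRenderer().readFrameBuffer(offBuffer, outBuf);
        // ... convert outBuf to an image and write the PNG here ...
    }
}
```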

Of course, create an AppState for it… thanks. I still need to "think in JME" :slight_smile:

You can also use SceneProcessors, which is what they were intended for. In the postFrame callback you get the output framebuffer.
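A minimal sketch of that approach (assuming jME3's SceneProcessor interface; the class name `CaptureProcessor` and the one-shot `capture` flag are my own, and the PNG-writing step is elided):

```java
import java.nio.ByteBuffer;

import com.jme3.post.SceneProcessor;
import com.jme3.renderer.RenderManager;
import com.jme3.renderer.ViewPort;
import com.jme3.renderer.queue.RenderQueue;
import com.jme3.texture.FrameBuffer;
import com.jme3.util.BufferUtils;

public class CaptureProcessor implements SceneProcessor {
    private RenderManager rm;
    private boolean capture = true;

    public void initialize(RenderManager rm, ViewPort vp) { this.rm = rm; }
    public void reshape(ViewPort vp, int w, int h) {}
    public boolean isInitialized() { return rm != null; }
    public void preFrame(float tpf) {}
    public void postQueue(RenderQueue rq) {}

    // called after the viewport has been rendered, so the data is available
    public void postFrame(FrameBuffer out) {
        if (!capture) return;
        capture = false;
        ByteBuffer buf = BufferUtils.createByteBuffer(
                out.getWidth() * out.getHeight() * 4);
        rm.getRenderer().readFrameBuffer(out, buf);
        // ... convert buf to an image and write the PNG here ...
    }

    public void cleanup() {}
}
```

It would be registered on the offscreen viewport, e.g. `offView.addProcessor(new CaptureProcessor());` right after creating the viewport in the setup code above.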

Is there an example somewhere?

Yes, ScreenshotAppState: