[solved] Texture.RTT_SOURCE_DEPTH fails on Radeon gfx card

I tried to get the depth buffer using

texture.setRTTSource(Texture.RTT_SOURCE_DEPTH);



I tested on ten or more machines, and only two of them fail:



org.lwjgl.opengl.OpenGLException: Invalid framebuffer operation (1286)
    at org.lwjgl.opengl.Util.checkGLError(Util.java:53)
    at org.lwjgl.opengl.Display.swapBuffers(Display.java:591)
    at org.lwjgl.opengl.Display.update(Display.java:609)
    at com.jme.renderer.lwjgl.LWJGLRenderer.displayBackBuffer(LWJGLRenderer.java:518)
    at com.jme.app.BaseGame.start(BaseGame.java:85)
    at jmetest.renderer.TestRenderToTexture.main(TestRenderToTexture.java:79)





As it turns out, the two failing machines have something in common: both use an ATI Radeon gfx card. All the machines that pass the test run on GeForce cards.



I tested this by appending the line below to TestRenderToTexture.java (at line 173); the sketch just below shows the surrounding setup.

fakeTex.setRTTSource(Texture.RTT_SOURCE_DEPTH); (Line 173)
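For context, here is roughly what the setup around that line looks like, as a minimal sketch of the jME 1.x TextureRenderer path (the exact createTextureRenderer arguments vary between jME versions, and display/fakeScene stand for the test's DisplaySystem and off-screen scene):

// Sketch only: render the off-screen scene's depth buffer into a texture.
TextureRenderer tRenderer = display.createTextureRenderer(
        512, 512, TextureRenderer.RENDER_TEXTURE_2D);

Texture fakeTex = new Texture();
fakeTex.setRTTSource(Texture.RTT_SOURCE_DEPTH); // copy depth, not color
tRenderer.setupTexture(fakeTex);

// Each frame: after this call, fakeTex holds the scene's depth values.
tRenderer.render(fakeScene, fakeTex);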



Gfx cards that failed the test:

  • ATI Radeon HD 3850 (new gfx card)
  • ATI Radeon X800 GTO (old gfx card)



I'd appreciate any help.  :wink:

Coincidentally, I'm working on a FrameBuffer RTT bug fix in my free time for a similar issue (also showing up on ATI cards). I'm not sure whether it will fix this problem, but the changes should be checked in within the next few days.

It might not be a jME bug; download a shadow mapping demo from the web and test whether depth render-to-texture works on those cards at all.

E.g.: http://www.codesampler.com/oglsrc/oglsrc_8.htm

Are you sure?



It says "nVIDIA specific features" right on that page!  :expressionless:

Ok, I've just checked some changes to FBO operation into CVS. They are meant to fix GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT, but perhaps they will help you too.
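For the record: GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT shows up when the FBO has no color attachment but the draw buffer still points at one. ATI drivers enforce that rule strictly, while NVIDIA's tend to let it slide, which matches the Radeon-only failures above. The usual workaround for a depth-only FBO is to set the draw and read buffers to GL_NONE while the FBO is bound. A minimal standalone LWJGL sketch, illustrative only (the 512x512 size is arbitrary, and this is not the actual jME diff):

import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL14;
import static org.lwjgl.opengl.EXTFramebufferObject.*;

// Create a depth texture to render into.
IntBuffer id = BufferUtils.createIntBuffer(1);
GL11.glGenTextures(id);
int depthTex = id.get(0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, depthTex);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL14.GL_DEPTH_COMPONENT24, 512, 512, 0,
        GL11.GL_DEPTH_COMPONENT, GL11.GL_UNSIGNED_INT, (ByteBuffer) null);

// Attach it as the only image of a new FBO.
id.rewind();
glGenFramebuffersEXT(id);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, id.get(0));
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
        GL11.GL_TEXTURE_2D, depthTex, 0);

// No color attachment, so disable color draws/reads. Without these two
// calls, strict drivers report GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT.
GL11.glDrawBuffer(GL11.GL_NONE);
GL11.glReadBuffer(GL11.GL_NONE);

int status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if (status != GL_FRAMEBUFFER_COMPLETE_EXT) {
    System.err.println("FBO incomplete: 0x" + Integer.toHexString(status));
}

The important part is that glDrawBuffer/glReadBuffer are set to GL_NONE before the completeness check; with a color attachment present they can stay at their defaults.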

It works perfectly!!! Impressive!!!

Thanks a lot  :slight_smile:

Nice to hear! :slight_smile: