Framebuffer object format not supported

I have sent my application (jme3) to a friend to test and unfortunately he is not able to run it. He is getting “java.lang.IllegalStateException: Framebuffer object format is unsupported by the video hardware.”. Now, there is only a limited amount of debugging I can do remotely on his machine (even deploying a new version of the code is a pain), so I thought that maybe you will know the problem out of the box? He has a very reasonable GeForce card, so I don’t think that is the problem. Important parts of the log file:




2011-01-18 00:49:31 com.jme3.system.lwjgl.LwjglAbstractDisplay run
INFO: Using LWJGL 2.5
2011-01-18 00:49:31 com.jme3.system.lwjgl.LwjglDisplay createContext
INFO: Selected display mode: 1024 x 768 x 0 @0Hz
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: Display created.
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: Adapter: nv4_disp
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: Driver Version: 6.14.11.9745
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: Vendor: NVIDIA Corporation
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: OpenGL Version: 3.2.0
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: Renderer: GeForce 9800 GT/PCI/SSE2/3DNOW!
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglAbstractDisplay initInThread
INFO: GLSL Ver: 1.50 NVIDIA via Cg compiler
2011-01-18 00:49:32 com.jme3.system.lwjgl.LwjglTimer
INFO: Timer resolution: 1000 ticks per second
2011-01-18 00:49:32 com.jme3.renderer.lwjgl.LwjglRenderer initialize
INFO: Caps: [FrameBuffer, FrameBufferMRT, FrameBufferMultisample, TextureMultisample, OpenGL20, OpenGL21, OpenGL30, OpenGL31, OpenGL32, ARBprogram, GLSL100, GLSL110, GLSL120, GLSL130, GLSL140, GLSL150, VertexTextureFetch, TextureArray, TextureBuffer, FloatTexture, FloatColorBuffer, FloatDepthBuffer, PackedFloatTexture, SharedExponentTexture, PackedFloatColorBuffer,TextureCompressionLATC, MeshInstancing, VertexBufferArray]

[...]

2011-01-18 00:49:37 com.jme3.renderer.lwjgl.LwjglRenderer setFrameBuffer
SEVERE: Problem FBO:
FrameBuffer[format=1024x768x1, drawBuf=0]
Depth => BufferTarget[format=Depth]
Color(0) => TextureTarget[format=RGBA8]

2011-01-18 00:49:37 com.jme3.app.Application handleError
SEVERE: Uncaught exception thrown in Thread[LWJGL Renderer Thread,5,main]
java.lang.IllegalStateException: Framebuffer object format is unsupported by the video hardware.
at com.jme3.renderer.lwjgl.LwjglRenderer.checkFrameBufferError(LwjglRenderer.java:1112)
at com.jme3.renderer.lwjgl.LwjglRenderer.setFrameBuffer(LwjglRenderer.java:1363)
at com.jme3.renderer.RenderManager.renderViewPort(RenderManager.java:716)
at com.jme3.renderer.RenderManager.render(RenderManager.java:757)
at abies.vmat.Vmat.update(Vmat.java:252)
at com.jme3.system.lwjgl.LwjglAbstractDisplay.runLoop(LwjglAbstractDisplay.java:144)
at com.jme3.system.lwjgl.LwjglDisplay.runLoop(LwjglDisplay.java:141)
at com.jme3.system.lwjgl.LwjglAbstractDisplay.run(LwjglAbstractDisplay.java:203)
at java.lang.Thread.run(Unknown Source)


What is "format=1024x768x1"? Is it one bit? One byte? Can it be caused by the PSSM shadow filter or the bloom filter?

Vmat.update is the same as SimpleApplication.update; I copied the method 1-to-1 (to have my own 'SimpleApplication' class, without the FPS camera and GUI stats).

It's the depth buffer, that one is only black and white.

I suspect either a driver problem, or some technical feature being used that his graphics card does not support for some reason. But more likely the former.

I think there is a problem with filters and the depth format; I need to look more into this. I think all those “Framebuffer object format is unsupported by the video hardware.” errors may be due to a depth format mismatch between the textures and the FBO.

I think this happens when the graphics card doesn’t support textures with sizes other than powers of two?

Could be - but a GeForce 9800 sounds like it should support most of the things ever invented… Plus, in the case of the bloom filter (which I suppose might be the problem here; I will try to confirm tonight), it can be tricky to get a power-of-two texture size, as it is initialized to the screen size.

No, that’s not a size issue. No Filter uses a power-of-two texture size for its frame buffer; it’s usually the size of the output on the screen, i.e. 640x480.

No, the issue I was thinking about is that the frame buffer has its depth format set to Format.Depth but the associated texture is Format.Depth24.
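If that mismatch were the culprit, the fix would be to use one explicit format for both the depth attachment and the depth texture when building the FBO. A minimal sketch against the jME3 API of this era (the class and method names `FrameBuffer`, `setColorTexture`, `setDepthTexture` are real; `makeOffscreenBuffer` and its parameters are made up for illustration):

```java
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image;
import com.jme3.texture.Texture2D;

public class FboSketch {
    // Hypothetical helper: build an FBO whose depth attachment uses one
    // explicit format, instead of mixing Format.Depth (driver's choice)
    // on the renderbuffer with Format.Depth24 on a texture.
    public static FrameBuffer makeOffscreenBuffer(int width, int height) {
        FrameBuffer fb = new FrameBuffer(width, height, 1); // 1 = no multisampling
        // Color target, same format as in the failing log (RGBA8).
        fb.setColorTexture(new Texture2D(width, height, Image.Format.RGBA8));
        // Depth rendered into a texture with an explicit 24-bit format...
        fb.setDepthTexture(new Texture2D(width, height, Image.Format.Depth24));
        // ...rather than also attaching a renderbuffer with the generic format:
        // fb.setDepthBuffer(Image.Format.Depth); // lets the driver pick a depth size
        return fb;
    }
}
```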



Now that I’m writing it, I realize that the bloom filter does not use the depth buffer… so it can’t be that…



Do you have another filter in the stack? Like SSAO, water, fog, or light scattering?

I could swear I saw .Depth used in the bloom filter when I checked it this morning.



I have PSSM shadows added, with a 2048 size specified (but maybe it is creating another texture at the very end?)



I have seen something on the net about nvidia not liking Depth24 textures and that Depth24Stencil8 should be used instead - I don’t know if it is related.
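If the nvidia driver really does prefer a packed depth-stencil attachment, the swap would look something like this. This is an assumption-heavy sketch: `Image.Format.Depth24Stencil8` exists in later jME3 revisions, but an early-2011 build may only offer Depth/Depth16/Depth24.

```java
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image;
import com.jme3.texture.Texture2D;

public class DepthStencilSketch {
    // Assumption: this jME3 build exposes Image.Format.Depth24Stencil8.
    public static FrameBuffer makeBuffer(int w, int h) {
        FrameBuffer fb = new FrameBuffer(w, h, 1);
        fb.setColorTexture(new Texture2D(w, h, Image.Format.RGBA8));
        // Packed depth+stencil instead of a plain Depth24 texture, which
        // some nvidia drivers reportedly dislike as an FBO attachment.
        fb.setDepthTexture(new Texture2D(w, h, Image.Format.Depth24Stencil8));
        return fb;
    }
}
```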

Yeah, the .Depth you saw is there because when you create a frame buffer you have to specify the depth format of the depth buffer.

But when using bloom, depth is not rendered to a texture, so the texture is not even instantiated.



About Depth24 vs Depth24Stencil8 on nvidia, I’m gonna look at it… but it can’t be the issue with bloom.



Maybe what you should test is a version with PSSM only, a version with bloom only, and a version with none of these, to narrow down the issue.
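Since redeploying to the remote machine is painful, one way to do that with a single build is to gate each suspect effect behind a system property. A sketch meant for the body of `simpleInitApp` (so `assetManager` and `viewPort` are in scope); the `test.*` property names are invented for this example, while `PssmShadowRenderer`, `FilterPostProcessor` and `BloomFilter` are the real jME3 classes of this era:

```java
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.BloomFilter;
import com.jme3.shadow.PssmShadowRenderer;

// Toggle each effect from the command line, e.g.
//   java -Dtest.pssm=true -Dtest.bloom=false -jar game.jar
// so one deployed build can be retested with different combinations.
boolean usePssm  = Boolean.getBoolean("test.pssm");
boolean useBloom = Boolean.getBoolean("test.bloom");

if (usePssm) {
    // 2048 shadow map size, 3 splits, as in the setup described above.
    PssmShadowRenderer pssm = new PssmShadowRenderer(assetManager, 2048, 3);
    viewPort.addProcessor(pssm);
}
if (useBloom) {
    FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
    fpp.addFilter(new BloomFilter());
    viewPort.addProcessor(fpp);
}
```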

I had the same error with the Monkeys at the Beach test. Once I updated my drivers, it fixed the problem.

:stuck_out_tongue: could be worth a try indeed



@abies, could you ask your friend to update his drivers please? :smiley:

I will first try to pinpoint the issue (so we know what behaves wrongly with the old drivers) and then try everything again with the new drivers. Hopefully it will help.



Not sure if I will be able to do it today, as he might be quite busy - but I will keep you updated about the results.

We have done the test on his laptop, with an intel graphics card… Good news - no crashes. Bad news - my shaders are not working (an unmodified lighting.j3md seems to work at least basically, so probably my fault), PSSM shadows are quite broken, bloom is not working, basic shadows are very good, and the rest of the program works ok.



I hope we will be able to test nvidia pc again soon.

Well, intel laptop graphics cards are known to barely support ogl2.0, so maybe some shaders won’t work indeed.

You can make some shaders optional (PSSM or bloom can be disabled for low-end configurations).
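One way to make that decision automatically is to check the capabilities the renderer reports at startup (the same `Caps` list that appears in the log above). A sketch for `simpleInitApp`, assuming the standard jME3 fields `renderManager`, `assetManager` and `viewPort`; the particular caps to require are a judgment call, not a documented minimum for these filters:

```java
import java.util.EnumSet;
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.BloomFilter;
import com.jme3.renderer.Caps;

// Only enable heavy post-processing when the hardware reports support.
EnumSet<Caps> caps = renderManager.getRenderer().getCaps();
if (caps.contains(Caps.FrameBuffer) && caps.contains(Caps.GLSL110)) {
    FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
    fpp.addFilter(new BloomFilter());
    viewPort.addProcessor(fpp);
}
// else: low-end path (e.g. intel ogl2.0) - skip bloom/PSSM entirely.
```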



Btw are you gonna release your prototype? I’d gladly test it too, it looks really promising :smiley:

Ok, so the status is:

SOLVED :wink:



Despite his claims, it turned out that my friend had outdated graphics drivers (over 1 year old). After updating to the latest ones, the problem disappeared. It is surprising to me that 1-year-old drivers crash on perfectly ok opengl code that doesn't use any of the latest super-duper-geometry-shader things, but that's another issue. I was not able to test exactly what was causing the problem, as the updated drivers worked, but it was most probably PSSM shadows (as the version without shadows and bloom worked a week earlier).



On the intel front, I have managed to get most things working by simplifying the shaders a bit (it seems that anything using spherical coordinates is not working for some reason). Performance is not great and bloom doesn’t work, but all other things are visible.



There is still a small problem: on both nvidia and intel, the normal ‘lighting’ shader is behaving strangely, making darker areas of models completely black instead of just dark, while it works perfectly ok on my ATI - but this is a separate thing.



So you were right from the very start: it was a driver issue. Thanks for the help.

There is still a small problem: on both nvidia and intel, the normal ‘lighting’ shader is behaving strangely, making darker areas of models completely black instead of just dark, while it works perfectly ok on my ATI - but this is a separate thing.



That’s the diffuse stuff; I have this as well. On intel and nvidia it looks strange, on ati it does not.

For me it was solved by increasing the shininess material parameter.



super-duper-geometry-shader things

Nope, no geometry shaders yet, but with everything else jme3 is nearly at the cutting edge of available technology, so don’t underestimate its requirements.

Thanks - it was indeed the shininess parameter. I had forgotten to set it at all, so it was probably zero… With it set to a bigger value (1 or 26 depending on the model), there are no more strange dark spots on the models.
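For anyone hitting the same black-spot issue, the fix boils down to one material parameter on Lighting.j3md. A sketch (parameter names match the standard material definition; note that some early jME3 builds prefix them with "m_", e.g. "m_Shininess", and the extra color parameters shown here are optional):

```java
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;

Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setBoolean("UseMaterialColors", true);
mat.setColor("Diffuse",  ColorRGBA.White);
mat.setColor("Specular", ColorRGBA.White);
// With Shininess left at 0 the specular term degenerates (pow(NdotH, 0)),
// which some drivers evaluate differently and can render as solid black.
// Any positive value avoids it.
mat.setFloat("Shininess", 26f);
```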