IllegalStateException: Framebuffer has erronous attachment

Hi.

I’m testing Skullstone on various hardware configurations. Right now I have a Dell Precision 5520 on my desk. Skullstone runs fine when started from Eclipse; everything is OK, despite the fact that I had to force the use of NVidia graphics, but that is a case for another topic.

I have a launcher written in C#; it uses the DarkNotes library to run Java through JNI. It’s a very nice solution, and it works perfectly on my PC.
Running Skullstone with that launcher on the Dell Precision causes a problem I cannot solve. Things start normally, and the loading and menu screens work well.
Starting the scene with dungeons (my custom materials and rendering pipeline) crashes the game with the following error in the console:

Feb 23, 2020 7:03:40 PM com.jme3.renderer.lwjgl.LwjglRenderer setFrameBuffer
SEVERE: === jMonkeyEngine FBO State ===
FrameBuffer[format=1600x900x1, drawBuf=mrt]
Depth => TextureTarget[format=Depth24]
Color(0) => TextureTarget[format=RGBA8]
Color(1) => TextureTarget[format=RGBA8]
Color(2) => TextureTarget[format=RGBA8]
Color(3) => TextureTarget[format=RGB8]
=== OpenGL FBO State ===
Context doublebuffered? false
FBO ID: 7
Is proper? true
Is bound to draw? true
Is bound to read? true
Draw buffer: GL_COLOR_ATTACHMENT0
Read buffer: GL_COLOR_ATTACHMENT0
== Renderbuffer Depth ==
RB ID: -1
Is proper? false
Type: Texture
== Renderbuffer Color0 ==
RB ID: -1
Is proper? false
Type: Texture
== Renderbuffer Color1 ==
RB ID: -1
Is proper? false
Type: Texture
== Renderbuffer Color2 ==
RB ID: -1
Is proper? false
Type: Texture
== Renderbuffer Color3 ==
RB ID: -1
Is proper? false
Type: Texture
Feb 23, 2020 7:03:40 PM com.jme3.app.Application handleError
SEVERE: Uncaught exception thrown in Thread[LWJGL Renderer Thread,5,main]
java.lang.IllegalStateException: Framebuffer has erronous attachment.
    at com.jme3.renderer.lwjgl.LwjglRenderer.checkFrameBufferError(LwjglRenderer.java:1323)
    at com.jme3.renderer.lwjgl.LwjglRenderer.setFrameBuffer(LwjglRenderer.java:1602)
    at com.dungeongame.gameclient.scene3d.c.c.postQueue(Unknown Source)
    at com.dungeongame.gameclient.scene3d.l.a(Unknown Source)
    at com.dungeongame.gameclient.scene3d.l.render(Unknown Source)
    at com.jme3.app.SimpleApplication.update(SimpleApplication.java:252)
    at com.jme3.system.lwjgl.LwjglAbstractDisplay.runLoop(LwjglAbstractDisplay.java:151)
    at com.jme3.system.lwjgl.LwjglDisplay.runLoop(LwjglDisplay.java:185)
    at com.jme3.system.lwjgl.LwjglAbstractDisplay.run(LwjglAbstractDisplay.java:228)
    at java.lang.Thread.run(Unknown Source)

Do you have any idea?


Maybe it’s that Color(3) is RGB8 while the others are RGBA8?

And if ^ is not the case, try running with the GraphicsDebug AppSetting or with RenderDoc; maybe you’ll get to see the actual glError.

I don’t see such an option in JME 3.0 (I can’t upgrade to the current version).

So far I’ve found that if I build my launcher for .NET Framework 4.0 the problem occurs; building for 4.6+ solves it.

Using the hack described here, which is necessary to run on laptops with dynamically switched GPUs, the problem appears again.
There is nothing wrong with my framebuffer configuration; it works on every machine when the game is started directly from Java. Running the code through C# results in the error, but only on a few machines. I need to understand what that error means in order to solve it.

Since this seems related to forcing the GPU, I have to ask whether the GPU info was correct earlier in the output… or whether it’s once again selecting the wrong GPU. I’m sure you checked that, but I have to ask.

Are you sure? It wasn’t exposed, but setBoolean("GraphicsDebug", true); should work; I can’t tell you whether that came after 3.0, though.
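
For reference, a minimal sketch of where that flag would go. This is hedged: whether a 3.0 build honors the "GraphicsDebug" key is exactly the open question, and DebugLauncher below is just a placeholder application.

import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class DebugLauncher extends SimpleApplication {
    @Override
    public void simpleInitApp() {
        // scene setup goes here
    }

    public static void main(String[] args) {
        AppSettings settings = new AppSettings(true);
        // Undocumented flag: when the engine supports it, a debug GL
        // context is created and GL errors are reported as they occur.
        settings.setBoolean("GraphicsDebug", true);
        DebugLauncher app = new DebugLauncher();
        app.setSettings(settings);
        app.setShowSettings(false);
        app.start();
    }
}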

But one thing this sounds like to me is NVidia and environment variables like FORCE_OPTIMUS or whatever it was, i.e. the card isn’t in high-performance mode and/or the integrated GPU is being used.

I guess for some reason it detects Java like that, but you are really hiding Java by not even invoking java.exe and instead using JNI.
Care to elaborate? Do you fear players complaining about that?

No, it is not related in that way.
The first version of my launcher didn’t force the GPU; it was a regular C# program built against .NET 4.0. Everything was fine on PCs, but on my laptop the error appeared. I solved it by changing the target platform to 4.6 + AnyCPU with 32-bit preferred. I have absolutely no idea why that worked.

Then I started adding new things to my launcher, and now the problem is present again, even on .NET 4.6.

Well, I had to ask because of these facts:

  1. it seems to only happen on a machine where forcing the GPU is required.
  2. it’s the kind of error that would happen if there was something wrong with the GPU setup.

You are an experienced developer, so I’m sure you already confirmed that the GPU info that JME reports at startup was IDENTICAL in the working and non-working cases… but I had to ask. And I’m sure it would make some folks feel a little better to see it.

…and maybe that version info will jog someone’s memory.

Yes

Handling NVidia is pretty simple; all I need to do is call:

// Linking against nvapi.dll and calling into it marks the process as
// NVidia-aware, so Optimus selects the discrete GPU.
[DllImport("nvapi.dll")]
public static extern int NvAPI_Initialize();

This is the equivalent of:

extern "C" 
{
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

The problem is with ATI cards: the only way to inform the driver is to use dllexport, which is not possible in C# without a hack. That hack, however, is what causes the framebuffer problem: it decompiles the application, makes a change in the IL code, and recompiles it. The final exe throws the error.
I simply don’t understand the connection between the .NET target platform and an error at the OpenGL level.

So did you test it by using RGBA8 instead of RGB8 and the problem still persisted? Or do you just state the above without experimental evidence?

Short answer: your framebuffer configuration is not required to be supported by the OpenGL spec.

Why?
Here’s the spec:
https://www.khronos.org/opengl/wiki/Framebuffer_Object#Completeness_Rules

OpenGL allows implementations to state that they do not support some combination of image formats for the attached images; they do this by returning GL_FRAMEBUFFER_UNSUPPORTED when you attempt to use an unsupported format combination.

However, the OpenGL specification also requires that implementations support certain format combinations; if you use these, implementations are forbidden to return GL_FRAMEBUFFER_UNSUPPORTED. Implementations must allow any combination of color formats, so long as all of those color formats come from the required set of color formats.

So which are the required color formats?

https://www.khronos.org/opengl/wiki/Image_Format#Required_formats

These formats are required for both textures and renderbuffers. Any of the combinations presented in each row is a required format.

Base format     Data type             Bitdepth per component
RGBA, RG, RED   unsigned normalized   8, 16
RGBA, RG, RED   float                 16, 32
RGBA, RG, RED   signed integral       8, 16, 32
RGBA, RG, RED   unsigned integral     8, 16, 32

Format RGB is not within the required formats. The spec states:

Texture only

These formats must be supported for textures. They may be supported for renderbuffers, but the OpenGL specification does not require it.

Base format          Data type             Bitdepth per component
RGB                  unsigned normalized   8, 16
RGBA, RGB, RG, RED   signed normalized    8, 16
RGB                  float                 16, 32
RGB                  signed integral       8, 16, 32
RGB                  unsigned integral     8, 16, 32
RG, RED              unsigned normalized   Compressed with RGTC

Both RGB8 and RGBA8 renderbuffers work on this machine; I’m able to play the game when I run it from Eclipse/java.exe. As you can see in the log, even the RGBA8 renderbuffers are marked as invalid.
I’ll test it with RGBA, but as I said, the RGB renderbuffer works on this machine (and on every machine I’ve ever tested my game on before).

It lies somewhere in the memory management of the .NET platform, because the problem occurs only when I run the game via the launcher written in C#. Decreasing Java’s -Xmx parameter allowed me to run the game via C#. Strange. As far as I know, .NET can allocate up to 2 GB of memory, and the game uses half of that.

Well, in any case, I do not recommend using RGB8 as a framebuffer attachment, as it is not required to be supported by the OpenGL spec.
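
To illustrate, here’s a minimal jME 3 sketch of the same attachment layout as in the log above, with the Color(3) target widened from RGB8 to RGBA8 so every color attachment is in the required set. The layout is inferred from the log, and MrtSetup/createMrtBuffer are placeholder names, not Skullstone’s actual code:

import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture2D;

public class MrtSetup {
    // Builds a 4-target MRT framebuffer with a depth texture, matching
    // the FBO dump in the log but using only required color formats.
    public static FrameBuffer createMrtBuffer(int w, int h) {
        FrameBuffer fb = new FrameBuffer(w, h, 1);
        fb.setDepthTexture(new Texture2D(w, h, Format.Depth24));
        fb.setMultiTarget(true);
        fb.addColorTexture(new Texture2D(w, h, Format.RGBA8)); // Color(0)
        fb.addColorTexture(new Texture2D(w, h, Format.RGBA8)); // Color(1)
        fb.addColorTexture(new Texture2D(w, h, Format.RGBA8)); // Color(2)
        fb.addColorTexture(new Texture2D(w, h, Format.RGBA8)); // was RGB8
        return fb;
    }
}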

Is there a way to determine whether it is supported on a particular machine? I’m using RGB because it decreases the size of the framebuffer and, in theory, should improve performance.

EDIT: I mean something better than blindly forcing RGB8 and catching exceptions.

I am currently not aware of a way to ask the GL whether it supports RGB8 other than glCheckFramebufferStatus, which jME calls internally, throwing the above exception if it fails.
So I suppose catching the exception or manually calling that function yourself is one way to go.
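
A minimal probe sketch, assuming LWJGL 2 (which jME 3.0 ships with) and a current GL context on the render thread; FboProbe and probeRgb8Fbo are placeholder names:

import java.nio.ByteBuffer;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL30.*;

public class FboProbe {
    // Returns true if an FBO with a single RGB8 texture attachment is
    // reported complete by the driver. GL_FRAMEBUFFER_UNSUPPORTED is the
    // status the spec allows for unsupported format combinations.
    public static boolean probeRgb8Fbo(int width, int height) {
        int tex = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, tex);
        // Non-mipmapped filtering, so only level 0 needs to exist.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                GL_RGB, GL_UNSIGNED_BYTE, (ByteBuffer) null);
        int fbo = glGenFramebuffers();
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                GL_TEXTURE_2D, tex, 0);
        int status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        // Clean up the probe objects and restore the default framebuffer.
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glDeleteFramebuffers(fbo);
        glDeleteTextures(tex);
        return status == GL_FRAMEBUFFER_COMPLETE;
    }
}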

If you’re curious whether you are actually saving anything on a specific system, there is a way to check:
glGetInternalformativ (OpenGL 4.3, or the ARB_internalformat_query2 extension)

https://www.khronos.org/opengl/wiki/Image_Format#Image_format_queries

GL_INTERNALFORMAT_PREFERRED
As previously stated, OpenGL is allowed to replace your given image format with a different one. If you use GL_RGB8, OpenGL can promote it to GL_RGBA8 internally, with the implementation filling in a 1.0 for the alpha. By querying this, you can detect when such image format modification will happen. This will return a single value, which is the OpenGL image format enumerator that will be used internally by the implementation. If it’s the same as the one you passed, then no promotion is done.