jMonkeyEngine Detects the Wrong OpenGL Version

Hello, it’s been a long time since I actively developed on jMonkeyEngine. Sadly, when I updated the jMonkeyEngine version I ran into a couple of issues.

I got around most of them, but there is an issue with the detection of the OpenGL version. I do not know why, but the wrapper calling the OpenGL API seems to be returning the wrong version to jMonkeyEngine on my PC: OpenGL 3.2.0, when I actually have 4.6.0.

I looked in debug mode inside GLRenderer.java and added code into GL to get the version the proper way, and I still get the wrong version through the normal path. You probably won’t like what I say, but using glGetString for this is old and shouldn’t be used anymore; glGetInteger(GL_MAJOR_VERSION) and glGetInteger(GL_MINOR_VERSION) should be used instead. No one needs GL 1.1 anymore, and that path should be behind an if for the people who still need it…
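
For reference, here is a minimal LWJGL 3 sketch of the two ways to query the version (it assumes a current OpenGL context already exists on the calling thread; the class name is just a placeholder):

// Compares the legacy string query with the GL 3.0+ integer query.
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL20.GL_SHADING_LANGUAGE_VERSION;
import static org.lwjgl.opengl.GL30.GL_MAJOR_VERSION;
import static org.lwjgl.opengl.GL30.GL_MINOR_VERSION;

public class VersionProbe {
    public static void printVersions() {
        // Legacy query: the driver may append extra info (profile, driver build) to this string.
        String legacy = glGetString(GL_VERSION);
        // Integer query: only valid on GL 3.0+ contexts, returns the context version as numbers.
        int major = glGetInteger(GL_MAJOR_VERSION);
        int minor = glGetInteger(GL_MINOR_VERSION);
        String glsl = glGetString(GL_SHADING_LANGUAGE_VERSION);
        System.out.println("GL_VERSION string:   " + legacy);
        System.out.println("GL_MAJOR/MINOR:      " + major + "." + minor);
        System.out.println("GLSL version string: " + glsl);
    }
}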

Anyway, back to the point: after changing to the newer query I still ended up getting the wrong version, 3.2.0, and I am 100% positive my hardware supports more. I went and forced the version by hand in the debugger and got it running as intended with version 460. (Tests were done with tessellation, since my shaders are all broken without version 400 or higher.)

But this doesn’t change the point: why is the bridge between jMonkeyEngine and OpenGL failing?
Then I saw that when the app starts it shows the proper GLSL support but the wrong API version…
[screenshot of the startup log]

The code should probably check the supported GLSL language version instead of the OpenGL version, since AMD seems to report the wrong number here. I know this because the line “OpenGL Version: 3.2.0 Core Profile Context (AMD driver version)” is the AMD driver reporting itself, and my AMD driver is 24.3.1.240216.

So now I am wondering if anyone has had this issue in the past, and whether we have a way to validate using the GLSL version instead of the OpenGL version? Or maybe both?

Thank you for the support
See image below for the OpenGL 4.6 support

Is that the AppSettings renderer version? You can set that yourself. And is there a difference if you use LWJGL 2 vs LWJGL 3?

Hi Tonihele,

I use the base render settings from the JME tests when running the core engine downloaded from GitHub, version 3.7.0. The issue is also present in 3.6.1.

I will try to run the code on LWJGL 2 to see if I can gather more info on the subject.

Edit


I changed to LWJGL instead of LWJGL3 in Maven. I do not know how to change to LWJGL 2 without changing the jMonkeyEngine version back.

The same issue.

So yes, it looks like a driver issue then. I can’t comment on the technical part you proposed.

Except this one seems to be in OpenGL 3, which I think is not quite old enough. On Linux, many drivers report OpenGL 2 as the max, and I would be surprised if it yielded another result. Of course one could try to catch this, but I’m unsure what it would help.

I have proof that AMD changed how they write the driver version for OpenGL.

This other game has the issue too and shows the exact same behavior.

I would highly suggest a change (which I can do as a PR myself) to read the OpenGL shading language version directly instead of the OpenGL driver version.

I do see the WebGL concern. It would need to be tested to see whether we can read it this way for WebGL too.


About this, I do agree it’s not required, since the problem is still there with the more up-to-date technique anyway.

I suggest these lines in GLRenderer.java should be swapped:

[screenshot of the current lines in GLRenderer.java]

with this line:
int oglVer = extractVersion(gl.glGetString(GL.GL_SHADING_LANGUAGE_VERSION));
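
For illustration, here is a rough standalone sketch of what that parsing would do with both strings; extractVersion below is only a hypothetical stand-in for the engine’s helper, not the actual jME code:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VersionParse {
    // Pulls the leading "major.minor" out of whatever the driver string contains.
    private static final Pattern VERSION = Pattern.compile("(\\d+)\\.(\\d+)");

    static int extractVersion(String s) {
        Matcher m = VERSION.matcher(s);
        if (!m.find()) {
            return -1; // unparsable string
        }
        int major = Integer.parseInt(m.group(1));
        int minor = Integer.parseInt(m.group(2));
        // Normalize "4.6" and "4.60" to the same value, 460.
        return major * 100 + (minor < 10 ? minor * 10 : minor);
    }

    public static void main(String[] args) {
        // Example strings like the ones discussed in this thread:
        System.out.println(extractVersion("3.2.0 Core Profile Context 24.3.1.240216")); // 320
        System.out.println(extractVersion("4.60"));                                     // GLSL string -> 460
    }
}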

My understanding is that the shading language version does not necessarily correlate 100% with the OpenGL version, and definitely not in an “OpenGL version = shader version” sort of way.

…and beyond some specific edge cases, it feels uncomfortable hard-coding that swap. After all, there must be a reason there are two different versions.

I know you’ve been part of this project for a long time; do you have any idea how this issue could be fixed the proper way? Other engines definitely have a way to get the proper OpenGL version. But to me the GLSL shading language version seems more appropriate for knowing the capabilities of the GPU. The only issue I see might be WebGL and phone support. For Linux I don’t think this would be an issue; I can always run some tests on my Linux laptop to see if the shading language version matches the OpenGL version.

I could always try to contact AMD about this and see why it was changed to a different value.

You don’t have to go back very far to see cases where the OpenGL version != the shader version. That’s pretty recent.

https://www.khronos.org/opengl/wiki/History_of_OpenGL
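
To make that concrete, here is the GL-to-GLSL mapping from that page as a small illustrative snippet (data only; it just shows that the numbers only line up from 3.3 onwards):

import java.util.LinkedHashMap;
import java.util.Map;

public class GlToGlsl {
    public static void main(String[] args) {
        // OpenGL version -> GLSL version shipped with it, per the Khronos history page.
        Map<String, String> glToGlsl = new LinkedHashMap<>();
        glToGlsl.put("2.0", "1.10");
        glToGlsl.put("2.1", "1.20");
        glToGlsl.put("3.0", "1.30");
        glToGlsl.put("3.1", "1.40");
        glToGlsl.put("3.2", "1.50");
        glToGlsl.put("3.3", "3.30"); // from here on the two numbers match
        glToGlsl.put("4.6", "4.60");
        glToGlsl.forEach((gl, glsl) -> System.out.println("GL " + gl + " -> GLSL " + glsl));
    }
}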

Please check this thread: https://hub.jmonkeyengine.org/t/shadow-maybe-failed-in-gles2-and-appsetting-failed-to-choose-gl-version/46759?u=massdrive

I would like to add more information on my system.

I am still unable to figure out how to fix this. I will change my graphics driver and see if it reports properly. But AMD Software reports the proper version, and so does GPU-Z; would that mean it’s LWJGL which reports the wrong information?

Edit: I tested downgrading two major driver versions, and each version says 3.2.0… So now I’m back to wondering where the error comes from.

Edit 2: After using an external tool, I can confirm again that the issue seems to come from the JME renderer / LWJGL function call.
I used a tool called OpenGL Extensions Viewer, and the version it detects is 4.6 (major, minor) and “4.6.0 Core Profile Context” (which is the glGetString(GL_VERSION)).

The issue appeared between version 3.5.2 and 3.6 and up. I did try to change jMonkeyEngine 3.7 to LWJGL version 3.2.3 to match the state of version 3.5.2, and the issue is still present. I do not know what causes it, and my current knowledge of the render pipeline is a bit too weak to figure out who changed what that might have caused an issue at that level. To me, the GL interface was a direct call through the LWJGL interface in C to OpenGL.


If you’re able to do a git bisect between 3.5.2 and 3.6, that should allow you to find the exact commit that introduced the change.

Found the error, or rather what causes the whole issue.

I don’t know what the point was. There is a piece of code that checks for OpenGL 2 and, in the past, upgraded it to whatever the user’s PC supported.

For anyone with this issue, here is the fix:
settings.setRenderer(AppSettings.LWJGL_OPENGL45);
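
In context, that line goes into your AppSettings before starting the app. Here is a minimal sketch (MyGame is just a placeholder class name):

import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class MyGame extends SimpleApplication {
    public static void main(String[] args) {
        AppSettings settings = new AppSettings(true);
        // Request an OpenGL 4.5 core context instead of the engine's 3.2 default.
        settings.setRenderer(AppSettings.LWJGL_OPENGL45);

        MyGame app = new MyGame();
        app.setSettings(settings);
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // Shaders that need GLSL 4.x features (e.g. tessellation) can now compile.
    }
}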

Here in the context creation we can see an override of the value if it was equal to OPENGL2.


@RiccardoBlb Should the change be reverted?

So, the issue was that the engine was starting with GL 3.2 instead of the highest supported API?

If that’s the case, it is expected, since as you noted the engine sets 3.2 as the default.

The reason for that is that, without this setting, the engine doesn’t automatically use the highest version (that would be extremely unreliable since OpenGL is not backward compatible); instead it used to default to the compatibility profile.

The problem with the compatibility profile is that it is often outdated or behaves poorly, so we decided to default to the minimum viable core profile that is widely supported and has all the useful features, which is OpenGL 3.2. If you need a higher version, you need to set it as you did.

So no, the change shouldn’t be reverted; maybe it should be documented somewhere.

The shading language version is set in the matdef.


You have it the opposite way: the engine used to start with OpenGL 2, and it was changed to OpenGL 3.2. But in the code it is hardcoded that if OpenGL 2 is found, it will try to request a better version by itself.

This is completely false, by the way. I spent two weeks looking for a breaking change which had no documentation about it.
My shaders rely on tessellation, which is only available in version 4.0+. Also, how come no one could tell what the issue was?

Probably because it’s a recent change and the most active members of the forum don’t use 4.0+-specific features.

I would like to suggest using OpenGL 4.6 by default. My old laptop from 2012 has support for it, and I believe it would just be better to have people who want to support old hardware set the version lower if needed.

I am a bit mad about the time wasted, since this could have been written in the log when you start your app: “(Default) OpenGL Version: 3.2”, together with a message about how to change the default version.
The only error I kept getting was a shader version not supported, and everything online pointed to my GPU drivers, which was super misleading. If this message could be changed too, that would be good, for example by adding: “This could be caused by the default OpenGL version being set to 3.2.”

Anyway, I’m happy I figured it out.

If you give me the go-ahead, I will add a proper message to the log to let the programmer know about the default API being set to 3.2. I would also suggest removing that OpenGL 2 backward-compatibility code if it’s useless anyway.
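
Something along these lines, purely as a sketch (the logger and the exact wording below are only a suggestion, not existing engine code):

import java.util.logging.Level;
import java.util.logging.Logger;

public class RendererDefaultNotice {
    private static final Logger logger = Logger.getLogger(RendererDefaultNotice.class.getName());

    // Would be called once at startup with the renderer string actually in use.
    static void logDefaultRenderer(String renderer) {
        logger.log(Level.INFO,
                "Renderer set to default {0} (OpenGL 3.2 core). "
                + "Call AppSettings.setRenderer(...) if your shaders need a higher version.",
                renderer);
    }
}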

The problem is that decisions like this have support weight. After we switched to requiring OGL2 or greater, we got quite a few posts from folks complaining and that was MUCH older than pre-4 is now.

The majority of users are fine with the current default. Some percentage of them will not be fine with moving it up. Indie devs can’t rely on new hardware for their players.

…also, the newer version is only needed if you use newer features. And in that case, if the docs/messages were better, you can set it yourself in those minority cases.

(I still get the occasional, but now rare, Mythruna user who can’t run it because their Intel graphics potato won’t do OGL2. The OGL 3.2 default is one of the reasons I’m scared to upgrade the JME I use these days… no time to track down all of the things like that which will cause me problems.)