jMonkeyEngine Detects the Wrong OpenGL Version

I tried to steer you, but did it a bit poorly. Sorry about that. It is always hard to know how experienced someone is with jME and whether they will pick these things up. I should make no assumptions and write proper suggestions instead of the typical one-liners.

And the focus was on the driver reporting the wrong version, too. So there are a number of things going on.

where is this code?

The shading language version is configured in the matdef, but it can be anything up to the latest supported by the selected core profile.
E.g. if you select a 4.2 core profile but the shaders use GLSL150, you will not have access to tessellation, so you need to change both.
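For illustration, here is a minimal, hypothetical matdef fragment (the material name and shader paths are made up, and the exact stage keywords may differ slightly between jME versions, so check an existing tessellation example in yours). The point is that each shader stage declares its own GLSL version, and the tessellation stages only exist from GLSL400 / GL 4.0 upward:

    MaterialDef TessellatedTerrain {
        MaterialParameters {
            Color Diffuse
        }
        Technique {
            // Each stage names the lowest GLSL version it needs.
            VertexShader GLSL400:                 Shaders/Terrain.vert
            FragmentShader GLSL400:               Shaders/Terrain.frag
            TessellationControlShader GLSL400:    Shaders/Terrain.tsctrl
            TessellationEvaluationShader GLSL400: Shaders/Terrain.tseval
        }
    }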

I am a bit mad about the time I wasted, since this could have been written in the log when you start your app.

I am sorry you wasted 2 weeks on this. We had similar but worse issues when we were using the compatibility profile: people with seemingly good drivers were unable to get some things working as expected because their compat profile was broken. So I think the right approach here would be to document setting the desired GL version on the getting-started wiki page.

You know, I was wrong not to try forcing the version. The reason I was so stubborn about it is simple: it was never necessary for over 10 years. I looked into the forum before asking and couldn't find any post but this one when doing a Google search: Renderer and GPU Capabilities

I stayed on version 3.5.2 for a long time because of all the issues I had with 3.6.1.
If you don't use the SDK, there seems to be an issue with jme3-lwjgl: it keeps asking for a missing dependency. It took me a while to figure out you had to import Lwjgl3.

Failed to execute goal on project XXX The following artifacts could not be resolved: org.jmonkeyengine:lwjgl-platform:jar:2.9.5 (absent): Could not find artifact org.jmonkeyengine:lwjgl-platform:jar:2.9.5 in mvnrepository (https://repo1.maven.org/maven2/) -> [Help 1]

I have JCenter in my repositories and it still doesn’t find it.

Line 210 of LwjglContext in jme3-lwjgl3


But this code doesn't look like it creates a compatibility profile. It looks like it gets the actual capabilities of the machine when no argument is given.
I suspect that the Capabilities object coming out of here without an argument should give a list of booleans with every supported version on the current machine. So this is where the AppSettings should get the supported OpenGL version by default. (GLCapabilities.java)
https://www.cs.unh.edu/~cs770/lwjgl-javadoc/lwjgl-opengl/org/lwjgl/opengl/GL.html
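For reference, a minimal sketch of what reading those booleans looks like with plain LWJGL 3 (it has to run on the thread that owns the GL context, e.g. jME's render thread; the class and method names here are mine, not engine API):

    import org.lwjgl.opengl.GL;
    import org.lwjgl.opengl.GLCapabilities;

    public final class GlVersionProbe {
        // Call from the render thread, after the context has been created.
        public static void logSupportedVersions() {
            GLCapabilities caps = GL.getCapabilities(); // capabilities of the current context
            System.out.println("OpenGL 3.2 core: " + caps.OpenGL32);
            System.out.println("OpenGL 4.0 core: " + caps.OpenGL40);
            System.out.println("OpenGL 4.6 core: " + caps.OpenGL46);
            System.out.println("ARB_tessellation_shader: " + caps.GL_ARB_tessellation_shader);
        }
    }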

When my game comes out (in 20 years) I wouldn't want the user to be forced into a context, but to get the context their machine can handle. Then I can show the proper options in the options menu; for example, tessellation should be a flag option if the capability is present. Right now, the engine forces one static version in the AppSettings, or in Main when you create the settings. So unless the user can input an argument (which isn't user friendly), my game is stuck with whatever API support it was set to.
Before, it used to be: get the highest supported version for my machine.

This causes a big regression in flexibility.

I really believe the change to force OpenGL 3.2 is bad. Also, I really want to understand what you mean by
"compatibility profile". To me, what you did is force everyone with a better OpenGL into a "compatibility profile" instead of having the natively supported profile for each computer.

I read a bit of the change comments on GitHub and all I see is a macOS bug. If this is a bug for them, then wouldn't it be better to tell Apple to fix the bug? Also, if you run a desktop with macOS (with incompatible parts), that might be the issue for the compatibility profile problems you see. Just like on Linux, not all parts have support at the moment. Debian 12 (they removed my GPU support though) is doing great for me, but when I was using Debian 8 it was a completely different story.

Also, I am talking about future API support by the engine. It should never force a context on anyone at any moment. You never know what Nvidia or AMD will pull out next.

I won't argue anymore on this thread, but this is what I believe. The code might not need to be rolled back, but I think it shouldn't have any default OpenGL version; it should have a minimum-requirement OpenGL value instead, with a way to identify the highest API version available from the capabilities.

This code doesn't set the GL version; it just flags which features are supported by the combination of version + available extensions and some other platform-specific checks.

Generally speaking, this is a bad idea; OpenGL is not backward compatible. You should always develop toward a minimum viable version, e.g. 3.2, and if your game needs some capabilities of higher versions, you pick the lowest required version (e.g. 4.0 for tessellation) and toggle it if the user's hardware supports it.
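A rough sketch of that toggle, assuming jME's renderer capability enum (the exact constant name for tessellation may differ between versions, so verify it against com.jme3.renderer.Caps in yours):

    import com.jme3.app.SimpleApplication;
    import com.jme3.renderer.Caps;

    public class MyGame extends SimpleApplication {

        private boolean tessellationAvailable;

        @Override
        public void simpleInitApp() {
            // Ask the renderer what the created context actually supports,
            // then expose the feature in the options menu only if it is there.
            // Constant name assumed from the 3.x Caps enum; check your version.
            tessellationAvailable = getRenderer().getCaps().contains(Caps.TesselationShaders);
        }
    }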

So unless the user can input an argument (which isn't user friendly)

You can restart the game with a different GL version without requiring the user to input it manually.
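Something along these lines, assuming AppSettings.setRenderer() plus Application.restart(), and that your jME version has the LWJGL_OPENGL40 constant (the class and method names are just illustrative):

    import com.jme3.app.SimpleApplication;
    import com.jme3.system.AppSettings;

    public class RestartExample extends SimpleApplication {

        // Illustrative only: recreate the context with a higher GL version,
        // e.g. after the user enables a feature that needs it.
        private void switchToGl40() {
            AppSettings s = new AppSettings(true);
            s.copyFrom(settings);                      // keep resolution, vsync, etc.
            s.setRenderer(AppSettings.LWJGL_OPENGL40); // request a 4.0 core context
            setSettings(s);
            restart();                                 // restarts the context, not the whole JVM
        }

        @Override
        public void simpleInitApp() { }
    }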

In OpenGL there is the concept of the "core profile", which is the compliant profile that closely follows the specification, and the "compatibility profile", which is an implementation-specific mashup of old and new features. Several drivers don't even bother updating the compatibility profile.
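To make the distinction concrete, this is roughly what a windowing layer does when it requests a core context (a simplified sketch using raw GLFW through LWJGL; jME's lwjgl3 backend wraps this for you, so you never call it directly):

    import static org.lwjgl.glfw.GLFW.*;

    public final class ContextHints {
        // Simplified: request a 3.2 *core* context instead of whatever the driver prefers.
        static void requestCoreProfile32() {
            glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
            glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
            glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
            glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE); // needed on macOS for core contexts
            // Leaving these hints at their defaults (GLFW_OPENGL_ANY_PROFILE, no version)
            // hands you whatever legacy/compatibility context the driver decides to give you.
        }
    }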

It should never force a context on anyone at any moment. You never know what Nvidia or AMD will pull out next.

Forcing a minimum core profile is exactly what fixes this. Driver developers must implement the spec 1:1 to support a core profile; not so much for the compatibility profile.


I am sorry, but my point earlier was about how AAA games support all capabilities no matter the version you have.

If you go into any modern game you will see that if your GPU doesn't support a feature, it won't show in the game options. That is because the game knows your API capabilities and already has the proper API version running.

Going the way you did just causes more issues: any game like mine that had an if (isTessellationCapable) check becomes worthless unless an engine restart is done, which again is not how it should be. It used to work before, and now my code is all broken because of that change. I am almost better off just branching jMonkey and using the previous code.

You got this line wrong. This is how jMonkey was supporting my version 4.6 by default: when it found OpenGL 2 it would try to request a better version. Try it out and you will see. Instead of forcing the context you force now, it would request a currently available context; in my case a 4.6 core profile.

Ok, sorry, I didn't notice you were referring to the LWJGL API. In that case, this line is not magically setting the right OpenGL version. It is starting the context in forward-compatible mode (aka core profile) for OpenGL > 2, instead of the compatibility profile.

You can still do this by manually setting LWJGL_OPENGL2 in the app settings, if you think this is the right way to go, but expect a clusterfuck of non-reproducible, platform-specific issues.
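For completeness, a minimal sketch of that setting (Main and MyGame are hypothetical names; LWJGL_OPENGL2 brings back the old auto-detect/compatibility behaviour):

    import com.jme3.system.AppSettings;

    public class Main {
        public static void main(String[] args) {
            MyGame app = new MyGame();                       // your SimpleApplication subclass
            AppSettings settings = new AppSettings(true);
            settings.setRenderer(AppSettings.LWJGL_OPENGL2); // old auto-detected compatibility context
            app.setSettings(settings);
            app.start();
        }
    }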

I am sorry, but my point earlier was about how AAA games support all capabilities no matter the version you have.

I think only Minecraft did that, and they dropped it. Most AAA titles pick between a set of hardcoded versions they need. You can check the hardware support before starting the application, and you can also restart the context; most AAA games restart the context for most graphics changes.


This is true about using the LWJGL_OPENGL2 setting, which does in fact make it work as it used to.


So no, the change shouldn't be reverted; maybe documented somewhere.

I spent 2 weeks looking for a breaking change which had no documentation about it.

Both the Forum announcement and the blog announcement direct people to the official release notes, where the change tops the list of “potential breaking changes”.

Where else should such changes be documented?

This I find a bit weird. You should not need to import pure LWJGL, whether it is 2 or 3. That should all come with org.jmonkeyengine:jme3-lwjgl or org.jmonkeyengine:jme3-lwjgl3. And I’m talking about Gradle/Maven here.
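For example, with Gradle something like this should be enough (the version string is just whatever 3.6.x release you target; this is a sketch, not copied from the official docs):

    dependencies {
        implementation 'org.jmonkeyengine:jme3-core:3.6.1-stable'
        implementation 'org.jmonkeyengine:jme3-lwjgl3:3.6.1-stable' // pulls in the matching LWJGL 3 natives
    }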

The SDK you referred to… probably you mean an Ant project, where all the libraries come from the SDK.

I would suggest one in the code log and one in the example section (the wiki page). The in-engine log should state the enforced lower version, in my opinion.

Here, it is not true that the issue is my video hardware. It should say that the OpenGL version is set to 3.2 by default.


And here, when you start the engine:
[screenshot of the engine startup log]
There should be a way to know it was enforced by the engine's default value.

If I had seen something in those two places, it would have jumped out at me right away.

I think the error message is the most important, since it's not the proper error message for the actual reason.
It should state something like this:

OpenGL API set to version 3.2; Tessellation (feature name) requires version 4.0 or higher.
