OSX OpenGL version detection

Hi all,

I just installed a hackintosh so I can try my game on OSX and also build it for iOS, but I’ve found some weird behaviour: by default the rendering context is detected and initialized as GL 2.1, even though my graphics card supports 4.1. In fact, I can play all my Steam games flawlessly on that computer.

I’ve searched the forum for information about this and found a few related threads.

So, using settings.setRenderer(AppSettings.LWJGL_OPENGL3) makes it properly detect GL 4.1.
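For reference, this is roughly how I’m setting it up before start() (a minimal sketch; the class name and the empty simpleInitApp() are just placeholders):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class Main extends SimpleApplication {

    public static void main(String[] args) {
        Main app = new Main();
        AppSettings settings = new AppSettings(true);
        // Request the LWJGL OpenGL 3 renderer; on OSX this is what gives me
        // the 4.1 context instead of the default 2.1 one.
        settings.setRenderer(AppSettings.LWJGL_OPENGL3);
        app.setSettings(settings);
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // scene setup goes here
    }
}
```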

At least this way I can run jME3 apps on OSX, but I assume this would simply fail when trying to run the game on a GL2.x-only card (really old hardware for sure…). So, here are my questions…

If jME won’t fall back to GL2 when using LWJGL_OPENGL3, is there any way in jME3 to detect the GL version before simpleapp.start() is run, so I can set either LWJGL_OPENGL2 or LWJGL_OPENGL3 depending on the hardware? If that’s the case, how? (I assume the GLRenderer caps are only filled after start.)
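What I had in mind was something along these lines — a rough, untested sketch that creates a throwaway offscreen context with plain LWJGL 2 just to read GL_VERSION (the Pbuffer approach and the version parsing are only my assumption of how it could work; and since a default compat context on OSX still reports 2.1, this would mostly help to rule out genuinely GL2-only hardware):

```java
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.Pbuffer;
import org.lwjgl.opengl.PixelFormat;

import com.jme3.system.AppSettings;

public class RendererProbe {

    /** Creates a throwaway 1x1 offscreen context, reads GL_VERSION and destroys it. */
    public static String detectGlVersion() {
        Pbuffer pb = null;
        try {
            pb = new Pbuffer(1, 1, new PixelFormat(), null);
            pb.makeCurrent();
            return GL11.glGetString(GL11.GL_VERSION); // e.g. "2.1 ..." or "4.1 ..."
        } catch (LWJGLException e) {
            return null; // no GL context available at all
        } finally {
            if (pb != null) {
                pb.destroy();
            }
        }
    }

    /** Picks the renderer string to pass to AppSettings.setRenderer(). */
    public static String pickRenderer() {
        String version = detectGlVersion();
        if (version != null && Character.getNumericValue(version.charAt(0)) >= 3) {
            return AppSettings.LWJGL_OPENGL3;
        }
        return AppSettings.LWJGL_OPENGL2;
    }
}
```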

What happened to AppSettings.LWJGL_OPENGL_ANY? It’s in the javadoc for AppSettings.setRenderer() and there are also some references to it on the forum, but it seems it was removed…

Also, I’ve tried all the other LWJGL_OPENGL3* and LWJGL_OPENGL4* constants and all of them just except. Is that normal?

Thanks

Try git blame on AppSettings to find it etc.

What do you mean, they except?

Note: What you’re experiencing is an Apple-only thing. I think you can only run either an OpenGL 2.0 compat context or an OpenGL 3.1 core profile (and I guess higher core profiles, but we haven’t worked on that yet).

That being said, setting OPENGL3 shouldn’t even allow GL 4.1, I guess, but that’s definitely an Apple thing to check out; it happens for regular OpenGL applications as well.
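If you want to see what the driver does outside of jME, a plain LWJGL 2 test would look roughly like this (untested sketch; the window size and the forward-compatible flag are just example values) — request a 3.2 core profile and print what you actually get back:

```java
import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.ContextAttribs;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.PixelFormat;

public class CoreProfileCheck {

    public static void main(String[] args) throws LWJGLException {
        // Ask for a 3.2 core, forward-compatible context. On OSX the driver
        // may hand back a higher core version, which would match the 4.1 you
        // are seeing with the LWJGL_OPENGL3 setting.
        ContextAttribs attribs = new ContextAttribs(3, 2)
                .withProfileCore(true)
                .withForwardCompatible(true);

        Display.setDisplayMode(new DisplayMode(640, 480));
        Display.create(new PixelFormat(), attribs);

        System.out.println("GL_VERSION:  " + GL11.glGetString(GL11.GL_VERSION));
        System.out.println("GL_RENDERER: " + GL11.glGetString(GL11.GL_RENDERER));

        Display.destroy();
    }
}
```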

OK, I’ll have a look at AppSettings changes

I mean that using any of the other LWJGL_OPENGL* constants generates an exception when running, although I think most of them should work :confused:

I know Apple is weird compared to the others, and that’s why I wanted to see how my game runs on it :stuck_out_tongue: