Strange problem with new setup

I finally got a new PC and decided to revisit some of my old projects. However, for some reason, I can’t use any anti-aliasing: if I set it to 2x or more, any jMonkey game I try crashes before starting with the following log:
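For context, the anti-aliasing setting in question is the sample count requested through jME's `AppSettings` before the display is created. A minimal sketch of how that is typically set up (class and method names are from the stock jME3 API; the value 2 matches the report — this is a repro sketch, not a fix):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class AaRepro extends SimpleApplication {
    public static void main(String[] args) {
        AaRepro app = new AaRepro();
        AppSettings settings = new AppSettings(true);
        // Request 2x MSAA; with samples > 0, LWJGL needs GLX_ARB_multisample
        // on Linux, which is where the crash above is thrown.
        settings.setSamples(2);
        app.setSettings(settings);
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // An empty scene is enough to reproduce the display-creation failure.
    }
}
```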

SEVERE: Failed to create display
org.lwjgl.LWJGLException: Samples > 0 specified but there's no support for GLX_ARB_multisample
	at org.lwjgl.opengl.LinuxDisplayPeerInfo.initDefaultPeerInfo(Native Method)
	at org.lwjgl.opengl.LinuxDisplayPeerInfo.<init>(
	at org.lwjgl.opengl.LinuxDisplay.createPeerInfo(
	at org.lwjgl.opengl.DrawableGL.setPixelFormat(
	at org.lwjgl.opengl.Display.create(
	at org.lwjgl.opengl.Display.create(
	at com.jme3.system.lwjgl.LwjglDisplay.createContext(
	at com.jme3.system.lwjgl.LwjglAbstractDisplay.initInThread(

Dec 12, 2018 10:57:28 PM com.jme3.system.lwjgl.LwjglAbstractDisplay run

Which is weird, since the output of glxinfo is:

primusrun glxinfo | grep GLX
    GLX_ARB_context_flush_control, GLX_ARB_create_context, 
    GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile, 
    GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float, 
>>  GLX_ARB_multisample, GLX_EXT_buffer_age, 
    GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile, 
    GLX_EXT_framebuffer_sRGB, GLX_EXT_import_context, GLX_EXT_libglvnd, 
    GLX_EXT_stereo_tree, GLX_EXT_swap_control, GLX_EXT_swap_control_tear, 
    GLX_EXT_texture_from_pixmap, GLX_EXT_visual_info, GLX_EXT_visual_rating, 
    GLX_NV_copy_image, GLX_NV_delay_before_swap, GLX_NV_float_buffer, 
    GLX_NV_robustness_video_memory_purge, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, 
    GLX_SGI_swap_control, GLX_SGI_video_sync
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
GLX version: 1.4
GLX extensions:
    GLX_ARB_create_context, GLX_ARB_create_context_profile, 
94 GLX Visuals
215 GLXFBConfigs:

So my graphics card clearly supports GLX_ARB_multisample. And even though this is not a high-end GPU (Nvidia GeForce MX 110), my old computer could run 16x anti-aliasing without problems (only at excruciating FPS, of course :wink: )

However, if I switch to the integrated Intel GPU (it’s an Optimus system), then it launches, but anti-aliasing is clearly not being applied — yet the FPS still takes a hit (~216 FPS without anti-aliasing, ~131 FPS with 16x samples).

I find this very odd. Has anyone experienced similar issues?


From past experience, MX cards and laptops with integrated graphics can cause all manner of headaches. So I don’t have a fix, but I can at least confirm the strange behavior.

If it’s only AA you are after, shader-based AA is probably a better solution moving forward anyway.
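For anyone curious what that looks like in practice: a hedged sketch using jME3's stock `FXAAFilter` post-processing filter (names are from the jME3 API, in the jme3-effects module; treat this as a sketch, not a drop-in fix):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.FXAAFilter;

public class FxaaExample extends SimpleApplication {
    @Override
    public void simpleInitApp() {
        // FXAA runs as a fullscreen shader pass, so it needs no multisampled
        // framebuffer and sidesteps the GLX_ARB_multisample requirement entirely.
        FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
        fpp.addFilter(new FXAAFilter());
        viewPort.addProcessor(fpp);
    }
}
```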


Thanks. It’s just that I find it weird that I’ve only had this kind of problem with jME. Even though getting the goddamn drivers to work was really hard, once I had them running I tested a lot of games on this machine, and so far I’ve had no major problems like this one.

And I recall another problem I found with jMonkey related to fullscreen and refresh rate (which still happens), so I can’t help wondering if it’s a problem with jME itself.

Does the engine run on LWJGL3 yet, by the way? That could magically fix these issues, idk…

If you tell it to.



Ok I found this:

So all I need to do is swap the old “lwjgl” lib with this one? And is it possible to download it manually? I don’t know how to work with Gradle/Maven…

Nope, you’ll also need to add the native jars for it, and it would be a hassle to do that manually.
The simple way is to create a jME Gradle project template with


and install the Gradle plugin in your IDE, then open the project you just created.
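For reference, the dependency swap the template handles for you looks roughly like this (the `org.jmonkeyengine` coordinates match the published jME3 modules, but `$jmeVersion` is a placeholder — check the template for the exact version and module list):

```groovy
dependencies {
    implementation "org.jmonkeyengine:jme3-core:$jmeVersion"
    implementation "org.jmonkeyengine:jme3-desktop:$jmeVersion"
    // use jme3-lwjgl3 instead of jme3-lwjgl; it pulls in the
    // LWJGL3 native jars transitively, so no manual jar juggling
    implementation "org.jmonkeyengine:jme3-lwjgl3:$jmeVersion"
}
```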


Sure, you can browse and search for “jme3-lwjgl3” and download all the jars specified there.
