Laptop with two graphics cards (Integrated + Nvidia)


Today I bought a new laptop to program my game on, and I ran into a strange issue which I resolved by playing with the Nvidia driver settings.

Laptops nowadays often come with two GPUs: the integrated Intel chipset and the dedicated card. By default, JME uses the Intel card, resulting in very poor performance. Fortunately, the Nvidia driver can override the card choice and force itself as the renderer.

But then I understood why I didn't notice this mistake: the AppSettings display doesn't let you choose which GPU to use. If I had seen "Intel" I would have changed the display to Nvidia.

Is it a deliberate limitation? Something I missed somewhere?

Since I have two drivers, how come the Nvidia driver can force which renderer to use?

This is most probably an issue with your driver/OS setup; lwjgl correctly reports to the OS that it wants "advanced" graphics support.

Okay, then I suppose it's a problem with the Nvidia driver. When set to "Autoselect" it selects the Intel card (which is stupid) o.o

Thank you Normen :slight_smile:


lwjgl correctly reports that it wants “advanced” graphics support to the OS.

Can you verify that this statement holds across all platforms, with a link? I'm not saying you're wrong, but the only thing I've been able to find is that lwjgl always selects device[0] when searching for display adapters, which generally defaults to the onboard GPU on devices that have the Intel crapset.


Device[1] would be another card; in dual setups both cards count as one and the same to the system, afaik. You activate the advanced mode by calling certain GL checks, maybe @Momoko_Fan can give more details.

There are several OpenGL contexts. I suppose each GPU exposes a context, in addition to the software-based one.

LWJGL is only designed to choose a hardware-based context, which in this case is any context that is designated as hardware accelerated. There's no way to know which hardware-accelerated context is better (there's no measure like "this one has X FLOPS and that one has Y"), and LWJGL doesn't expose the names of these contexts; it simply chooses one for you, and there's no way to control that.

From a quick look at the JOGL2 API, I am not seeing support for this either. @gouessej may correct me if I am wrong.

To properly specify which GPU is used, you need to go into the control panel of your driver and select the proper context there.

That’s what I was thinking…

Well, I guess the only thing to do is hope that your BIOS will allow you to switch your primary adapter from onboard to PCI/AGP: