On exiting the application, jME sets the screen resolution back to what it was before?

I am using this screen resolution: 1280x1024 +0+0 {viewportout=1280x800+0+0} (yes the bottom of the screen is being discarded)

Screen 0: minimum 8 x 8, current 1280 x 800, maximum 8192 x 8192
DVI-I-1 connected primary 1280x800+0+0 (normal left inverted right x axis y axis) 310mm x 230mm
   1280x1024      60.0* 

but on exiting the running application, it becomes as below, which forces me to restore my special settings…

Screen 0: minimum 8 x 8, current 1280 x 1024, maximum 8192 x 8192
DVI-I-1 connected primary 1280x800+0+0 (normal left inverted right x axis y axis) 310mm x 230mm panning 1280x1024+0+0
   1280x1024      60.0* 

Is there some way to prevent JME from “restoring” the previous screen resolution? Since I am not using the application in fullscreen, I understand it shouldn’t try to restore anything at all, right?

Not sure if I got your problem …
Did you try to set your own AppSettings?

Yes, I am using it.
Oh, I didn’t mention I am on Linux…

Okay, sorry, I don’t get your problem. But it is true that JME applies, on the next run, the last resolution you set on the settings screen, unless you are using your own AppSettings.

here is what I am using:

AppSettings as = new AppSettings(true);

If there was something like as.setRestoreResolutionOnExit(false), it would do the trick. I tried to debug deeply to see exactly where it restores the resolution, but I haven’t had time yet…

Basically, it is restoring the resolution in the wrong way. It should be 1280x1024 with viewportout=1280x800, but the viewportout becomes 1280x1024 while the input remains at 1280x800 (the input should at least become 1280x1024 so things don’t get messy), so the screen gets stretched and glitchy.

I think JME doesn’t know about viewportout at all… nor about the input (I think it is called viewportin).

I don’t know if it’s just me but I’m also having trouble understanding the issue.

What is ConsoleTestI.i() returning? I don’t understand how your screen size is 1280x1024 in the first place when you’re setting 1024x700 in AppSettings.

Are you thinking that the ViewPort size should persist between restarts? (Default ViewPort size is == to window size and I can’t imagine any changes to it persisting.) If that’s the case I would think that you’ll have to restore your custom ViewPort size on every start up.

My take on the issue so far:
- his Linux desktop runs at resolution X
- he runs his app in windowed mode
- he exits the app
- now his Linux desktop runs at resolution Y

…but that’s not JME doing that. At best it’s LWJGL. More likely it’s your OS/drivers resetting something when the OpenGL context is released.

At least that’s my guess having never seen this or anything like it before.


That occurred to me as a possibility at first too. And I’ve seen full screen apps do that kind of thing, but never a windowed application. If it’s the Linux desktop resolution changing, more info about the OS & drivers is needed.


I do all my development on Linux and never had this happen. Both LWJGL2 and LWJGL3 work as expected.

@pspeed I am quite puzzled now; maybe I should try to run some other windowed (not fullscreen) application, preferably with the same Java + LWJGL libs, to see if the same thing happens.

Ubuntu 14.04
NVidia Driver Version: 340.76

Libs in the classpath, in this order:


some system libs


I think I can also track all system libs with strace if needed, but that will take some time :slight_smile:

EDIT: I must add that if I kill the application, this problem won’t happen; maybe a deep debug session could let me track the exact spot where it happens. I just hope I have all the sources to let me do that :confused:

Ah, you’re green team. :stuck_out_tongue_closed_eyes:
Drivers: do you have nouveau (open) or proprietary drivers installed? Based on the version number, I guess that’s proprietary. Assuming this is an OS-level issue (I’m still not sure?), a switch (if you can) could make a difference so you can at least move forward without the annoyance. Is the driver reasonably up to date?

Checking whether the issue occurs with some simple examples would be helpful. If you could put up a small example that reproduces it (or a video/pic), I think people will understand the issue better.

Is your DESKTOP resolution changing at all? Or are you talking about a viewport’s size not persisting? That would be expected behavior…


“green eye team” lol (even though I am a vegetarian too hehe). My old NVidia had a better price than AMD, and as I remember it had better Linux drivers (at that time); I don’t know how things are nowadays :>

Yes, NVidia proprietary drivers, that is. It could be 340.96, but I am usually scared of upgrading that driver and also the Linux kernel, as I had serious trouble in the past; maybe a small HD partition to test on without messing up my workstation could be an option :slight_smile:

Yes, I think it is just the viewportin (EDIT) that is being modified (like it is being reset to the default) when the resolution is needlessly restored to 1280x1024.

The above is my old defective CRT monitor using 1280x1024 with viewportout 1280x800 properly. Below is my old notebook receiving the same image (1280x800) through NoMachine over a crossover ethernet cable at 15fps (lol).

Here it is after the problem happens; see how the screen image gets stretched? I think it is because the viewportin (EDIT) gets modified while the viewportout remains the same? Wow, this is getting confusing now…

PS: well, for now I made this loop, so it is not that annoying:

eval `secinit`
SECFUNCuniqueLock --waitbecomedaemon
while true; do
  # WARNING: running the X tools (xdotool, xrandr...) too often for some hours may make the machine suddenly reboot :(
  if ! xrandr | grep -q "current 1280 x 800"; then
    nvidia-settings -a CurrentMetaMode="1280x1024 +0+0 {viewportout=1280x800+0+0}"
  fi
  sleep 5
done

Okay, now the problem is clear… and that would be very annoying. Can you set the viewport size back to normal just before exiting? This happens when running in windowed mode? That’s a little surprising. Is your card very old? Maybe the driver is ancient and isn’t getting the updates it needs. (No idea what NVidia’s EOL/support schedule is like.)


I don’t see a way to do that before exiting; the best option seems to be fixing it after the problem happens. I just wait a few seconds and it is OK now.

Yes in windowed mode!

Very old GeForce GTS 250, 500MB.

I think it (the driver) will just fall into Oblivion (it would be better if it fell into Skyrim). The GPU card’s contacts are also failing from time to time, hanging the machine; better to buy a new GPU, but unfortunately they aren’t cheap here…

Anyway, I will see if I can debug and find the exact moment it happens, to know who is doing it; I guess it will be some native call.

I mean, can’t you just call setViewPort() again just before calling stop()? Maybe you’d need to set a flag to defer the call to stop() for a bit so things have a chance to react (not sure how gracefully stop() quits), but maybe that’s not even needed.

camera.setViewPort(0f, 1f, 0f, 1f);
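The defer-the-stop idea could be sketched roughly like this. This is a minimal, self-contained sketch with hypothetical names (DeferredStop, requestStop, update are made up, not jME API); in a real app the commented-out viewport restore and the caller's stop() would be the jME calls.

```java
// Sketch of "set a flag to defer stop()": restore the viewport on one
// frame, then run the real stop() a couple of frames later so the change
// has a chance to reach the driver before the context is destroyed.
public class DeferredStop {
    private boolean stopRequested = false;
    private int framesUntilStop = 0;

    /** Call this instead of stop(): restore the viewport here, then wait. */
    public void requestStop() {
        // camera.setViewPort(0f, 1f, 0f, 1f); // would go here in jME
        stopRequested = true;
        framesUntilStop = 2; // give the render thread a couple of frames
    }

    /** Call once per frame; returns true when the real stop() should run. */
    public boolean update() {
        return stopRequested && --framesUntilStop <= 0;
    }

    public static void main(String[] args) {
        DeferredStop ds = new DeferredStop();
        ds.requestStop();
        System.out.println(ds.update()); // frame 1: still waiting -> false
        System.out.println(ds.update()); // frame 2: time to stop  -> true
    }
}
```

In jME the update() check would live in simpleUpdate(), invoking the real stop() when it returns true.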

I’ve been assuming that you’re intentionally making the viewport smaller in your application at startup - is that incorrect?

No, the viewportout is a Linux X server (the windowing system) setting; here is what its configuration file section looks like:

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-1"
    Option         "metamodes" "1280x1024 +0+0 {viewportout=1280x800+0+0}"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

So basically, I am only setting the application to be windowed. The viewportout is a Linux X server thing.

Anyway, I tested it but it didn’t work:

@Override
public void stop() {
	getCamera().setViewPort(0f, 1f, 0f, 1f);
	super.stop();
}

By the way, I just tested OpenGL outside Java, with glmark2; it was windowed and the problem didn’t happen. It must have something to do with LWJGL or some Java thing…

Oh, different viewPort… :frowning: I’m seeing a newer GPU in your future (colored red?) :slight_smile:


Confirmed: it is an LWJGL “feature” (I grabbed the correct “lwjgl-sources” here; to see its version, just look at your application’s console output).

XRandR.restoreConfiguration() line: 221	
LinuxDisplay$2.run() line: 645	
AccessController.doPrivileged(PrivilegedAction<T>) line: not available [native method]	
LinuxDisplay.resetDisplayMode() line: 643	
Display.reset() line: 1105	
Display.access$000() line: 65	
Display$5.destroy() line: 838	
Display.destroy() line: 1095	
LwjglDisplay.destroyContext() line: 158	
LwjglDisplay(LwjglAbstractDisplay).deinitInThread() line: 196	
LwjglDisplay(LwjglAbstractDisplay).run() line: 237	
Thread.run() line: 745	

Basically, the configuration passed to xrandr on Linux does not consider viewportin/viewportout: “[DVI-I-1 1280x1024 @ 0x0 with 60Hz]”

It is stored even before JME can do anything:

XRandR.saveConfiguration() line: 212	
LinuxDisplay$3.run() line: 745	
LinuxDisplay$3.run() line: 743	
AccessController.doPrivileged(PrivilegedAction<T>) line: not available [native method] [local variables unavailable]	
LinuxDisplay.init() line: 743	
Display.<clinit>() line: 138	
LwjglDisplay.createContext(AppSettings) line: 112	
LwjglDisplay(LwjglAbstractDisplay).initInThread() line: 113	
LwjglDisplay(LwjglAbstractDisplay).run() line: 211	
Thread.run() line: 745	

The problem is that whatever is saved will be restored, and there is no check (by LWJGL) for windowed mode before saving it…

A bad temporary workaround/“fix” was to give the user an initialization command option that fixes it by hacking the private static field XRandR.savedConfiguration to null via reflection; see here.
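For illustration, nulling a private static field via reflection looks roughly like this. XRandRStandIn is a made-up stand-in class so the sketch runs without LWJGL; the real target would be LWJGL’s org.lwjgl.opengl.XRandR and its savedConfiguration field, which may need extra access tricks on newer JVMs.

```java
import java.lang.reflect.Field;

// Stand-in for LWJGL's XRandR class, which keeps the screen configuration
// captured at startup in a private static field.
class XRandRStandIn {
    private static Object savedConfiguration = new Object();
    static Object peek() { return savedConfiguration; }
}

public class ClearSavedConfig {
    public static void main(String[] args) throws Exception {
        // Null the private static field via reflection so a later
        // restoreConfiguration() would have nothing to restore.
        Field f = XRandRStandIn.class.getDeclaredField("savedConfiguration");
        f.setAccessible(true);
        f.set(null, null); // first argument is null for a static field

        System.out.println(XRandRStandIn.peek() == null); // prints "true"
    }
}
```

With the saved configuration nulled, the restore step at shutdown becomes a no-op, which is exactly what a windowed app wants here.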

A good fix would be to fork LWJGL, or to find a newer JME-compatible version; if someone finds one, I would like to know :).

Offtopic about GPUs

@louhy maybe red :); it depends on how many people are complaining, the same research I did in the old days hehe.


Try using the LWJGL3 backend and see if it fixes anything. It’s based on GLFW, which should be a lot better with things like this.


Nvidia does make great hardware, but like Linus was saying in that video, I don’t appreciate their closed nature and apparent hostility towards user freedom. AMD seems far more willing to work directly with their customer base, and nowadays is generally a far more cost-effective deal, so they say. I don’t see anything like this coming any time soon, but if AMD were to drop out of that market entirely, it’d be terrible… so I try to support the underdog when I can. I haven’t had any need to go back to NV for years now. So no, I don’t own AMD stock or anything. :smile: