I am using this screen resolution: 1280x1024 +0+0 {viewportout=1280x800+0+0} (yes the bottom of the screen is being discarded)
Screen 0: minimum 8 x 8, current 1280 x 800, maximum 8192 x 8192
DVI-I-1 connected primary 1280x800+0+0 (normal left inverted right x axis y axis) 310mm x 230mm
1280x1024 60.0*
but on exiting the running application, it becomes as below, which forces me to restore my special settings…
Screen 0: minimum 8 x 8, current 1280 x 1024, maximum 8192 x 8192
DVI-I-1 connected primary 1280x800+0+0 (normal left inverted right x axis y axis) 310mm x 230mm panning 1280x1024+0+0
1280x1024 60.0*
is there some way to prevent JME from “restoring” the previous screen resolution? Mostly because I am not using the application in fullscreen, so I understand it shouldn’t try to restore anything at all, right?
Okay, sorry, I don’t get your problem. But it is true that JME applies the last resolution you chose on the settings screen the next time, unless you are using your own AppSettings.
AppSettings as = new AppSettings(true);
as.setResizable(true);
as.setWidth(1024);
as.setHeight(700);
ConsoleTestI.i().setSettings(as);
ConsoleTestI.i().setShowSettings(false);
If there were something like as.setRestoreResolutionOnExit(false), it would do the trick. I tried to deep-debug to see exactly where it restores the resolution, but I haven’t had time yet…
Basically, it is restoring the resolution in the wrong way. It should be 1280x1024 with viewportout=1280x800, but the viewportout is becoming 1280x1024 while the input remains at 1280x800 (the input should at least become 1280x1024 so things don’t get messy), so the screen is getting stretched and glitchy.
I think JME doesn’t know about viewportout at all… nor about the input (I think it is called viewportin).
I don’t know if it’s just me but I’m also having trouble understanding the issue.
What is ConsoleTestI.i() returning? I don’t understand how your screen size is 1280x1024 in the first place when you’re setting 1024x700 in AppSettings.
Are you thinking that the ViewPort size should persist between restarts? (The default ViewPort size equals the window size, and I can’t imagine any changes to it persisting.) If that’s the case, I would think that you’ll have to restore your custom ViewPort size on every start-up.
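If restoring it on every start-up is the route, note that JME’s Camera.setViewPort(left, right, bottom, top) takes window fractions, not pixels, so you’d recompute the fractions from the pixel sizes. A minimal sketch of that math, using the 1280x1024 / 1280x800 numbers from this thread (the `usedFraction` helper is made up for illustration, not part of JME):

```java
public class ViewportFraction {

    // Fraction of the full mode height actually covered by the visible output.
    // Hypothetical helper for illustration only, not a JME API.
    static float usedFraction(int usedPx, int modePx) {
        return (float) usedPx / modePx;
    }

    public static void main(String[] args) {
        // A viewport covering the top 800 of 1024 rows would be restored with
        // something like: cam.setViewPort(0f, 1f, 1f - f, 1f);
        float f = usedFraction(800, 1024);
        System.out.println(f); // prints 0.78125
    }
}
```

You’d run that restore in simpleInitApp() (or after any resize), since nothing in AppSettings persists a custom ViewPort for you.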
My take on the issue so far:
- his Linux desktop runs with resolution X
- he runs his app in windowed mode
- he exits the app
- now his Linux desktop runs in resolution Y
…but that’s not JME doing that. At best it’s LWJGL. More likely it’s your OS/drivers resetting something when the OpenGL context is released.
At least that’s my guess having never seen this or anything like it before.
That occurred to me as a possibility at first too. And I’ve seen full screen apps do that kind of thing, but never a windowed application. If it’s the Linux desktop resolution changing, more info about the OS & drivers is needed.
@pspeed I am quite puzzled now; maybe I should try to run some other windowed (not full-screen) application, preferably on the same Java + LWJGL libs, to see if the same thing happens.
I think I can track all the system libs with strace if needed too, but that will take some time.
EDIT: I must add that if I kill the application, this problem won’t happen. Maybe a deep debug could let me track the exact spot where it happens; I just expect to have all the sources to let me do that.
Ah, you’re green team.
Drivers - do you have nouveau (open) or proprietary drivers installed? I guess based on the version # that’s proprietary. Assuming this is an OS level issue (I’m still not sure?) a switch (if you can) could make a difference so you can at least move forward without the annoyance. Driver is reasonably up to date?
Checking to see if the issue occurs with some simple examples to see if it reproduces would be helpful. If you could put up a small example which reproduces it (or a video/pic) I think people will understand the issue better.
Is your DESKTOP resolution changing at all? Or are you talking about a viewport’s size not persisting? That would be expected behavior…
@louhy
“green eye team” lol (despite being a vegetarian too hehe). My old nvidia had a better price than amd, and as I remember it had better Linux drivers (at that time); I don’t know how things are nowadays :>
Yes, NVidia proprietary drivers, that is. It could be 340.96, but I am usually scared of upgrading that driver and also the Linux kernel, as I had serious troubles in the past; maybe a small HD partition to test without messing up my workstation could be an option.
Yes, I think it is just the viewport-in (EDIT) that is being modified (as if it were being reset to default) when the resolution is needlessly restored to 1280x1024.
The above is my old defective CRT monitor using 1280x1024 with viewportout 1280x800 properly. Below is my old notebook receiving the same image (1280x800) through NoMachine over a crossover ethernet cable at 15fps (lol).
Here is what happens after the problem occurs; see how the screen image gets stretched? I think it is because the viewport-in (EDIT) gets modified while the viewport-out remains the same? Wow, that’s getting confusing now…
PS.: well, for now I made this loop so it is not that annoying:
#!/bin/bash
eval `secinit`
SECFUNCuniqueLock --waitbecomedaemon
while true;do
# WARNING: running the X tools (xdotool, xrandr...) too often for some hours may make the machine suddenly reboot :(
if ! xrandr | grep -q "current 1280 x 800";then
nvidia-settings -a CurrentMetaMode="1280x1024 +0+0 {viewportout=1280x800+0+0}";
fi;
sleep 5;
done
Okay now the problem is clear… and that would be very annoying. Can you set the viewport size back to normal just before exiting? This happens when running in windowed mode? That’s a little surprising. Is your card very old? Maybe the driver is ancient and isn’t getting updates it needs. (No idea what NVidia’s EOL/support schedule is like.)
@louhy
I don’t see a way to do that before exiting; the best option seems to be fixing it after the problem happens. I just wait a few seconds and it is OK now.
Yes in windowed mode!
Very old GeForce GTS 250, 500 MB.
I think it (the driver) will just fall into oblivion (it would be better if it fell into Skyrim). The GPU card’s contacts are failing from time to time too, hanging the machine; better to buy a new GPU, but unfortunately they aren’t cheap here…
Anyway, I will see if I can debug and find the exact moment it happens, to know who is doing it; I guess it will be some native call.
I mean you can’t just call setViewPort() again just before calling stop()? Maybe you’d need to set a flag to defer the call to stop() for a bit so things have a chance to react (not sure how gracefully stop() quits) but maybe that’s not even needed.
camera.setViewPort(0f, 1f, 0f, 1f);
I’ve been assuming that you’re intentionally making the viewport smaller in your application at startup - is that incorrect?
Btw, I just tested OpenGL outside Java with glmark2; it was windowed and the problem didn’t happen. It must have something to do with LWJGL or some Java thing…
The problem is, if it is saved it will be restored, and there is no check (made by lwjgl) to see if it is in windowed mode before storing it…
A bad temporary workaround/“fix” was to give the user an initialization command option that fixes it by hacking the private static field XRandR.savedConfiguration to null, see here.
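For reference, that hack can be done with plain reflection; a sketch, assuming LWJGL 2’s org.lwjgl.opengl.XRandR class with a private static savedConfiguration field (as described above) is on the classpath — the class and method names here are made up for illustration:

```java
import java.lang.reflect.Field;

public class XrandrRestoreHack {

    /**
     * Nulls LWJGL 2's private static XRandR.savedConfiguration so the
     * display configuration is not "restored" when the application exits.
     * Returns false if LWJGL is not on the classpath or the field changed.
     */
    public static boolean disableResolutionRestore() {
        try {
            Class<?> xrandr = Class.forName("org.lwjgl.opengl.XRandR");
            Field f = xrandr.getDeclaredField("savedConfiguration");
            f.setAccessible(true);
            f.set(null, null); // static field: instance argument is null
            return true;
        } catch (ReflectiveOperationException e) {
            return false; // LWJGL absent, field renamed, or access denied
        }
    }
}
```

Being a private-field hack, this can break on any LWJGL update (or under a SecurityManager), which is why forking LWJGL or finding a newer compatible version is the cleaner fix.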
A good fix would be to fork LWJGL or to find a newer JME-compatible version; if someone finds one, I would like to know :).
Offtopic about GPUs
@louhy maybe red :), it depends on how much people are complaining; the same research I did in the old days hehe.
Nvidia does make great hardware, but as Linus was saying in that video, I don’t appreciate their closed nature and apparent hostility towards user freedom. AMD seems far more willing to work with their customer base directly, and nowadays is generally a far more cost-effective deal, so they say. I don’t see anything like this coming any time soon, but if AMD were to drop out of that market entirely, it’d be terrible… so I try to support the underdog when I can. I haven’t had any need to go back to NV for years now. So no, I don’t own AMD stock or anything.