HORRIBLE fps after upgrading my video card?

@icamefromspace said: Have you tried not getting the drivers from the ATI website and instead using the drivers that Windows distributes? Windows Update will distribute a slightly older version, but I tend to find that they work better.

Usually the drivers you get on the ATI website are the “bleeding edge uber beta” drivers that they put out to “compete” with NVIDIA, who does the same thing. Generally I find they aren’t so reliable.

Additionally, the drivers you get from ATI or NVIDIA directly come bundled with their crapware, which could potentially be running with settings like power-saver mode or customized quality settings and other stuff, which I think is all kind of annoying.

This is good advice, too.

Though for Intel cards, I think the advice would be to always go to the vendor. I’d steer that way for NVIDIA also.

The drivers in Windows Update are the ones from the vendor, just not bundled with the special control panel software they have; it’s just the driver. If you click on the details for the driver, it tells you what the version number would have been had you gotten it directly from the website.

@icamefromspace said: The drivers in Windows Update are the ones from the vendor, just not bundled with the special control panel software they have; it’s just the driver. If you click on the details for the driver, it tells you what the version number would have been had you gotten it directly from the website.

Yeah, but for Intel cards it seems the Windows-provided drivers are older and buggier than what you get from Intel.

Hi Daniel, thanks for sharing your ideas.

Here are the drivers I tried, all of which made absolutely no difference in my case: still 10fps vs. 70fps on my old NVIDIA card:

  1. Windows Update couldn’t find more appropriate drivers than GENERIC VGA CARD… (so I was stuck at 320x240)
  2. Drivers shipped on the installation CD
  3. Latest drivers from the website (but NOT the beta ones)
  4. Latest beta drivers from the website

None of the above made a difference, except for 1), which of course did not work at all.

I am absolutely puzzled as to why a brand-new $250 graphics card performs so poorly on JME3. What do you guys think about the Unigine benchmark results I posted? Is this normal? The animation looked smooth to me. I would figure that 60fps on the HIGH setting is normal for such a realistic benchmark scene, but I wouldn’t know for sure. Can you run that benchmark too, @netsky08, and post your results here please? Please run both the OpenGL renderer and the DirectX renderer, as there is a big difference in fps between the two.

Thx

Not completely related to your issue… but doesn’t it make you rage to see that you get something like 68fps with DirectX and 42 with OpenGL? The common folk will conclude “DirectX is faster than OpenGL” from this, because they can’t even imagine that it’s a vendor driver issue.
IMO ATI dug their own hole… over the years… and now it’s getting deep to the point that it’s gonna collapse…

It f***** does, Rémy. I totally understand that point of view.

I want to add that at all times during the execution of the JME3 app, my CPU sits at 13% and my GPU at 99%. I tried overclocking my GPU core to 1175MHz (+75MHz) and RAM to 1475MHz (+75MHz), and it doesn’t change anything fps-wise.

@pspeed said: Maybe dump the settings to the console just to be sure the defaults aren't crazy.

And/or let the settings dialog show up so that you can more easily play with different settings to see if they matter.

Also, have you tried running a brand new Basic Game project? What about the JME tests?

Perhaps working forward from there can help you find whether there is a feature you are using that your new card just doesn’t like. At least that would fit all of the evidence presented so far (including that other people have no problem running apps other than yours).

Good point Paul, I’ll try a couple of tests, but the problem is I don’t recall what fps I was getting with the older NVIDIA card, so I wouldn’t be able to compare. My brother-in-law has an older Radeon HD 6850, so I might compare against his card tonight, running exactly the same tests; we’ll see if my card is defective or what…

QUESTION: Is there a JME3 test that contains everything, like a heavy test? All the small tests ridiculously hit 500+fps, even the WaterFilter test and such, but that doesn’t really tell much, because even very old cards give good fps on single tests.
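
On measuring fps for a comparison like this: jME3’s stats display already shows fps, but if you want a number you can log and compare across machines, averaging over a window of frames is more stable than an instantaneous reading. A rough stdlib-only sketch of the idea (the class and method names here are made up for illustration, not a jME3 API; in jME3 you would feed it the `tpf` value you get in `simpleUpdate`):

```java
// Averages frames-per-second over ~1-second windows, instead of
// reporting the instantaneous (noisy) per-frame value.
public class FpsAverager {
    private float elapsed = 0f; // seconds accumulated in the current window
    private int frames = 0;     // frames counted in the current window

    /** Feed per-frame time in seconds; returns the window's average
     *  fps once ~1s has elapsed, or -1 while still accumulating. */
    public float update(float tpf) {
        elapsed += tpf;
        frames++;
        if (elapsed >= 1f) {
            float fps = frames / elapsed;
            elapsed = 0f;
            frames = 0;
            return fps;
        }
        return -1f;
    }

    public static void main(String[] args) {
        FpsAverager avg = new FpsAverager();
        float fps = -1f;
        for (int i = 0; i < 60; i++) {
            fps = avg.update(1f / 60f); // simulate a steady 60fps frame time
        }
        System.out.println(fps); // close to 60
    }
}
```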

In the meantime, I tried overclocking the card further; I’m now at +100MHz core and +100MHz RAM and, to my big surprise, in JME3 it makes ABSOLUTELY NO DIFFERENCE… I’m still at a horrible 10fps.

@pspeed said: Maybe dump the settings to the console just to be sure the defaults aren’t crazy.

And/or let the settings dialog show up so that you can more easily play with different settings to see if they matter.

Yes, I saw your message. I’m sorry, but I’m still wondering how exactly to do that. Any advice please? What code should I type exactly?

System.out.println("Settings:" + settings);

…and/or…
app.setShowSettings(true);

…depending on which part was confusing.
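
As an aside, jME3’s `AppSettings` is essentially a `HashMap<String,Object>`, which is why concatenating it into a string produces the `key=value` dump you see. A minimal stdlib-only sketch of that idiom (the keys and values here are just illustrative stand-ins, not the real jME3 defaults; a `TreeMap` is used only so the printed order is deterministic):

```java
import java.util.Map;
import java.util.TreeMap;

public class SettingsDumpSketch {
    public static void main(String[] args) {
        // AppSettings extends HashMap<String,Object>, so string
        // concatenation falls back to the map's toString().
        Map<String, Object> settings = new TreeMap<>();
        settings.put("VSync", false);
        settings.put("Samples", 0);
        settings.put("Renderer", "LWJGL-OpenGL2");
        // Same idiom as System.out.println("Settings:" + settings);
        System.out.println("Settings:" + settings);
        // → Settings:{Renderer=LWJGL-OpenGL2, Samples=0, VSync=false}
    }
}
```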

Oh OK, as simple as that. Well, OK here is the output:

[java]
run:
Settings:{UseInput=true, MinHeight=0, AudioRenderer=LWJGL, Height=768, MinWidth=0, Title=Test 7g, Renderer=LWJGL-OpenGL2, Icons=[Ljava.awt.image.BufferedImage;@6083dc9b, BitsPerPixel=24, Fullscreen=false, StencilBits=0, DepthBits=24, SettingsDialogImage=/com/jme3/app/Monkey.png, VSync=false, Frequency=60, Width=1280, Samples=0, DisableJoysticks=true, FrameRate=-1}
[/java]

Are you seeing anything weird in there?

@.Ben. said: Oh OK, as simple as that. Well, OK here is the output:

[java]
run:
Settings:{UseInput=true, MinHeight=0, AudioRenderer=LWJGL, Height=768, MinWidth=0, Title=Test 7g, Renderer=LWJGL-OpenGL2, Icons=[Ljava.awt.image.BufferedImage;@6083dc9b, BitsPerPixel=24, Fullscreen=false, StencilBits=0, DepthBits=24, SettingsDialogImage=/com/jme3/app/Monkey.png, VSync=false, Frequency=60, Width=1280, Samples=0, DisableJoysticks=true, FrameRate=-1}
[/java]

Are you seeing anything weird in there?

Is a refresh rate of 60 a valid setting for your video card/monitor? This could potentially cause issues if not.

Hi Chris,

Aren’t you living in the USA though? That makes both of us North Americans. Everything runs at 60Hz for us; it’s the default base refresh rate for anything electrical. Rémy and other Europeans might say they run on 50Hz, but definitely not us in North America. I’m 100% certain about this, and I can see nowhere in the specs where it says an R9 270X can’t accept a 60Hz refresh rate. I did a quick Google search and all I could find is that the card also supports faster refresh rates… but it definitely supports 60Hz, that’s for sure.

EDIT: Can you see anything fishy here? I’m talking about the RAM used by the GPU… it seems weird to me. It’s 4GB of GDDR5, but… it looks weird, no? I also showed the supported resolutions; the 60Hz refresh rate appears as supported on the right side of the screenshot.

@.Ben. said: Aren't you living in the USA though? That makes both of us North Americans. Everything runs at 60Hz for us; it's the default base refresh rate for anything electrical. Rémy and other Europeans might say they run on 50Hz, but definitely not us in North America. I'm 100% certain about this, and I can see nowhere in the specs where it says an R9 270X can't accept a 60Hz refresh rate. I did a quick Google search and all I could find is that the card also supports faster refresh rates... but it definitely supports 60Hz, that's for sure.

For the record, even in the US you can have monitors with varying refresh rates, and the available refresh rates at different resolutions can vary widely. I have a monitor here that, at full resolution and full screen, cannot do 60Hz. LCD-based monitors may run at 60Hz or slower depending on how they choose to list their refresh rates. Tube monitors can be all over the place. My preferred resolution on an old Dell monitor was at 75Hz, which did weird things under fluorescent lights.

The point is that it has nothing to do with the electricity coming out of the wall… which is rarely exactly 60Hz anyway.

To your actual problem: refresh rate doesn’t matter in windowed mode, as it’s ignored. You can only run at whatever the desktop is configured for anyway… so bits per pixel and refresh rate are not even sent through, as I recall.
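
To make that concrete, here is a sketch of the fullscreen case using jME3’s standard `AppSettings` setters (an untested configuration fragment; `app` stands in for your `SimpleApplication` instance). `Frequency` and `BitsPerPixel` only come into play once `Fullscreen` is true:

```java
import com.jme3.system.AppSettings;

// Sketch: Frequency and BitsPerPixel are only honored in fullscreen;
// in windowed mode the app inherits the desktop's refresh rate and depth.
AppSettings settings = new AppSettings(true); // true = load defaults
settings.setResolution(1280, 768);
settings.setFullscreen(true);
settings.setFrequency(60);      // refresh rate in Hz (fullscreen only)
settings.setBitsPerPixel(24);   // color depth (fullscreen only)
app.setSettings(settings);      // 'app' is your SimpleApplication
app.setShowSettings(false);     // skip the dialog, use these values
```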

@pspeed said: To your actual problem: refresh rate doesn't matter in windowed mode, as it's ignored. You can only run at whatever the desktop is configured for anyway... so bits per pixel and refresh rate are not even sent through, as I recall.

My bad… didn’t see that the app was being run in windowed mode. =)

What motherboard are you using?

https://imageshack.com/i/n7eg1ap
https://imageshack.com/i/nd98yup
It appears you are right: AMD/ATI performs about a third worse in OpenGL than in DirectX.
Granted, I never noticed it was that much Oo
Still, I’d say it’s reasonably high fps (in games) for my needs, as I tend to disable lots of fancy stuff, but I guess if you enable all the things you will run into problems.

Which leads me to the conclusion: ATI/AMD cards are the way to go to test your jME3 game’s performance ;D

NB: About the general fps, I only tested on medium because I have 2 big screens and wanted to be able to spot the fps difference (which is what we are after, as far as I understood).
Furthermore… the card is getting old for benchmarking :wink: