HORRIBLE fps after upgrading my video card?

@t0neg0d said: What motherboard are you using?

990FXA-UD3 from Gigabyte.

@netsky08 said: Which leads me to the conclusion, ATI / AMD cards are the way to go to test your jme3 game's performance ;D

Oh yeah, that's what I was thinking about in my sleep. I thought it was actually an ideal situation: if I had kept developing with the GTX 650 Ti, I might have ended up with lots of people with Radeon cards for whom the game would have been unplayable. I mean, this card performs so badly in JME3 that I have to find a way to rewrite my project so it runs smoothly on MY card, and if it does, it will run at 200+fps on GTX cards, you know? So I kind of have the best (worst) card to test it on :stuck_out_tongue:

Thank you for benchmarking, but… you used completely different settings than I did. I had details on HIGH and resolution at 1680x1050. Can you retry with those settings, just as a fair comparison, please? :smiley: Thx

@.Ben. said: Thank you for benchmarking, but… you used completely different settings than I did. I had details on HIGH and resolution at 1680x1050. Can you retry with those settings, just as a fair comparison, please? :D Thx
@netsky08 said: NB: About the general fps, I only tested on medium because I have 2 big screens and wanted to be able to spot the fps difference (which is what we are after as far as I understood)

If I go with your settings, both are rather low: DirectX around 14-18, OpenGL around 10-15. So I figured there was no point in posting those.

Oh really? Well… I guess the fps I got in the Valley benchmark makes sense then, compared to your 6850, which is exactly my brother-in-law's card, the one I'll be testing my JME3 project on tomorrow. Since the 6850 gets about 1.7x worse fps in the benchmark in both APIs, that should mean it will get 5-6fps instead of the 10fps I'm getting with this R9 270x, right? I really don't think so, but… we'll see. IT'S PRETTY WEIRD.

I re-ran the benchmark with your settings. I missed that you were using a smaller screen; of course that changes things a bit:
OpenGL: average: 20, min: 7, max: 40
DirectX: average: 36, min: 9, max: 76

Seems that this topic comes up a lot on Minecraft forums with your video card. That’s another Java/OpenGL game, isn’t it?

Anyway, perhaps one of the threads I found in a Google search has some tips about increasing FPS, or at least a hint as to what the specific issue is? Worth looking into. /shrug

EDIT: I couldn't find anything specific about your motherboard/video card combination… however, there was a topic about that motherboard and a similar AMD card with performance issues. Are you running the latest BIOS?

@netsky08 said: I re-ran the benchmark with your settings. I missed that you were using a smaller screen; of course that changes things a bit: OpenGL: average: 20, min: 7, max: 40 DirectX: average: 36, min: 9, max: 76

Wow, that’s even more aggravating. Thank you for this report, Sébastien.

Hi again Chris,

Yes, this system is brand new as of last week, and I verified that I'm indeed running the latest BIOS. I'm on the latest (F2), which dates back nine months, to 2013/07/22.

Tomorrow I'll be able to prove whether this motherboard/video card combination is incompatible, because my friend will install my card in his Crosshair V Formula, which is a very high-end AMD motherboard. We'll see how the benchmark and my JME3 project run on his system with my card. I also have another brand-new R9 270x in the box, which we'll try as well, just to see if /my/ card is defective.

But honestly, the more I investigate and read on forums, the clearer it is that this card is way overpriced for its OpenGL performance. It really does do better with DirectX, as Rémy was suggesting. So all in all, I'm not really happy with this card, but I am happy to have it, because if I can manage to make my JME3 project run at 60fps on THIS card, it means it will pretty much run on every other system at way more than 60fps.

@.Ben. said: Hi again Chris,

Yes, this system is brand new as of last week, and I verified that I'm indeed running the latest BIOS. I'm on the latest (F2), which dates back nine months, to 2013/07/22.

Tomorrow I'll be able to prove whether this motherboard/video card combination is incompatible, because my friend will install my card in his Crosshair V Formula, which is a very high-end AMD motherboard. We'll see how the benchmark and my JME3 project run on his system with my card. I also have another brand-new R9 270x in the box, which we'll try as well, just to see if /my/ card is defective.

But honestly, the more I investigate and read on forums, the clearer it is that this card is way overpriced for its OpenGL performance. It really does do better with DirectX, as Rémy was suggesting. So all in all, I'm not really happy with this card, but I am happy to have it, because if I can manage to make my JME3 project run at 60fps on THIS card, it means it will pretty much run on every other system at way more than 60fps.

Good thoughts for sure! Did you ever try taking out the water filter to see what sort of impact that was having on your framerate? (sorry… brain jumps around a bit in the morning)

@.Ben. said: But honestly, the more I investigate and read on forums, the clearer it is that this card is way overpriced for its OpenGL performance.

…said about every ATI card ever. :wink:

@.Ben. said: It really does do better with DirectX, as Rémy was suggesting. So all in all, I'm not really happy with this card, but I am happy to have it, because if I can manage to make my JME3 project run at 60fps on THIS card, it means it will pretty much run on every other system at way more than 60fps.

As I mentioned before, it would be interesting to see the performance of the raw blue cube basic game and then you can maybe start adding things until performance falls off. Narrowing it down to one or two specific features might be helpful for everyone… especially if they turn out to be things with acceptable fallbacks.
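For concreteness, the raw baseline I have in mind is essentially the stock "blue cube" hello-world, something like the sketch below (the class name is just illustrative):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.math.ColorRGBA;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Box;

// Bare-bones baseline: one unshaded blue box, no lights, no filters.
// If even this is slow, the problem is below JME (driver/OS); if it's
// fast, re-add your scene piece by piece until the fps falls off.
public class BlueCubeBaseline extends SimpleApplication {

    public static void main(String[] args) {
        new BlueCubeBaseline().start();
    }

    @Override
    public void simpleInitApp() {
        Geometry geom = new Geometry("Box", new Box(1, 1, 1));
        Material mat = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md");
        mat.setColor("Color", ColorRGBA.Blue);
        geom.setMaterial(mat);
        rootNode.attachChild(geom);
    }
}
```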

Also, have you checked your game's performance in full screen? One of the reasons I suggested turning the settings dialog back on is that you'd be able to try a bunch of different options… including running full screen (which kills the SDK when I do it, so it's good to be able to tweak the settings each run). For example, it could be a combination of your app and the OGL->desktop integration that is slowing things down even more.
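And if clicking through the dialog every run gets old, the same combinations can be scripted from main(); a rough sketch, again with an illustrative class name and your scene setup omitted:

```java
import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class FullscreenTest extends SimpleApplication {

    public static void main(String[] args) {
        FullscreenTest app = new FullscreenTest();

        AppSettings settings = new AppSettings(true); // start from the defaults
        settings.setResolution(1680, 1050);           // match the windowed test
        settings.setFullscreen(true);                 // flip to false for windowed

        app.setSettings(settings);
        app.setShowSettings(false); // true = show the dialog and tweak per run
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // attach your scene here; even an empty scene is enough to sanity-check
        // that the fullscreen context itself isn't the problem
    }
}
```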

Yes Chris, all those questions were covered in the first 2 pages of this thread. If I disable EVERY post filter, the fps goes up to nearly 70, which is what my older nVidia card was giving with EVERYTHING activated. If I disabled all post filters on it, it was more like 200+fps… so there's definitely a huge gap for no apparent reason.
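For reference, my all-on/all-off test is basically attaching and detaching the whole FilterPostProcessor, roughly like this sketch (a single BloomFilter standing in for my real filter stack):

```java
import com.jme3.app.SimpleApplication;
import com.jme3.input.KeyInput;
import com.jme3.input.controls.ActionListener;
import com.jme3.input.controls.KeyTrigger;
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.BloomFilter;

// Press F to toggle the whole post-filter stack and A/B the fps cost live.
public class FilterToggleTest extends SimpleApplication {

    private FilterPostProcessor fpp;
    private boolean filtersOn = true;

    public static void main(String[] args) {
        new FilterToggleTest().start();
    }

    @Override
    public void simpleInitApp() {
        fpp = new FilterPostProcessor(assetManager);
        fpp.addFilter(new BloomFilter()); // swap in your real filters here
        viewPort.addProcessor(fpp);

        inputManager.addMapping("toggleFilters", new KeyTrigger(KeyInput.KEY_F));
        inputManager.addListener(new ActionListener() {
            public void onAction(String name, boolean pressed, float tpf) {
                if (pressed) {
                    filtersOn = !filtersOn;
                    if (filtersOn) {
                        viewPort.addProcessor(fpp);
                    } else {
                        viewPort.removeProcessor(fpp);
                    }
                }
            }
        }, "toggleFilters");
    }
}
```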

Paul, in fullscreen mode, same resolution as the window itself, it makes absolutely no difference. Still at 10fps instead of 70fps with my older nVidia card.

It’s clear to me that the JME3 API has a lot of stuff that this card driver does not like.

@.Ben. said: Paul, in fullscreen mode, same resolution as the window itself, it makes absolutely no difference. Still at 10fps instead of 70fps with my older nVidia card.

It’s clear to me that the JME3 API has a lot of stuff that this card driver does not like.

Well, it seems to not like OpenGL very much… and JME is an OpenGL engine… so…

Some filters are more expensive than others… and the particular expenses of one filter versus another will poke at different kinds of weaknesses. For example, I have a card here that runs Mythruna at less than 2 FPS while it runs Minecraft fine, even with many of the beauty options turned on (this was some time ago, though)… whereas the difference between running Minecraft on that machine and on my other machines was not nearly as large.

At the time, Minecraft only used a few textures, while I had a different texture per material. This particular card seemed not to like that, for whatever reason.

It would kind of be interesting to know what the specific weaknesses are for this particular card. Maybe there are work-arounds.
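For instance, if per-material textures turn out to be the weak spot again, one cheap work-around to try is sharing a single material (and ideally a texture atlas) across many geometries so the driver sees far fewer state switches. A rough sketch; the atlas path is made up:

```java
import com.jme3.app.SimpleApplication;
import com.jme3.material.Material;
import com.jme3.scene.Geometry;
import com.jme3.scene.shape.Box;

// 100 boxes, but one Material instance: one texture bind instead of 100.
public class SharedMaterialTest extends SimpleApplication {

    public static void main(String[] args) {
        new SharedMaterialTest().start();
    }

    @Override
    public void simpleInitApp() {
        Material shared = new Material(assetManager,
                "Common/MatDefs/Misc/Unshaded.j3md");
        // "Textures/atlas.png" is a placeholder; point it at your own atlas.
        shared.setTexture("ColorMap",
                assetManager.loadTexture("Textures/atlas.png"));

        for (int x = 0; x < 10; x++) {
            for (int y = 0; y < 10; y++) {
                Geometry g = new Geometry("box", new Box(0.4f, 0.4f, 0.4f));
                g.setLocalTranslation(x, y, 0);
                g.setMaterial(shared); // the same instance, not a clone
                rootNode.attachChild(g);
            }
        }
    }
}
```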

“TestHWSkinning” gives these fps; does that look OK to you guys? Please note the HW skinning mode, TRUE or FALSE.
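(In case anyone wants to reproduce the comparison outside the stock test: as far as I can tell, the switch is SkeletonControl's hardware-skinning preference. A minimal sketch, assuming the Oto model from jme3-testdata:)

```java
import com.jme3.animation.SkeletonControl;
import com.jme3.app.SimpleApplication;
import com.jme3.light.DirectionalLight;
import com.jme3.math.Vector3f;
import com.jme3.scene.Spatial;

// Load an animated model, force the skinning path, then compare fps.
public class SkinningModeCheck extends SimpleApplication {

    public static void main(String[] args) {
        new SkinningModeCheck().start();
    }

    @Override
    public void simpleInitApp() {
        Spatial model = assetManager.loadModel("Models/Oto/Oto.mesh.xml");
        SkeletonControl sc = model.getControl(SkeletonControl.class);
        sc.setHardwareSkinningPreferred(false); // flip to true for HW skinning
        rootNode.attachChild(model);

        // Oto uses a lit material, so it needs at least one light to show up.
        DirectionalLight sun = new DirectionalLight();
        sun.setDirection(new Vector3f(-0.5f, -1f, -0.5f).normalizeLocal());
        rootNode.addLight(sun);
    }
}
```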

@pspeed said: It would kind of be interesting to know what the specific weaknesses are for this particular card. Maybe there are work-arounds.

I really hope so, because otherwise it would mean that half the people out there won't be able to play the games we develop with JME3, and that's totally unacceptable :frowning:

I have high hopes for JME3 and I'm investing a lot of time and effort into learning how to use it properly. That's why I couldn't believe that a $250 graphics card couldn't run my project at more than 10fps while a $150 nVidia card could run it at 70fps. I understand that the card does not like OpenGL, but what are we going to do? I'm hoping to start from scratch and code it in a way that lets the card keep pace, but I'm scared now that it will never work quite right. I mean, most people don't even have $250 to put down on a graphics card to begin with, so imagine how many people out there will never be able to run my app at 60fps… it's scary.

That's why you give the user the option to disable fancy graphics & special effects, so they get the chance to play the game after all.
And there are people like me who simply like “simpler” graphics.

I totally get it, but even without any effects it was barely running correctly on that card (70fps), and by “without any effects” I mean no shadows, no water, no bloom, no volumetric light, no scattering, no nothing… it's basically SHADERLESS. I mean, yeah, you can “play” it, but all you see is unshaded material :stuck_out_tongue:

And I'm not even done adding stuff, not to mention physics and game logic, so the fps will obviously be even lower when I'm done, and it's already barely what it takes to play the game smoothly (the “60fps barrier”), and that's a $250 AMD card… not a $100 one! There are plenty of people actually using a 6850 and such… I can't wait to try this exact JME3 project on a 6850 tomorrow. I have so many tests to do with another computer; I'll swap cards around and try a bunch of things to figure out exactly what's going on, and whether this nonsense is “normal because it's an AMD card” or not.

I want to say that this seems horrible… I hope that most AMD/ATI cards (when did AMD buy ATI, or were they always the same company?) don't have this issue, but one thing I'm curious about and have to ask: is there ANY overheating?

I actually had a “replacement” motherboard from Dell (the best company ever…) that would literally overheat itself, clock my Intel CPU down to 300 MHz via Intel SpeedStep, and freeze the comp.

Now, you mention that the card's clocks are low in the control panel (I cannot read whatever language you have there), but it's weird that they stay low. Is there any way to stop the GPU from clocking itself down? (There was a way to turn off Intel SpeedStep, but it still only clocked half of my CPU.)

This probably isn’t your issue, but I figured I should at least share my experiences with others, and that way we might figure out something weird.

I find it amazing that any company could talk OpenGL down… Last I checked, OpenGL has been around far longer than DirectX, and DirectX is only available on certain devices, whereas OpenGL is supposed to be portable to anything…?

Good luck, this stuff shouldn’t happen!

Hi Jason,

No, after 5 minutes of 99% utilization, the overclocked R9 270x (1175 MHz core, 1475 MHz RAM) sits at 47°C, which is very low for a fairly heavily overclocked card… but then again… we're talking about 10fps… sigh :stuck_out_tongue: Note that this card has 3 fans and is HUGE; you can't fit ANYTHING between the rear end of an ATX case and the drive bays. Not even a 12V wire fits in between; it's like the biggest card available, I think :stuck_out_tongue:

EDIT: Just realized that while pressing ALT-PRINT-SCREEN, the temp went down by 1°C to 46°C, so… 47°C really is the peak temp, I think.

And another thing, Jason: Dell really sucks, IMHO. Last week somebody asked me what to do with an ALIENWARE small-form-factor case that was OVERHEATING A FRICKIN' i7-3770, LIKE WTH!!! Before this AMD rig, this winter, I had an i7-3770K and a GTX 650 Ti; the computer was on 24/7 and I never managed to feel warmish air near the case. It was literally blowing COLDER THAN AMBIENT AIR. Dell does not know sh** about computers, really. They have no engineers testing their rigs, and ALIENWARE is an overpriced joke overall.

That was off topic, but it felt good to share this publicly.

ANYHOW, back to the issue: the card clocks about 10x slower when idle, but the second I launch my JME3 project it goes to 99% utilization on both the core and the RAM, so I figure this “sleep” or “eco-drive” power-saving feature auto-cancels itself, and I don't think there is a way to remove it. I'm using the latest beta driver and there is no option to disable it. I don't think it's the issue though, since it goes to 99% utilization as soon as I launch the JME3 project.

Thanks for sharing any ideas you might have. So many people have tried to help in the last 24 hours. I'm facing an overpriced card that hates OpenGL; that's probably the only explanation we can draw from this experience… and it shocks me so much I don't have words for it.