Any recommended new GPUs that work well with JME3?

Ah yes … Linus Torvalds said “F*** NVIDIA” and pointed one of his fingers at the camera (I don’t remember which one, but it looked funny :chimpanzee_wink:). He said that because of their poor Linux support.

So you could spend $114 on a GTX 750; it’s on sale right now.

750s can output 4K, and the card I posted below does not need additional power (95% sure). So if gaming is not a priority, this should take care of everything you would need.
Frame rates in JME3 also rely heavily on the CPU. My Intel box (i5-2500K) with a GTX 680 gets about 12k fps in JME3, while my AMD box (FX-8350) with that same GTX 680 moved over from the Intel box gets 22k fps.
Otherwise, a 960 (which can drive 4K), which you can find for $200 and which benchmarks slightly higher than my GTX 680, will take care of your needs with more room to game. I can speak at length about getting everything set up properly on Linux and about the hardware (I use JME3 with many Linux distros, and I build computers for a living).

This. You can’t optimize what you can’t even run. Or if it barely runs then you waste lots of time.

Best to have one really good machine to develop on and one kind of crappy one to test on when you want to baseline performance.

Yes, I have a crappy one for such things. Others should too, you’re right. :chimpanzee_smile:

But you are kind of right … this whole QA thing gives me headaches. Whenever I see someone posting “works on this phone but not on that one”, I think: “Ouch … later I will have to deal with that too.”

Note: it’s also important that the crappy one is at least reasonable.

I had some quad-core Xeon Dell here that is good for a server but bad for Mythruna. Even after I upgraded the GPU, I couldn’t make any performance improvements make sense. My guess is just that the bus is so bad that normal optimization strategies don’t even work. And anyway, the more time goes by, the less that level matters. :wink:


Since there’s a billion and a half replies, I want to post this up here, for all to see…

So what happens if someone has an AMD card and your application doesn’t work well for them? Is there anything we can do to fix the issues?

Also, there were some recommendations for the GTX 750, however it says its max supported resolution is 2560x1600, so that wouldn’t work for 4K.

I also wanted to ask, since someone mentioned JMonkey also relies heavily on the CPU, if there’s a recommended CPU?

I’ve heard great things about the i7-4790(K), I believe that’s what it’s called…

Basically looking for new parts, but there is so much to choose from >(.

Thanks all :slight_smile:

So it’s my fault then? LOL… This guy legit paid like $400 for a Radeon card and was getting 1-15 fps, I believe.

I might go dig up that thread just to share :smile:

Good call on the R9 280. I was looking at a 380, which is basically the same card, so… NTY AMD!

I also notice, when people are doing benchmarks, it’s ALWAYS DirectX 11 benchmarks… Never OpenGL… It seems OpenGL isn’t as loved anymore on the Microsoft gaming side… But it’s used everywhere LOL.

Thanks for the info. Yeah, if your code is poo, non-optimized and all that memory-leak-loving jazz, then I can see stuff going down fast…

I currently use a GT 550M and it works pretty much fine for my uses, but it is a little slow and choppy/motion-blurry…

I’m doing a bit for myself, and a bit for the general public, so I do appreciate the video card breakdown with Steam.

Awesome, what kind? There are a billion different kinds. One person said the 2/4GB doesn’t make much of a difference (not sure how that is), and another said that 4GB is what you want…

There are so many companies… I was looking at MSI, from the link I posted; it seemed to be a nice middle-of-the-road one, coming in second on most of the tests.

Oh, yeah, for sure, I don’t need to spend a crazy amount for sure :slight_smile: .

Oh, what version of OpenGL does JME 3.0 work with? I’m assuming 3.1 will be up to date when it comes out…?

Yeah, that’s the thing, I wanted to make sure my application would run on a multitude of card types, so I wanted to buy a few, but I’m not sure if that’s the best idea.

The 750 huh? How much would that differ from the 960?

As for the “deferred rendering pipeline”, I had to look that up, and it seems to deal with shaders, which I haven’t dealt with. Not sure if you want to explain a bit about that.

Well, I have this GT 550M which I will use for my “slow test” :P, but it will be good to see how a good GPU can handle it all, for sure.

So you think the 750, and not a slightly better card? Again, I'm not too sure what the differences between that and the 960 are, which is why I ask :slight_smile: .

Thanks!

Thanks for this info, this is interesting!

So is DX12 out? I heard it is, but then I heard that the new “Pascal” cards will support it, so I wasn’t sure if it was out, or not… I also heard that SLI will add VRAM together, instead of copying it, so SLI GeForce cards should get a huge boost.

Not sure why they would be slower in DX12, but that doesn’t sound good. An additional feature shouldn’t be the end-all for that, but I guess if it’s always on then it’s just the card being itself :stuck_out_tongue: … But still…

Thanks for the interesting info.

Yeah. I’ve always been told to have the slowest possible computer to test on, so if it works with that, it will work on any machine out there…

I will still be using this old laptop for testing; I think it’s a perfect test computer :slight_smile: .

Thanks for the tips, and I agree. I have this laptop for slow shizz, but I would like to get something that runs really well, with no hiccups. I guess the issue is going to come with this 4K monitor…

The 970 is the one with the 3.5/4GB RAM issue… right?

What’s the “Oxide vs NVIDIA” thing? I’m waiting for Pascal, because if you look up Pascal cards, you’ll see they will shizz all over the Maxwell cards.

“proprietary” meaning Mac/Windows???

Good to know about Linux, but that leaves the rest of the AMD world…

First off… Nice Icon Mega Man Ftw…

Second… Maybe I’ll take a look at it… Not too sure what the differences between that and the 960 are, but it’s almost half the price…

What would “additional power” be? A bigger PSU? I haven’t picked one out yet…

Good to know about the CPU, because as I ran my game, my i5-2410M was hitting 50-98% CPU at the start screen, but I'm not too sure if it’s fully running, as I’m in the middle of editing shizz lol >(.

I’m not too sure what I’m going to do, as I’m looking at building a new computer.

The 2500K I heard is a great CPU; I also have an i5-2400 that I could use, but I might just get another computer.

Insane amount of FPS… I don’t think I can get anywhere near there with my laptop lolol >(.

If the 960 is just above a 680, where does the 750 fit in?

I don’t use Linux, but I appreciate it… So Idk what I should do… hmm… 22k fps sounds nice LOL…

Do you like MSI as a company? I’ve seen a lot of people recommend EVGA at first, then I saw a lot of MSI recommendations…

Thanks :smile:

Yeah uh… I had to do a geometry batch optimization (yeah, I forgot the class name… :p) within JME to get my shizz to even run LOL >(.
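
(For reference, the class being half-remembered is probably `GeometryBatchFactory` from `jme3tools.optimize`. Here is a minimal sketch of how it is typically used; the node name `blocksNode` and the assumption that its children share materials are mine, not from the original post:)

```java
import com.jme3.scene.Node;
import jme3tools.optimize.GeometryBatchFactory;

// Inside simpleInitApp() or similar; "rootNode" is the usual
// SimpleApplication field. Assume "blocksNode" already has many
// small Geometry children attached, ideally sharing few materials.
Node blocksNode = new Node("blocks");
// ... attach lots of Geometry children here ...

// Merge the child meshes into a handful of larger ones so the
// renderer issues far fewer draw calls per frame.
GeometryBatchFactory.optimize(blocksNode);
rootNode.attachChild(blocksNode);
```

The trade-off is that batched geometry can no longer be moved or culled individually, so this is best suited to static scenery.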

That’s why I want 1 good, and 1 poo machine :). I already got the poo, now I need the good.

Q/A ftw man… I don’t want to deal with it… But I want to know how to LOL… Which is why I asked all the way at the top… What do we do if our shizz doesn’t work for someone…

I mean there are millions of users who could possibly use an OpenGL application, so that means half of them could have issues with it? Insane…

Even the great pspeed couldn’t optimize it? How can the rest of us mortals fare… :stuck_out_tongue:

TBH it was probably Dell that did something to limit it, so that it wouldn’t work…

Dell locked my motherboard’s BIOS so that it cannot run RAM faster than 1333MHz. I have a 1600MHz Corsair kit, and it got clocked down… I could install a custom BIOS, but fuggetaboutit… :stuck_out_tongue:

I wonder how hard it would be to make a custom BIOS… I was reading a Stack Exchange post that said you will brick many motherboards before you even get started LOL…

Eh, they are pretty much all the same card no matter what company; they all use the same reference board. You might get a quieter fan or perhaps some factory overclocking the more you spend.

Thanks!

I heard that the 4GB isn’t really utilized on the 960; people were saying it’s just not worth it, and I also heard that it has the same issues as the 970…

sigh… lol…

Idk what to go with >(.

Well, 4GB probably won’t be used at the moment on any card, but give it 6 months, when more GPUs have 4GB, and devs will probably start using it. Obviously I don’t have a crystal ball though lol.

For jMonkeyEngine? Probably 2GB is all you need.

The main reason I chose the 960 over the 780/7xx was that it will have better DirectX 12 support (i.e. the 700 series doesn’t have it at all), and AFAIK that has some pretty good performance implications. Also, it’s a newer-gen card, thus its performance comes at a much lower power usage cost and it will generate less heat.

At the moment, when most games don’t use DirectX 12 and are stuck on 11, the 7xx series is probably a little faster than the 9xx series but uses much more power, and when it comes to DirectX 12 the 9xx series will perform much, much better.

But this is all guesswork on my part lol

Yeah, I’m not sure if it’s the games themselves, or if the hardware itself just isn’t good for the 4GB.

Thanks for the info…

Since we don’t use DirectX, I guess the point is moot…

so there’s a huge power consumption issue?

I also read on Amazon that the 750 Ti doesn’t support 4K, but Nvidia’s site says it does… I would go with Nvidia’s site myself :stuck_out_tongue: .

Thanks!

Weeeeell, it’s more that more power == more heat == louder fans. Also, IIRC the 7xx series recommended a higher-wattage power supply than the 9xx’s.

I wasn’t worried about my electricity bill or anything.

For sure, just might not have had the right PSU for it :P.

Yeah … Idk what to do… Pascal, come out already (prob issues with that too)…

Ugh… ><(

When it comes to components I get the same way and start freaking out over two barely different revisions of the same CPU LOL.

Sometimes you gotta just flip a coin and not worry about it. If it really was the wrong choice, you can always sell the card and buy a different one. There is never a “right” time to buy PC components.

For sure lol :stuck_out_tongue:

This guy made an interesting comment

GTX 960 is a bad buy all around; it is priced too high for what it offers and hardly improves performance over last gen’s price equivalent. Get a second-hand 770 and save your money for when Maxwell gets cheaper and more refined, and you have a full 256-bit bus on a fully enabled GK104 GPU without any fuss whatsoever. 4GB on midrange cards is still worthless, just as it was with Kepler. You may see it being used, but only because the space is there, not because the GPU can actually work with all that data. SLI on two midrange cards offers a marginal performance advantage, but you lose your advantage in price because midrange GPU prices drop faster than high-end cards do, so if you are selling your SLI setup it will cost you money compared to selling a high-end card.

960 4GB → avoid because 4GB won’t get you anywhere compared to the cheaper 2GB version
960 SLI → avoid b/c of price and fast value drop
970 SLI → avoid b/c of the 3.5GB issue (which will ONLY really show itself when you SLI this)

970 single → good buy for 1080p/1440p
980 single → highly priced buy for 1440p and up
R9 290 → still best price/perf ratio at this time, provided you get a solid cooler

That about sums it up. Stop wasting time on useless details that you will never even experience in-game and that only show up in synthetic testing.

full 256-bit on a GK104 GPU…? @pspeed what’s this useful for?

I always see different values, sometimes 256-bit, sometimes 512-bit, sometimes less… Not too sure what that’s about…

Doesn’t sound like it recommends anything LOL!

thoughts?

Idk wtf >(

My opinion on Nvidia, and why I kinda dislike them.

Anyway, apparently I’m one of the few AMD users here who never had any serious problems with the card.
Wrongly working shaders are usually developer errors; the AMD cards usually behave 100% to the GLSL specification, while the NVIDIA ones let some errors slip.
So guess what happens if the developer works on an NVIDIA card and the customers have AMD cards.

In compatibility mode. Moral of the story: don’t use compatibility mode. It was a JME mistake that has been corrected.
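
(If anyone wants to check which profile their app is requesting, here is a minimal sketch assuming a jME 3.1-style `AppSettings`; the class name `MyGame` is just a placeholder and the exact renderer constant may differ between engine versions:)

```java
import com.jme3.app.SimpleApplication;
import com.jme3.system.AppSettings;

public class MyGame extends SimpleApplication {

    public static void main(String[] args) {
        MyGame app = new MyGame();
        AppSettings settings = new AppSettings(true);
        // Ask for the OpenGL 3 core profile instead of the legacy
        // compatibility profile (constant name varies by jME version).
        settings.setRenderer(AppSettings.LWJGL_OPENGL3);
        app.setSettings(settings);
        app.start();
    }

    @Override
    public void simpleInitApp() {
        // scene setup goes here
    }
}
```

Under the core profile the driver rejects deprecated fixed-function calls up front, which is roughly what “don’t use compatibility mode” buys you.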

ATI drivers were a joke. They crashed often and barely supported OpenGL. They may have gotten better, but I still don’t trust them. And my own experience is that performance falls off RAPIDLY once you get outside of what benchmarks are testing, whereas nvidia is more flexible across the board. Yes, for performance testing you want to use ATI/AMD for that reason… but for day-to-day development, I’d rather have the card that lets me run any crazy thing I want at least once without bluescreening my OS.

Note: I’m biased because I’ve been doing OpenGL dev since 2000 or so and ATI really was a laughing out loud joke in the OpenGL community for a decade.

Holy shit that was funny…

That’s interesting to note… Have you used AMD with JME for a long time? What card would you recommend?

I would like to get 2 cards, just so we can make sure it runs on everything, but I'm not sure if that’s card specific or what… thoughts?

Thanks :slight_smile:

What’s “Compatibility mode?” :stuck_out_tongue:

Yeah, it’s possible after 15 years things changed, but I can feel your resentment as an “oldschooler” :p.

“Back in my day…”

I feel like getting 1 of each card… not sure if you think that’s wise or not for dev purposes?

Yep, no argument here, AMD drivers were pretty crap until like 5 years ago, but with the new knowledge they needed for the consoles, they are really improving the situation. I at least never had a real problem with AMD cards, and my R9 (somewhat under 200€) works fine for all games so far.

(I adopted the strategy of buying a mid-grade card every 2-3 years instead of a high-end one every 5. It is still cheaper in total, and I get better performance on average, as the high-end cards usually get overtaken within 2 years by the cheap ones.)

Is it possible to have 1 Radeon and 1 GeForce card running at the same time (non-SLI/Crossfire, of course)…

The question is… how many African villages can you power with your gpu…

If you use the medium cards, they are pretty good energy-wise, and I would not recommend mixing, as you will get serious driver problems. I cannot really recommend mixing different cards even from the same manufacturer, after a cheap trial of using a 30€ card as a tertiary monitor adapter :P (Well, at least on Windows; it worked surprisingly well under Linux, even allowing OpenGL frames spanned over all monitors at the same time.)