Graphics cards: Quadro vs. GeForce

I'm working on a simulator project using jMonkeyEngine, and now we need some more PCs for the setup. All our developer and lab PCs are Dell Precision T3400s with a Quadro FX 4600 graphics card.



I've done a lot of searching on the web, and it seems that the Quadro FX 4600 and the GeForce 8800 GTX are almost identical hardware-wise. My impression is that the Quadro is aimed at professional workstation use with 3ds Max, AutoCAD, Pro/ENGINEER and similar programs, while the GeForce is aimed at gaming. Our conclusion was that a gaming-oriented card should perform better, since our simulator is based on gaming technology. We therefore bought a GeForce GTX 295, one of the highest-end GeForce cards.



We inserted the GeForce GTX 295 into a new Dell Precision T7500 and expected a really good framerate, but we were disappointed: 20 fps showing 1.3 million triangles. Using the same new PC with my old Quadro FX 4600, we got 50 fps showing the same 1.3 million triangles.



I find the results a bit strange, since I would expect the GeForce GTX 295 to outperform the old Quadro FX 4600 by a whole lot…



Is there anyone else who has experience using both Quadro and GeForce cards?

Empire Phoenix said:

Simple idea: if they are (mostly) identical hardware-wise, is it possible to force the Quadro driver onto the GTX?


You can do a soft mod using RivaTuner on some of the cards; the option is called "unlock professional features" IIRC. It worked pretty well with that 8800 GTS I mentioned earlier, but it crashed Maya during Gelato renders (the cool thing at the time :P).

I've done a lot of testing with drivers and benchmarks. All the benchmarks (all OpenGL) have favored the GeForce card???



I've also tested terrain.TestIsland from jMonkeyEngine. The GeForce gets 320 fps and the Quadro gets 250 fps, both at 262k tris. A strange thing happens if I go into wireframe mode: the GeForce drops all the way to 84 fps, but the Quadro only drops to 216 fps.
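For reference, this is roughly how I toggle wireframe; a minimal sketch assuming jME 2's SimpleGame base class (the terrain-loading part is elided):

```java
import com.jme.app.SimpleGame;
import com.jme.scene.state.WireframeState;

public class WireframeTest extends SimpleGame {
    @Override
    protected void simpleInitGame() {
        // ... load terrain.TestIsland-style geometry into rootNode here ...
        WireframeState ws = display.getRenderer().createWireframeState();
        ws.setEnabled(true);          // flip to false for the filled-triangle run
        rootNode.setRenderState(ws);
        rootNode.updateRenderState(); // propagate the new state down the graph
    }

    public static void main(String[] args) {
        new WireframeTest().start();
    }
}
```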



I will try to add more islands to get up to 1.3M tris and see what happens.

I had both an FX 4600 and an 8800 GTS 320 MB (which was the slightly cut-down version of the 8800 GTX)… For DirectX gaming the 8800 was the better card, but for OpenGL performance, the Quadro was almost always the winner.



Since jME is based on OpenGL, it wouldn't be totally out of the blue for the Quadro to beat a GeForce, but it shouldn't be by that much, especially considering the generational gap between the two cards…



One thing about Nvidia, though, is that its drivers lack consistency: 19x.xx may support more cards and features, but 17x.xx may give better performance, while 18x.xx is filled with bugs. The Quadros are a bit better in this department, as Nvidia provides certified performance drivers for those cards, which seem more stable between releases (which, for Quadros, are not frequent anyway).



So my suggestions?

Try testing out drivers, and hit Guru3D or some other 3D digital content creation forums for advice on what works best with which cards. Be careful of advice from people claiming there's no difference between Quadro and GeForce, as well as people suggesting drivers based on DirectX performance :smiley:

It's rather strange that a GTX 295 would fail to render 1.3 million tris fast enough. I am thinking maybe proper optimizations are not being used. For example, are you using VBO? Is the shading of the triangles expensive, e.g. are textures, lighting or shaders used?
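For reference, enabling VBO in jME 2 looks roughly like this; a minimal sketch (the VboUtil wrapper is just for illustration):

```java
import com.jme.scene.TriMesh;
import com.jme.scene.VBOInfo;

public final class VboUtil {
    /**
     * Enables VBOs so the mesh's vertex data is kept in GPU memory
     * instead of being re-sent from system memory every frame.
     */
    public static void enableVbo(TriMesh mesh) {
        mesh.setVBOInfo(new VBOInfo(true)); // true = enable VBO for all buffer types
    }
}
```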

Momoko_Fan said:

It's rather strange that a GTX 295 would fail to render 1.3 million tris fast enough. I am thinking maybe proper optimizations are not being used. For example, are you using VBO? Is the shading of the triangles expensive, e.g. are textures, lighting or shaders used?


Assuming he's running the same code on both cards (for the sake of a fair comparison, and judging from the nature of his post, I'll assume he is), it's indeed very strange that the 295 was slower; the difference in available processing power is fairly large. Even with optimized code, though, this 'problem' would likely still be visible.

Simple idea: if they are (mostly) identical hardware-wise, is it possible to force the Quadro driver onto the GTX?

Did you install the drivers correctly? I have a GTX 295 and I'm able to run faster than that. Quick question though: how do other games run on your GTX 295? Are you able to get the rated performance for them? Try throwing in Call of Duty or something and see if you can match the max-resolution results from an online review of the GTX 295. (Remember to enable dual graphics cards.)



But try out new drivers, and in the future, if you are going to buy cards for your project, get the GTX 275: a 295 is really just two 275s sandwiched into one card, and I think jME (like any program not specifically made to use SLI) will only run on one GPU. The 275 is still an amazing card at less than half the price. (And you are really only running a 275 anyway, because you are only using half of your card.)



If you feel you really need the highest performance for your jME app, the 285 would probably give better performance than the 295 for the reasons stated above, because the 285 is a single-GPU card.

I think jME (like any program not specifically made to use SLI) will only run on one GPU

What makes you say that? What exactly needs to be done to make jME support SLI better? I think you're missing something here: SLI is something the driver should take care of, not the application.

Maybe I'm wrong about whose job it is (I've seen patches to games that mention SLI performance, so I assumed it was something games implemented). That may be a bad assumption, but I do get the same performance on a single GPU as I do on dual GPUs. Unless the FPS counter is running in the update thread, and the FPS is actually UPS (updates per second).

Momoko_Fan said:

I think jME (like any program not specifically made to use SLI) will only run on one GPU

What makes you say that? What exactly needs to be done to make jME support SLI better? I think you're missing something here: SLI is something the driver should take care of, not the application.


This is correct: the drivers decide which card renders what part of the screen, though it's true that some games see more improvement from SLI than others. I do not know the reasons for this, but I believe I've even seen differences between games made on the same engine. ;)

Keep us posted!

Why are you not considering an ATI card? Nvidia is great with textures, but ATI is faster when it comes to computational power, especially with shader code.

I did the test some time ago with the island and added a lot of copies, which I then gave different translations and rotations. The GeForce card outperformed the Quadro again, as I remembered it…
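The copies were made along these lines; a rough sketch from memory (the IslandCloner wrapper and the spacing values are just for illustration):

```java
import com.jme.math.FastMath;
import com.jme.math.Vector3f;
import com.jme.scene.Node;
import com.jme.scene.SharedMesh;
import com.jme.scene.TriMesh;

public final class IslandCloner {
    /** Attaches cheap copies of the island: SharedMesh reuses the original's
        vertex buffers, so only the transforms differ between copies. */
    public static void addCopies(Node rootNode, TriMesh island, int count) {
        for (int i = 0; i < count; i++) {
            SharedMesh copy = new SharedMesh("island" + i, island);
            copy.setLocalTranslation(i * 300f, 0f, 0f);        // spread them out
            copy.getLocalRotation().fromAngleAxis(
                    i * FastMath.HALF_PI, Vector3f.UNIT_Y);    // vary the rotation
            rootNode.attachChild(copy);
        }
    }
}
```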



Since then we have focused on other stuff, and just bought the Quadro cards since those worked like a charm…



This may have something to do with the number of nodes put together, and we have a lot of nodes in the scene graph…
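If per-node overhead really is the bottleneck, one thing worth trying is locking the static parts of the graph; a minimal sketch using jME 2's locking API, assuming the geometry in that branch never moves:

```java
import com.jme.scene.Node;

public final class SceneLocker {
    /** Locks a static branch: jME 2 compiles its meshes into display lists
        and skips transform and bound updates for it on subsequent frames. */
    public static void lockStatic(Node staticBranch) {
        staticBranch.lock(); // shorthand for locking bounds, transforms and meshes
    }
}
```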



We have not considered ATI cards because one of our first tests with jME failed on an ATI card in a home computer. Since then we haven't bothered to spend time figuring out what the problem was, since we only use Nvidia Quadros at work.




buestad said:

I've done a lot of testing with drivers and benchmarks. All the benchmarks (all OpenGL) have favored the GeForce card???

I've also tested terrain.TestIsland from jMonkeyEngine. The GeForce gets 320 fps and the Quadro gets 250 fps, both at 262k tris. A strange thing happens if I go into wireframe mode: the GeForce drops all the way to 84 fps, but the Quadro only drops to 216 fps.

I will try to add more islands to get up to 1.3M tris and see what happens.

Some recent hardware has very bad support for the polygon mode GL_LINE and the draw mode GL_LINES (see the OpenGL FAQ); it falls back to software rendering :( I have a Quadro FX here, and I cannot reproduce this problem ;)
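For reference, this is the raw GL state change behind wireframe mode (via the LWJGL binding that jME uses under the hood); a minimal sketch, with the PolygonMode wrapper just for illustration:

```java
import org.lwjgl.opengl.GL11;

public final class PolygonMode {
    /** Switches rasterization between filled triangles and outlines.
        On cards that emulate GL_LINE in software, this single state
        change is enough to explain a large FPS drop. */
    public static void wireframe(boolean on) {
        GL11.glPolygonMode(GL11.GL_FRONT_AND_BACK,
                           on ? GL11.GL_LINE : GL11.GL_FILL);
    }
}
```
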
gouessej said:

Some recent hardware has very bad support for the polygon mode GL_LINE and the draw mode GL_LINES (see the OpenGL FAQ); it falls back to software rendering :( I have a Quadro FX here, and I cannot reproduce this problem ;)


And yet another bit of proof for all the 13-year-old gamers who think anyone buying a Quadro is wasting their money :)