StandardGame FPS - strange things are happening

i am using standardgame for the third time now. good job darkfrog btw.



overall, when a game renders faster than the refresh rate vSync would cap it at, vSync limits it to 60 FPS … just as expected.



now i have a game that performs as follows:



16 depthBits, vSync = false  |  48 FPS (constant)

32 depthBits, vSync = false  |  58 FPS (constant)

16 or 32 depthBits, vSync = true   |  30 FPS (constant)



framerate is always set to -1, frequency is always set to 60; nothing else changes.
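
For reference, the setup above boils down to something like this (a minimal sketch, assuming the jME 1.x GameSettings setters — setDepthBits, setVerticalSync, setFramerate, setFrequency — that StandardGame exposes through getSettings(); the exact names may differ slightly in your version):

```java
import com.jmex.game.StandardGame;

public class FpsTest {
    public static void main(String[] args) {
        StandardGame game = new StandardGame("FPS Test");

        // the only settings varied between the runs above
        game.getSettings().setDepthBits(32);        // 16 vs. 32 in the table
        game.getSettings().setVerticalSync(false);  // toggled per run

        // constant for every run
        game.getSettings().setFramerate(-1);        // -1 = no engine-side FPS cap
        game.getSettings().setFrequency(60);        // display refresh rate in Hz

        game.start();
    }
}
```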



Shouldn't higher depth bits perform worse? And shouldn't vSync only cap the FPS when it rises above the refresh rate?



thanks in advance.

dhdd

That's odd…not sure why that's happening. :o



It's been a while since I've touched StandardGame. It's possible there's a logic bug in there.



Take a look at the code in StandardGame that handles this and see if there's another potential limiter.

What kind of monitor are you on, LCD or CRT? vSync behaves a bit differently on CRTs than on LCDs. Also, this may be an FPS limit of the GFX chip itself (with vSync enabled).



http://www.hardforum.com/showthread.php?t=928593



Actually, after reading this your FPS makes perfect sense to me…
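
The short version of the link, as a toy model (my own sketch, not from the article): with double buffering and vSync, a frame that isn't ready at a refresh has to wait for the next one, so a 60 Hz display can only show 60, 30, 20, 15 … FPS. Both 48 and 58 raw FPS therefore snap down to 30:

```java
public class VsyncModel {
    // effective rate under double-buffered vSync: the display only swaps on a
    // refresh, so each frame occupies a whole number of refresh intervals
    static double vsyncedFps(double rawFps, double refreshHz) {
        return refreshHz / Math.ceil(refreshHz / rawFps);
    }

    public static void main(String[] args) {
        System.out.println(vsyncedFps(48, 60));  // 30.0
        System.out.println(vsyncedFps(58, 60));  // 30.0
        System.out.println(vsyncedFps(61, 60));  // 60.0 (capped only above the refresh rate)
    }
}
```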

dhdd said:

Shouldn't higher depth bits perform worse?


Is your desktop running in 32-bit depth? Are your textures 32-bit?
The "lag" comes from converting the color depth, not from the number of bits itself. That holds for modern graphics cards; older ones had more trouble with higher color depths.
Maybe it's a little more complex than I'm trying to explain.

basixs said:

What kind of monitor are you on, LCD or CRT? vSync behaves a bit differently on CRTs than on LCDs. Also, this may be an FPS limit of the GFX chip itself (with vSync enabled).

http://www.hardforum.com/showthread.php?t=928593

Actually, after reading this your FPS makes perfect sense to me...


[offtopic]
Very interesting link! Never knew about the ratios… Also explains why some games only ran at a constant 30 FPS when it should've been much higher.
[/offtopic]

Yeah, I agree; I learned a lot just by reading it.  Maybe I should post a copy on the Wiki under the Resource section…

thank you guys, and especially basixs!



the link describes exactly my problem! now i've got a ‘problem’ though: i updated my graphics card driver and now i get 160 FPS  :expressionless: so i can't reproduce anything anymore  :frowning:



can i enable triple buffering with jme?



edit: when i force triple buffering at the graphics card (driver) level, i get VERY strange results …



thx,

Andy.

clovis said:

Is your desktop running in 32-bit depth? Are your textures 32-bit?
The "lag" comes from converting the color depth, not from the number of bits itself. That holds for modern graphics cards; older ones had more trouble with higher color depths.
Maybe it's a little more complex than I'm trying to explain.


you are right, all textures and my desktop are in 32 bits ... the 10 FPS difference could come from that downconversion. thanks

dhdd said:

can i enable triple buffering with jme?

I dunno, I don't think so; but maybe in 2.0?