Discussion about FPS and refresh rate: would a 144Hz monitor make a difference over 120Hz with anything less than 120 FPS?

Hello all,

I see a lot of people wanting these 144Hz FreeSync monitors, which are amazing but expensive, yet I notice many of those same people don’t get very high FPS in their games.

From what I’ve read, the GPU serves up frames for the monitor to display on each refresh, so I don’t get how, at 70 FPS, a 144Hz monitor is going to do anything a 120Hz monitor can’t.

I was reading this forum thread http://www.tomshardware.com/forum/id-2234785/144hz-display-makes-sense-60fps-gaming.html and the “best answer” said that going to 144Hz would stop “motion blur” and give “less input lag”, which I believe is false.

For one, if you look at the Input Lag DB http://www.displaylag.com/display-database/ most of the monitors with the lowest input lag are 1080p 60Hz models,

and two, I don’t get how “motion blur” would be lessened by a faster-refreshing monitor… If you have 70 FPS, a 144Hz monitor just shows each frame about twice (144/70 ≈ 2.06), so how would 120Hz vs 144Hz matter? I don’t see how there would be less “motion blur” when no extra frames are being served up.
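To put numbers on what I mean (just plain arithmetic; the 70 FPS value is from my example above):

```java
// Rough arithmetic for 70 FPS on a 120Hz vs 144Hz panel.
public class RefreshMath {
    public static void main(String[] args) {
        double fps = 70.0;
        System.out.printf("frame time: %.2f ms%n", 1000.0 / fps); // ~14.29 ms
        for (double hz : new double[] {120.0, 144.0}) {
            double refreshMs = 1000.0 / hz;  // 8.33 ms @ 120Hz, 6.94 ms @ 144Hz
            double repeats   = hz / fps;     // ~1.71 @ 120Hz, ~2.06 @ 144Hz
            System.out.printf("%.0fHz: refresh every %.2f ms, frame shown ~%.2f times%n",
                    hz, refreshMs, repeats);
        }
    }
}
```

So a 70 FPS frame persists for roughly two refreshes either way; the panel redraws it, but no new information appears.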

If we had super-high FPS, I understand that a higher refresh rate is better, but what about when the FPS is lower than the refresh rate…?

Thoughts on this, team?

Thanks :smile:

Disclaimer: I don’t know what I am talking about.

About the blur: I think 144Hz monitors will generally have faster response times than monitors with slower refresh rates, simply because they’re higher-end products, so that would reduce the likelihood of blurred images.

Oh, another shot in the dark here, but I think the FreeSync technology adapts the refresh rate to your FPS… So potentially, your computer might render a frame and have it ready, but without said technology you have to wait until the next refresh, whereas FreeSync tries to ensure that the monitor refreshes exactly when the next frame is ready. So even with FPS lower than the refresh rate, it’s going to be steadier and there’s no gap between render and refresh. I guess?
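To illustrate what I mean, here’s a toy timing model (completely hypothetical numbers, and certainly not how a real driver schedules anything):

```java
// Toy model: a fixed refresh makes a finished frame wait for the next tick;
// adaptive sync (FreeSync/G-Sync style) refreshes when the frame is ready.
public class AdaptiveSyncSketch {
    public static void main(String[] args) {
        double refreshMs  = 1000.0 / 144.0; // fixed refresh interval (~6.94 ms)
        double frameReady = 20.0;           // hypothetical: frame done at t = 20 ms

        double nextTick = Math.ceil(frameReady / refreshMs) * refreshMs;
        System.out.printf("fixed:    ready %.1f ms, shown %.2f ms (waited %.2f ms)%n",
                frameReady, nextTick, nextTick - frameReady);
        System.out.printf("adaptive: ready %.1f ms, shown %.1f ms (no wait)%n",
                frameReady, frameReady);
    }
}
```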


…if the refresh rate of the monitor is higher than the graphics card’s FPS, then your gameplay experience will be free from ‘tearing’, and the whole image will probably look sharper… another thing some people may notice is input lag… a 60Hz display cannot have input lag below 16.67ms, because that’s the amount of time which passes from one refresh to the next. A 120Hz display goes down to 8.33ms, and a 240Hz display down to 4.17ms…
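Those numbers are just 1000 ms divided by the refresh rate; a quick check:

```java
// The refresh interval is the floor on display-side latency: 1000 / Hz.
public class RefreshFloor {
    public static void main(String[] args) {
        for (int hz : new int[] {60, 120, 144, 240}) {
            System.out.printf("%dHz -> %.2f ms between refreshes%n", hz, 1000.0 / hz);
        }
    }
}
```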


Thanks. Usually the sync tech is for when FPS is higher than the refresh rate, so that the refresh rate can keep up with the FPS. I would assume lower FPS wouldn’t matter; the monitor would just show the same frame across multiple refreshes, which at a low enough FPS could lead to choppy play.

Thanks Ecco, the info is much appreciated.

I’m still wondering whether people will notice a difference between the various refresh rates when the FPS can’t get high enough to reach them. I don’t think so, since the FPS is what matters in the end. At low FPS the screen just redraws the same frame multiple times, which could create choppiness if the FPS drops low enough.

…FPS is what you want to keep as smooth and as high as possible… once the display refresh rate surpasses the FPS by a factor of two or more, you will start noticing input lag and, basically, be able to feel that the game logic/rendering is indeed slower than the monitor…

Actually, in full screen, tearing can happen any time vsync is not used. The number of tears and how consistently they’re placed are the only things affected by the difference between the monitor’s refresh rate and the game’s frame rate. At 1:1 you can get a pretty consistent tear right in the middle of the screen, and it’s nasty.

When the monitor’s refresh rate is higher, the tears may only happen every few frames (every other frame in the 120Hz vs 60 FPS case).

It also largely depends on how buffering is done between the game, the OS, and the GPU. I either don’t notice tearing anymore, or it doesn’t happen in windowed mode, for example. I can never run full screen because it kills the SDK, so I don’t know if I’ve noticed tearing there or not. I usually run ‘for real’ with vsync on anyway.
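Here’s a toy model of why the ratio matters: the tear sits wherever the scanout happened to be when the swap occurred, so equal periods keep it in one place while mismatched periods make it wander. (Purely illustrative, not how any actual driver works.)

```java
// Toy model: without vsync, the tear lands at whatever fraction of the
// refresh interval had elapsed when the buffer swap happened.
public class TearPosition {
    public static void main(String[] args) {
        double refreshMs = 1000.0 / 60.0;  // 60Hz scanout
        double frameMs   = 1000.0 / 60.0;  // 60 FPS game -> the 1:1 case
        double firstSwap = 7.0;            // hypothetical first swap, mid-scanout
        for (int i = 0; i < 4; i++) {
            double t = firstSwap + i * frameMs;
            double fraction = (t % refreshMs) / refreshMs; // 0 = top, 1 = bottom
            System.out.printf("swap %d at %6.2f ms -> tear ~%.0f%% down the screen%n",
                    i, t, fraction * 100);
        }
    }
}
```

With equal periods the fraction never changes, which is the “consistent tear right in the middle of the screen” case.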


Hmm, thanks for the info. I was curious how it worked with lower FPS. I thought I read something about there being two buffers: the front buffer holds the finished frame while the next one is rendered into the back buffer. I guess maybe a frame could get displayed that wasn’t fully rendered?

I also read that “triple buffering” helps with screen tearing, as well as with some issues of V-Sync in general.
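A minimal sketch of the idea, assuming a simplified model where three plain buffers rotate between the renderer and the display (real swap chains live in the driver, but the rotation is the point):

```java
// Toy triple-buffer rotation. The display only ever latches a *complete*
// frame at a vertical blank (so nothing tears), and the renderer always
// has a free buffer to draw into (so it never stalls waiting for vsync).
public class TripleBuffer {
    private static final int PIXELS = 1920 * 1080; // illustrative frame size
    private int[] displayed = new int[PIXELS];     // being scanned out
    private int[] ready     = null;                // newest finished frame, if any
    private int[] spare     = new int[PIXELS];     // free buffer for the renderer

    // Renderer thread: hand in a finished frame, get the next buffer to use.
    public synchronized int[] submitFrame(int[] finished) {
        if (ready != null) {
            spare = ready;  // an unshown frame is simply replaced by the newer one
        }
        ready = finished;
        int[] next = spare;
        spare = null;
        return next;
    }

    // Vertical blank: latch the newest complete frame, if there is one.
    public synchronized void onVerticalBlank() {
        if (ready != null) {
            spare = displayed;  // recycle the buffer that just left the screen
            displayed = ready;
            ready = null;
        }
        // scanout of 'displayed' starts here
    }
}
```

The key point is in onVerticalBlank(): the screen never picks up a half-drawn buffer.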

I usually run in full screen with no real issues, except I do have some issues with overlapping textures/objects where the objects keep trying to render over each other, since they share the same Z coordinate(s). Even with GeometryBatch optimization it still bugs out. I would have thought that combining the geometries would stop it, but that optimization brings some other weird things into the mix :stuck_out_tongue: .

I was reading some good info on this site http://www.tweakguides.com/Graphics_1.html

I’m curious what your thoughts on it are.

Yes, triple buffering prevents tearing because only complete frames are transferred to the screen. I think this is effectively what is happening in windowed mode.

a) even that may not help.
b) it has nothing to do with refresh rates and is 100% due to your scene.

Yeah, I wasn’t sure whether it would be labeled “screen tearing” or not. I guess the proper term (which I couldn’t think of before) would be “flickering”…?

From what I see it’s called “Z-fighting,” and I found a video that explains it, so I’m going to see if I can fix it with that…
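In case it helps anyone else reading this, the common fixes seem to be a small polygon/depth offset or just separating the surfaces slightly. A sketch, assuming jMonkeyEngine 3’s Material/RenderState API (the offset values are arbitrary; check the docs for your version):

```java
import com.jme3.material.Material;
import com.jme3.scene.Geometry;

public class ZFightFix {
    // Bias one of the coplanar surfaces toward the camera in depth so the
    // depth test has a clear winner (assumes jME3's RenderState.setPolyOffset).
    public static void biasTowardCamera(Geometry decal) {
        Material mat = decal.getMaterial();
        mat.getAdditionalRenderState().setPolyOffset(-1f, -1f); // values illustrative
    }

    // Or just separate the surfaces by a tiny epsilon in world space.
    public static void nudgeApart(Geometry geom) {
        geom.move(0f, 0.001f, 0f); // epsilon is arbitrary/illustrative
    }
}
```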

Good discussion :smiley:

Yes, Z-fighting. It has to do with the Z-buffer (might as well look that up on Wikipedia or YouTube). There are only about 4 billion (2^32) possible depth values, and if two nearby geometries fall into the same depth value, it is not clear which will win the Z-fight (and you get a zigzag pattern).

The last time I saw this was in my little game from 2006. It often happens with things very far away from the camera, because most Z-buffer implementations are non-linear and the Z precision decreases at greater distances from the eye point.
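You can see the non-linearity by plugging eye-space distances into the usual perspective depth mapping (one common convention; the exact formula varies by API):

```java
// Perspective depth in [0,1]: depth(d) = (f / (f - n)) * (1 - n / d),
// for eye distance d between near plane n and far plane f. Most of the
// depth range is spent close to the camera; far geometry gets almost none.
public class DepthPrecision {
    public static void main(String[] args) {
        double n = 0.1, f = 1000.0; // typical near/far planes
        for (double d : new double[] {0.1, 1, 10, 100, 500, 1000}) {
            double depth = (f / (f - n)) * (1.0 - n / d);
            System.out.printf("distance %7.1f -> depth %.6f%n", d, depth);
        }
    }
}
```

With a 0.1/1000 near/far pair, everything beyond a distance of about 10 lands in the top 1% of the depth range, which is exactly where the zigzag shows up.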

FreeSync and the other technique (NVIDIA’s G-Sync) do exactly what @JESTERRRRRR said: they try to grab the picture when it’s ready (similar to V-Sync, but more advanced).

Both AMD and NVIDIA have something like that almost ready. One of the techniques (G-Sync) needs a $100 hardware module in the monitor, and the other (FreeSync) can run over DisplayPort … 1.2 or 2.0 (don’t remember exactly). Interesting thing…

Also, some of these 144Hz monitors can be used for stereo 3D (3D goggles). That would be a definitive pro argument if I were selecting such hardware to buy.