
Re: V-Sync: Driver or Game?

Posted: 11 Apr 2015, 04:45
by Sparky
MrBonk wrote:
Sparky wrote:Driver vs game doesn't matter for v-sync, but it does matter for framerate caps. In-game framerate caps are always going to be better than a driver-based cap, because if the game engine is waiting on the driver, that's unnecessary latency being added.

If you use v-sync without a framerate cap, and your v-sync-off framerate is over your refresh rate, you're going to get a lot of input lag (multiply your refresh interval by about 5 for an estimated minimum).
I beg to differ for the most part. I've experienced several instances where in game vsync is a crapshoot entirely in terms of stability. Same with frame caps. A lot are unstable. Some aren't. But a lot are.
You're going to need to be more precise about what you mean by v-sync being unstable. Are you talking about the difference between double and triple buffering when average framerate is under the refresh rate?

Re: V-Sync: Driver or Game?

Posted: 11 Apr 2015, 06:30
by Glide
Sparky wrote:As far as v-sync is concerned, 70fps is identical to 90fps. If your pipeline is allowed to push frames faster than your monitor accepts them, your graphics pipeline gets congested and you see a huge increase in latency. As for nvidia's driver cap vs RTSS, maybe flood or stirner can test that.
Oh. I was under the impression that what happened was simply that your input was recorded at the very beginning of the frame (e.g. the first 1.67ms) so at 60Hz you would have ~15ms latency between your input and seeing it on-screen.

And if you could implement a frame cap, it would delay the input so that it's captured at the last possible moment before the frame is rendered, minimizing the difference between input/render and updating the display.

But since it adds a frame of latency, it seemed that you might be better off just using V-Sync if your framerate is close enough to your refresh rate.
Once you get to a certain point beyond your refresh rate it seems like a cap would be important, but not if you're only running at say 70 FPS uncapped.

70 FPS would be a frame time of 14.3ms, so it would only be adding 2.4ms latency, while an external cap would be adding a full frame.
Or am I completely misunderstanding the issue?
Sparky wrote:I don't have an Nvidia card to test right now, but a driver based cap cannot be lower latency than an in-game cap. There's also a tradeoff between hiding stutter(stable frame times) and minimizing latency(eliminating buffering).
So in-game V-Sync should be lower latency then?
I assumed that would be the case, but am unable to test this.

Unfortunately most games are lacking an option to cap the framerate.
Sparky wrote:You're going to need to be more precise about what you mean by v-sync being unstable. Are you talking about the difference between double and triple buffering when average framerate is under the refresh rate?
I assume he is referring to frame times. Frame times are generally more consistent using the driver v-sync compared to in-game v-sync in many games.
Of course some game engines do a perfectly good job presenting consistent frame times with their in-game V-Sync option.

Re: V-Sync: Driver or Game?

Posted: 11 Apr 2015, 07:43
by Sparky
Glide wrote:
Sparky wrote:As far as v-sync is concerned, 70fps is identical to 90fps. If your pipeline is allowed to push frames faster than your monitor accepts them, your graphics pipeline gets congested and you see a huge increase in latency. As for nvidia's driver cap vs RTSS, maybe flood or stirner can test that.
Oh. I was under the impression that what happened was simply that your input was recorded at the very beginning of the frame (e.g. the first 1.67ms) so at 60Hz you would have ~15ms latency between your input and seeing it on-screen.
Not quite. If the explanation below isn't detailed enough, try this: http://forums.blurbusters.com/viewtopic ... 5284
[quote]

And if you could implement a frame cap, it would delay the input so that it's captured at the last possible moment before the frame is rendered, minimizing the difference between input/render and updating the display.

But since it adds a frame of latency, it seemed that you might be better off just using V-Sync if your framerate is close enough to your refresh rate.
Once you get to a certain point beyond your refresh rate it seems like a cap would be important, but not if you're only running at say 70 FPS uncapped.
[/quote]
If you're using v-sync, the point beyond which it's important to cap framerate is exactly your refresh rate.
70 FPS would be a frame time of 14.3ms, so it would only be adding 2.4ms latency, while an external cap would be adding a full frame.
Or am I completely misunderstanding the issue?
You're misunderstanding the issue. When the computer renders a frame, it does so in several steps that run in sequence on different bits of hardware (think of a factory assembly line, where each station does something different). Your framerate, at any given time, is limited by exactly one of those steps, and the pipeline can be working on several frames at once. If the bottleneck is at the start of the pipeline, you get your graphics latency just by adding up the actual calculation time of each individual step. If the bottleneck is at the end of the pipeline, every step before it has to spend exactly the same amount of time per frame as the slowest stage, because it can't start working on the next frame until it passes the current one on to the next stage.

fps_max limits framerate at the first stage of that pipeline, RivaTuner limits it at the second stage, RadeonPro DFC limits it at the third stage, and v-sync limits it at the last stage. Triple buffering adds one extra stage with zero calculation time, so that the stage before it can keep working instead of waiting for the next refresh. (Flip queue size/render ahead adds extra buffers between the CPU and the GPU; this smooths out games that are usually GPU limited but have game-engine hitches, but it adds latency unless you're CPU limited.)
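The assembly-line picture can be sketched in a few lines of Python. This is a toy simulation with made-up stage times (not measurements): each stage holds one frame and can only hand it off when the next stage is free, so the same three stages give very different latency depending on whether the slow stage sits at the front (like an in-game cap) or at the back (like v-sync at the display):

```python
def simulate(stage_times, n_frames=400):
    """Average input-to-display latency (ms) for a serial frame pipeline.

    Each stage holds one frame and blocks until the next stage is free
    (backpressure), so a bottleneck at the end backs the whole line up.
    """
    n = len(stage_times)
    prev = [0.0] * (n + 1)          # prev[i]: when the previous frame entered stage i
    latencies = []
    for _ in range(n_frames):
        entry = [0.0] * (n + 1)     # entry[n] is the display (flip) time
        entry[0] = prev[1]          # stage 0 samples input once it handed off
        for i in range(n):
            next_free = prev[i + 2] if i + 2 <= n else 0.0
            entry[i + 1] = max(entry[i] + stage_times[i], next_free)
        latencies.append(entry[n] - entry[0])
        prev = entry
    return sum(latencies[-100:]) / 100   # steady-state average

# Slow stage first (like an in-game cap): latency is the sum of the stages.
print(round(simulate([16.7, 5.0, 2.0]), 1))   # 23.7
# Slow stage last (like v-sync at the display): same framerate, ~2x the latency.
print(round(simulate([2.0, 5.0, 16.7]), 1))   # 50.1
```

Both pipelines output frames every 16.7 ms; only the position of the bottleneck changes, which is the whole point of capping at the first stage.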
Sparky wrote:I don't have an Nvidia card to test right now, but a driver based cap cannot be lower latency than an in-game cap. There's also a tradeoff between hiding stutter(stable frame times) and minimizing latency(eliminating buffering).
So in-game V-Sync should be lower latency then?
An in-game framerate cap would be lower latency; in-game v-sync would not.
I assumed that would be the case, but am unable to test this.

Unfortunately most games are lacking an option to cap the framerate.
Sparky wrote:You're going to need to be more precise about what you mean by v-sync being unstable. Are you talking about the difference between double and triple buffering when average framerate is under the refresh rate?
I assume he is referring to frame times. Frame times are generally more consistent using the driver v-sync compared to in-game v-sync in many games.
Of course some game engines do a perfectly good job presenting consistent frame times with their in-game V-Sync option.
Consistent frame times are very easy if you have 50ms+ of buffering, but it's hell for input lag. If you look at the graph I posted on the last page, fps_max 85 and everything to the right of it are displaying basically the same framerate/frametimes. The capped ones do occasionally miss a v-sync deadline, which should show up on a frametime graph.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 02:26
by MrBonk
Sparky wrote:
MrBonk wrote:
Sparky wrote:Driver vs game doesn't matter for v-sync, but it does matter for framerate caps. In-game framerate caps are always going to be better than a driver-based cap, because if the game engine is waiting on the driver, that's unnecessary latency being added.

If you use v-sync without a framerate cap, and your v-sync-off framerate is over your refresh rate, you're going to get a lot of input lag (multiply your refresh interval by about 5 for an estimated minimum).
I beg to differ for the most part. I've experienced several instances where in game vsync is a crapshoot entirely in terms of stability. Same with frame caps. A lot are unstable. Some aren't. But a lot are.
You're going to need to be more precise about what you mean by v-sync being unstable. Are you talking about the difference between double and triple buffering when average framerate is under the refresh rate?
Unstable as in frame times are extremely variable resulting in unstable motion, microstuttering, full on stuttering, and generally very poor frame pacing. You might be running 60FPS or 120FPS, but it's not going to look or feel smooth because of the stability problems.

Here's a recent example: "Gurumin", a 2005 game from Nihon Falcom that was recently released on Steam. The 3D rendering thread is locked to 30FPS, the UI thread is unlocked, and Vsync is enabled by default, showing a reading of 60FPS. (If you disable the in-game vsync, forcing Vsync doesn't work and the UI thread can go into the hundreds; but vsync can be overridden when it is enabled.) It's extremely unstable: there are a ton of duplicate frames being spit out at the wrong intervals, and extremely variable frametimes. The framerate is still reported as 59/60, but motion is unstable because of the above.

You can fix it by forcing Vsync via the driver over the in-game vsync, plus pre-rendered frames 1 to reduce latency for the 30FPS thread, and capping the frame rate to 30. Or you can use 1/2 refresh + a 30FPS cap and pre-render 1. Both result in a much more visually stable experience than without. (And when using a 60Hz refresh + a 30FPS cap rather than 1/2 refresh + a 30FPS cap, there isn't stuttering because of it; in a lot of other games that will produce some microstuttering and judder unless you use 1/2 refresh, e.g. Mafia II.)


The caveat to this is that all FMVs (OP/ED) render at half speed now. But luckily you can opt to skip them.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 03:15
by Sparky
MrBonk wrote:
Sparky wrote:
MrBonk wrote:
Sparky wrote:Driver vs game doesn't matter for v-sync, but it does matter for framerate caps. In-game framerate caps are always going to be better than a driver-based cap, because if the game engine is waiting on the driver, that's unnecessary latency being added.

If you use v-sync without a framerate cap, and your v-sync-off framerate is over your refresh rate, you're going to get a lot of input lag (multiply your refresh interval by about 5 for an estimated minimum).
I beg to differ for the most part. I've experienced several instances where in game vsync is a crapshoot entirely in terms of stability. Same with frame caps. A lot are unstable. Some aren't. But a lot are.
You're going to need to be more precise about what you mean by v-sync being unstable. Are you talking about the difference between double and triple buffering when average framerate is under the refresh rate?
Unstable as in frame times are extremely variable resulting in unstable motion, microstuttering, full on stuttering, and generally very poor frame pacing. You might be running 60FPS or 120FPS, but it's not going to look or feel smooth because of the stability problems.

Here's a recent example: "Gurumin", a 2005 game from Nihon Falcom that was recently released on Steam. The 3D rendering thread is locked to 30FPS, the UI thread is unlocked, and Vsync is enabled by default, showing a reading of 60FPS. (If you disable the in-game vsync, forcing Vsync doesn't work and the UI thread can go into the hundreds; but vsync can be overridden when it is enabled.) It's extremely unstable: there are a ton of duplicate frames being spit out at the wrong intervals, and extremely variable frametimes. The framerate is still reported as 59/60, but motion is unstable because of the above.

You can fix it by forcing Vsync via the driver over the in-game vsync, plus pre-rendered frames 1 to reduce latency for the 30FPS thread, and capping the frame rate to 30. Or you can use 1/2 refresh + a 30FPS cap and pre-render 1. Both result in a much more visually stable experience than without. (And when using a 60Hz refresh + a 30FPS cap rather than 1/2 refresh + a 30FPS cap, there isn't stuttering because of it; in a lot of other games that will produce some microstuttering and judder unless you use 1/2 refresh, e.g. Mafia II.)


The caveat to this is that all FMVs (OP/ED) render at half speed now. But luckily you can opt to skip them.
You're confusing v-sync with other options the driver can force. Pre-rendered frames doesn't reduce latency, it ADDS latency. It does sound like the game engine is missing v-sync deadlines, though, which might make the extra buffering from render ahead better than doing nothing; otherwise you're mixing a lot more 50ms and 17ms frames in with your 33ms frames, in terms of time displayed on screen. The good news is that you aren't bottlenecked by the display, so you don't get the 5+ frames of v-sync latency you'd get from an uncapped game engine trying to run above the refresh rate. Basically, forcing v-sync on with the driver changes nothing if it's still double-buffered v-sync. Forcing render ahead DOES change something, but that's not a v-sync change; it's a change in the pipeline before the GPU, which can be impacted by backpressure if you're using double-buffered v-sync.
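That mix of 17/33/50 ms frames falls out of a simple model. The jitter figure below is hypothetical, not a measurement of Gurumin: a ~30FPS engine on a 60Hz display with double-buffered v-sync, where any frame that misses its target refresh slips to the next one:

```python
import math
import random
from collections import Counter

def display_intervals(jitter_ms, n=2000, refresh=1000 / 60, frame=1000 / 30):
    """On-screen frame durations (ms) for a ~30 FPS engine on a 60 Hz
    display with v-sync: each frame is shown at the first refresh
    boundary after it finishes rendering."""
    random.seed(1)                       # deterministic for the example
    t, shown, out = 0.0, 0.0, []
    for _ in range(n):
        t += frame + random.uniform(-jitter_ms, jitter_ms)
        # next refresh at or after completion (epsilon guards against
        # float round-off landing exactly on a boundary)
        flip = math.ceil(t / refresh - 1e-9) * refresh
        out.append(round(flip - shown))
        shown = flip
    return out

print(Counter(display_intervals(0)))   # perfectly paced: every frame ~33 ms
print(Counter(display_intervals(5)))   # 5 ms render jitter: 17/33/50 ms mixed in
```

With zero jitter every frame holds for exactly two refreshes; add a few milliseconds of jitter and the cadence breaks into a mix of one-, two-, and three-refresh holds even though the average framerate still reads ~30.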

You would probably get similar results by just forcing triple buffering instead of everything else you did, and it wouldn't come with the FMV problems you had. And no, I don't consider triple buffering to be "driver" v-sync, because that option can be set by the game (though sometimes the game developer doesn't expose that option to the player).

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 11:12
by Glide
Sparky wrote:Consistent frame times are very easy if you have 50ms+ of buffering, but it's hell for input lag. If you look at the graph I posted on the last page, fps_max 85 and everything to the right of it are displaying basically the same framerate/frametimes. The capped ones do occasionally miss a v-sync deadline, which should show up on a frametime graph.
Check the results posted at the beginning of this topic. I only get good frame times when I limit to 1 max pre-rendered frame. If I allow the game to set it, or set it to 8, I get awful frame pacing which results in a lot of stuttering.
Sparky wrote:You're confusing v-sync with other options the driver can force. Pre-rendered frames doesn't reduce latency, it ADDS latency.
No, it can definitely reduce latency:
[image: latency vs. max pre-rendered frames, with V-Sync off (upper graph) and on (lower graph)]

And in some cases, increasing the maximum number of pre-rendered frames adds a whole frame of latency each time.
Sparky wrote:You would probably get similar results by just forcing triple buffering
Is there a way of doing this other than D3D Overrider? I thought that usually just added latency because it wasn't implementing proper triple buffering.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 14:45
by Glide
I did some more testing, though I don't know how relevant the results will be to modern games.
I loaded up Max Payne 2, which uses DirectX 8, because it easily runs at hundreds of FPS and if you set it to a maximum of 8 pre-rendered frames it's the laggiest thing in the world.

If I enable an RTSS frame cap of 59/60 FPS, most of that lag is eliminated. 58 FPS in RTSS means that it misses sync and drops to 30 FPS.
If I use the NVIDIA frame cap options of 58/59/60 FPS it greatly reduces the lag - though perhaps not quite as much as RTSS.
So far, that's mostly what I expected.

But the problem is stuttering. Every few seconds it would miss a frame and because I left standard V-Sync enabled, that meant it would drop to 30 FPS.

At 61 FPS in RTSS, it must be right on the threshold of fixing the latency problems because it fluctuates between low latency and noticeably laggy. And the cap has to be raised to 62 FPS to eliminate the stuttering, which means that it's always laggy - though it's still a lot better than uncapped.


However, if I reduce the maximum number of pre-rendered frames to 1 and remove the cap, latency is already about the same as 8 pre-rendered frames with a 59 FPS cap.
If I add a 62 FPS cap in RTSS, or set the driver cap to 65 FPS (it jumps from 60 to 65, though I'm not sure whether you can set a custom limit) it now seems to be better than 8 pre-rendered frames with the 59 FPS cap for latency and it never misses a frame.

If I drop the cap to 59 FPS latency does seem to get lower still, but at that point it's back to skipping a frame every few seconds, and the difference in latency is minimal.

When I'm playing single-player games for enjoyment rather than competitive gaming, low latency is important, but keeping things absolutely smooth matters even more.


So it does seem that a frame cap can help improve latency, but in all of the games I have tested so far, the main thing with V-Sync on is to reduce the maximum number of pre-rendered frames to 1. And this usually helps improve frame pacing as well.

Of course this is all subjective - I'd love to see some hard data on this if you're able to test it.
But even so, running the game at 8 pre-rendered frames really magnifies any differences that there may be because it has so much latency by default.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 15:47
by Sparky
Glide wrote:
Sparky wrote:Consistent frame times are very easy if you have 50ms+ of buffering, but it's hell for input lag. If you look at the graph I posted on the last page, fps_max 85 and everything to the right of it are displaying basically the same framerate/frametimes. The capped ones do occasionally miss a v-sync deadline, which should show up on a frametime graph.
Check the results posted at the beginning of this topic. I only get good frame times when I limit to 1 max pre-rendered frame. If I allow the game to set it, or set it to 8, I get awful frame pacing which results in a lot of stuttering.
Sparky wrote:You're confusing v-sync with other options the driver can force. Pre-rendered frames doesn't reduce latency, it ADDS latency.
No, it can definitely reduce latency:
Dunno why you say that it reduces latency; the difference between 2, 3, and 4 is within the error bars for the first graph, and they're all over the latency of 1.

SEM (for v-sync off, fps_max 60):
1: 219.7063625
2: 276.3161602
3: 205.3481035
4: 273.8068435

If you want to know WHY there's no significant difference, it's because fps_max 60 forces you into a cpu limited scenario, and "max prerendered frames" sets the maximum size of a buffer that comes after the CPU. As soon as the CPU drops something in that buffer, the GPU instantly grabs it and starts working on it. If you had 2 frames sitting in that buffer at 60fps, your input latency would be over 33ms.
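The "over 33ms" figure is just queue depth times frame time; for illustration (queue depths are hypothetical):

```python
frame_time_ms = 1000 / 60   # 16.7 ms per frame at a 60 fps cap

for queued in (0, 1, 2, 3):
    # each frame already sitting in the buffer delays new input by one frame time
    print(queued, "queued frames add at least", round(queued * frame_time_ms, 1), "ms")
```

Two frames sitting in the buffer at 60fps would add at least 33.3 ms before the newest input even reaches the GPU, which is why a CPU-limited cap keeping that buffer empty makes the prerender setting irrelevant.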
And in some cases, increasing the maximum number of pre-rendered frames adds a whole frame of latency each time.
That's the expected behavior in a GPU or v-sync bound situation, and it's why I said more prerendered frames add latency. It looks like settings above 2 are ignored, otherwise latency would keep going up with more prerendered frames.

Sparky wrote:You would probably get similar results by just forcing triple buffering
Is there a way of doing this other than D3D Overrider? I thought that usually just added latency because it wasn't implementing proper triple buffering.
The implementation of triple buffering should be the same regardless of where you turn it on. If your framerate is limited by v-sync, it does add input latency; otherwise, not so much: https://docs.google.com/spreadsheets/d/ ... nteractive

Dunno what GPU you have, but radeonpro also lets you force triple buffering. In the above graph I just turned it on in game.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 16:47
by Sparky
Glide wrote:I did some more testing, though I don't know how relevant the results will be to modern games.
I loaded up Max Payne 2, which uses DirectX 8, because it easily runs at hundreds of FPS and if you set it to a maximum of 8 pre-rendered frames it's the laggiest thing in the world.

If I enable an RTSS frame cap of 59/60 FPS, most of that lag is eliminated. 58 FPS in RTSS means that it misses sync and drops to 30 FPS.
If I use the NVIDIA frame cap options of 58/59/60 FPS it greatly reduces the lag - though perhaps not quite as much as RTSS.
So far, that's mostly what I expected.

But the problem is stuttering. Every few seconds it would miss a frame and because I left standard V-Sync enabled, that meant it would drop to 30 FPS.

At 61 FPS in RTSS, it must be right on the threshold of fixing the latency problems because it fluctuates between low latency and noticeably laggy. And the cap has to be raised to 62 FPS to eliminate the stuttering, which means that it's always laggy - though it's still a lot better than uncapped.


However, if I reduce the maximum number of pre-rendered frames to 1 and remove the cap, latency is already about the same as 8 pre-rendered frames with a 59 FPS cap.


If I add a 62 FPS cap in RTSS, or set the driver cap to 65 FPS (it jumps from 60 to 65, though I'm not sure whether you can set a custom limit) it now seems to be better than 8 pre-rendered frames with the 59 FPS cap for latency and it never misses a frame.
If it's not missing frames, the cap isn't doing anything. That would be identical to the latency for whatever v-sync PRF combo you were using uncapped. There is a caveat here in that a framerate cap very close to your refresh rate can take longer to creep into a high or low latency region, but a cap very slightly higher than your refresh rate will end up in the same place as no cap at all.
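Both halves of that caveat show up in a toy model (render time and queue depth are hypothetical, double-buffered v-sync at 60Hz assumed): a cap just under the refresh rate keeps the pipeline drained, while a cap just over it lets the prerender queue fill until latency settles at several frames, the same place as no cap at all.

```python
import math

def steady_latency(cap_fps, queue=3, refresh=1000 / 60, render=1.0, n=400):
    """Steady-state sampling-to-display latency (ms) under a framerate cap,
    with v-sync and a prerender queue of `queue` frames. The engine blocks
    on whichever is later: the cap, or a queue slot freeing at a flip."""
    flips = [0.0]                    # flip time of each displayed frame
    s, lat = 0.0, []
    for k in range(1, n + 1):
        gate = flips[k - queue] if k - queue >= 0 else 0.0
        s = max(s + 1000 / cap_fps, gate)          # input sampled here
        done = s + render
        # first refresh boundary after `done` that is also after the last flip
        idx = max(math.floor(done / refresh) + 1, round(flips[-1] / refresh) + 1)
        flips.append(idx * refresh)
        lat.append(flips[-1] - s)
    return sum(lat[-50:]) / 50

print(round(steady_latency(58)))   # cap under the refresh rate: latency stays low
print(round(steady_latency(65)))   # cap just over it: queue fills, ~3 frames of lag
```

The 65fps cap ends up at roughly queue-depth refresh intervals of latency, exactly as if there were no cap; only a cap at or below the refresh rate keeps the buffers empty.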

If I drop the cap to 59 FPS latency does seem to get lower still, but at that point it's back to skipping a frame every few seconds, and the difference in latency is minimal.

When I'm playing single-player games for enjoyment rather than competitive gaming, low latency is important, but keeping things absolutely smooth matters even more.


So it does seem that a frame cap can help improve latency, but in all of the games I have tested so far, the main thing with V-Sync on is to reduce the maximum number of pre-rendered frames to 1. And this usually helps improve frame pacing as well.

Of course this is all subjective - I'd love to see some hard data on this if you're able to test it.
But even so, running the game at 8 pre-rendered frames really magnifies any differences that there may be because it has so much latency by default.
Prerendered frames shouldn't really do anything to frame pacing, except provide some buffer if the CPU portion of the graphics pipeline freezes for a bit.

I did just test flip queue 0, 1, 2, 3, and 5 in CS:GO with double-buffered v-sync and no cap (flip queue via RadeonPro; v-sync and double buffering in game). 1 reduced lag by 1 frame; 0, 2, 3, and 5 did nothing. I added the data from 1 to the graph.

Re: V-Sync: Driver or Game?

Posted: 12 Apr 2015, 17:23
by Glide
I'm not sure that I understand what point you are trying to make here.
All of the discussion here is about a V-Sync On scenario, where the goal is to get the smoothest gameplay possible.
Keeping things low latency at the same time is nice, but not the primary concern.

And you seem to be contradicting yourself in the same post.
Sparky wrote:Dunno why you say that it reduces latency, the difference between 2 3 and 4 is within the error bars for the first graph, and they're all over the latency of 1.
The upper graph shows that the max pre-rendered frames makes no difference to latency when V-Sync is disabled.
However, the lower graph shows that setting the max pre-rendered frames to 1 (the default is 3) removes a frame of latency when V-Sync is enabled - which is why I said that it reduces latency.
And in my own testing, reducing this to 1 also improves frame pacing in most games.

In CS:GO, latency seems to be the same for all values >1.
In other games, each pre-rendered frame may add an additional frame of latency.

In the same post, and the subsequent post, you do seem to agree that setting it to 1 reduces latency??
Sparky wrote:If it's not missing frames, the cap isn't doing anything. That would be identical to the latency for whatever v-sync PRF combo you were using uncapped. There is a caveat here in that a framerate cap very close to your refresh rate can take longer to creep into a high or low latency region, but a cap very slightly higher than your refresh rate will end up in the same place as no cap at all.
I guess I just don't understand the mechanics of this.
I thought the point of the cap was to push the render time to the last possible moment before it is sent to the display, to reduce latency.

Using Max Payne 2 as the example again, it's running at over 400 FPS (with 8xSGSSAA enabled)
So without a cap, the frame is rendered in the first 2.5ms and there's 14.2ms latency while it waits for the next refresh.
I thought the point of the cap was to push the render time towards the end of that 16.67ms period, to minimize the delay between your input and the screen updating.
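For what it's worth, the arithmetic in that picture checks out at 60Hz (render times derived from the framerates mentioned, everything else idealized):

```python
refresh_ms = 1000 / 60                  # 16.67 ms between 60 Hz refreshes

def idle_before_flip(fps):
    """Time spent waiting for the next refresh if the frame renders
    right at the start of the interval."""
    return refresh_ms - 1000 / fps      # refresh interval minus render time

print(round(idle_before_flip(400), 1))  # ~14.2 ms waiting at 400 fps
print(round(idle_before_flip(70), 1))   # ~2.4 ms at the 70 fps example earlier
```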

If you have access to that game, I suggest you try it rather than making assumptions.
Implementing a cap, whether that's 58 FPS or 65 FPS (for 60Hz V-Sync) makes a huge difference to latency when it's set to 8 pre-rendered frames.
65 FPS does not reduce latency as much as 58 FPS (I guess there is an additional frame) but it is still a significant reduction from running uncapped.

I'm not sure how you are supposed to get smooth gameplay if you are capping the framerate below the refresh rate.
That just results in tearing (adaptive v-sync), the framerate dropping to 30 FPS (regular v-sync), or stuttering (triple buffering).
All of which go against the intended goal here of having the smoothest gameplay possible.

And if that's the case, why not just run with V-Sync switched off at the highest framerate you can?
Sparky wrote:prerendered frames shouldn't really do anything to frame pacing, except provide some buffer if the cpu portion of the graphics pipeline freezes for a bit.
Well it makes a huge difference in a number of games/engines. That is why I started this topic, if you read the first post.

Here are the results when the game is allowed to set the max number of pre-rendered frames, and I used an RTSS cap since people said that may improve frame pacing (it did nothing).
And these are the results when the max number of pre-rendered frames is set to 1.
Sparky wrote:I did just test flip queue 0, 1, 2, 3, and 5 in CS:GO with double-buffered v-sync and no cap (via RadeonPro). 1 reduced lag by 1 frame; 0, 2, 3, and 5 did nothing. I added the data from 1 to the graph.
Zero allows the application to specify what value to use. The default is 3 if it does not specify anything. That's why NVIDIA replaced "0" with "Use the 3D application setting".
Sparky wrote:The implementation of triple buffering should be the same regardless of where you turn it on. If your framerate is limited by v-sync, it does add input latency, othewise, not so much: https://docs.google.com/spreadsheets/d/ ... nteractive
I'm sorry, but the labels on your graph are quite cryptic.
If I am reading this correctly, in-game triple buffering is a frame of latency lower (I assume; the scale is useless) than forcing it externally via RivaTuner? You also mention that RadeonPro has an additional frame of latency over RivaTuner.

This suggests that the implementation differs depending on where it is set.
Sparky wrote:Dunno what GPU you have, but radeonpro also lets you force triple buffering. In the above graph I just turned it on in game.
I am using a GTX570 just now. NVIDIA only have the option to force triple-buffering for OpenGL.
I was under the impression that D3D Overrider's "triple buffering" increases the flip queue size, which is not the same thing as true triple-buffering - hence the additional frame of latency.