
Re: Cap your fps people (battlenonsense)

Posted: 28 Sep 2019, 12:34
by GFresha
RealNC wrote:
GFresha wrote:RTSS 141 FPS cap + G-Sync on + V-SYNC on = how much input lag in ms?
The absolute lag depends on the game of course. If you measure different games with these settings, you'll get different results. In some you get 20ms, in some 30ms, or something else.

Note that G-Sync doesn't play much of a role here. The input lag of a 141FPS cap with g-sync ON and vsync OFF or ON is pretty much the same as g-sync OFF and vsync OFF. The difference is only about 1ms, which really doesn't matter.

Also, RTSS doesn't add 1ms (but I think you meant to say 1 frame, not 1 ms.) It is neutral when it comes to input lag. RTSS will give you the natural input lag of the target frame rate in double-buffer output mode. That turns out to be the 1 frame in the front buffer plus a large part of the 1 frame in the back buffer (how large depends on when the game reads player input after RTSS returns control to the game.) What that means is that RTSS neither adds nor reduces input lag. It's neutral. (Of course getting rid of the GPU bottleneck shows a huge input lag reduction, as demonstrated in the video. RTSS will of course have that effect too. But what I'm saying here is that once the GPU bottleneck disappears due to the frame cap, RTSS gives you the natural input lag of your target FPS. Nothing less, nothing more.)
Also, what about if the RivaTuner and in-game caps are set to the same value? Which one takes effect?
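
To put rough numbers on the "natural input lag of the target frame rate" explanation quoted above, here is a small back-of-the-envelope sketch (C++). The 1 to roughly 2 frame range comes from the front-buffer plus back-buffer reasoning above; the figures are illustrative only, not measurements of any particular game:

#include <cstdio>

// Illustration only: the "natural" latency band of a frame cap, per the
// 1 frame (front buffer) + up to ~1 frame (back buffer) reasoning above.
// Display and input-device latency are not included.
int main()
{
    const double cap_fps   = 141.0;
    const double frametime = 1000.0 / cap_fps;                 // ~7.09 ms per frame
    std::printf("Frametime at %.0f FPS: %.2f ms\n", cap_fps, frametime);
    std::printf("Natural latency band: %.2f - %.2f ms\n",
                frametime, 2.0 * frametime);                   // 1 to ~2 frames
    return 0;
}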

Re: Cap your fps people (battlenonsense)

Posted: 28 Sep 2019, 13:51
by andrelip
You should measure input lag before the cable, i.e., from the end of input capturing to the present() command. The best-case scenario for syncing and low input lag would be when the game stops capturing input, renders on the CPU and GPU without any wait, and then presents at the exact moment the blanking interval ends.

I don't think this is the case with FreeSync. If you watch it in GPUView, you will notice that the present is not in sync with the vblank interrupt even when vsync is enabled. It oscillates a lot, and by a large margin. It still feels smooth because the blanking interval is large and dynamically adjusted, but it does not suddenly end the blanking interval and start the "beam" when the front porch changes. I tested this with G-SYNC on a FreeSync monitor, so it could be different with native G-SYNC or with FreeSync on AMD.

Meanwhile, Scanline Sync is perfectly tied to the blanking interval and does not add a frame of latency, as most people think it does. The problem with S-Sync is that it renders the frame as soon as the last one is presented and then sleeps until it is close to the desired present point. This is a problem because input capturing (from both the user and the network) stopped at the beginning of the refresh interval, so you get display lag. At 240Hz this is less than 4ms and very consistent, so it's well worth it, but at lower refresh rates it degrades performance. When you limit your FPS with in-game settings, you have (CPU render + D3D + GPU render + present), so the frame in the front buffer has the minimum latency possible, but it is out of sync with the interrupt, does not absorb frametime oscillations, and can lead to an inconsistent feeling.

The best of both worlds would be an algorithm like S-Sync that knows exactly when it is safe to start rendering a scene, so it can sleep for a safe amount of time before rendering the whole package and then only needs a very small sleep at the end to sustain a perfect frametime.
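
For illustration, here is a minimal sketch of the kind of hybrid limiter described above. This is only an idea under the stated assumptions, not an existing tool, and sample_input()/render_frame()/present_frame() are hypothetical stubs standing in for the real game and driver work:

#include <chrono>
#include <thread>

using steady = std::chrono::steady_clock;

// Hypothetical stubs standing in for the real game/driver work.
void sample_input()  { /* read mouse/keyboard/network here */ }
void render_frame()  { /* CPU submit + D3D + GPU render here */ }
void present_frame() { /* Present(), ideally landing at the target scanline */ }

// Sketch of the hybrid approach: instead of rendering immediately after the
// previous present and then sleeping (as S-Sync does), sleep FIRST, sample
// input as late as possible, render, then present just before the target
// point in the refresh cycle.
void frame(steady::time_point next_target,
           steady::duration   estimated_render_time,
           steady::duration   safety_margin)
{
    // Wake up just early enough to finish rendering before the target.
    std::this_thread::sleep_until(next_target - estimated_render_time - safety_margin);
    sample_input();   // input is now only a few milliseconds old at present time
    render_frame();
    while (steady::now() < next_target) { /* short busy-wait for precision */ }
    present_frame();
}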

Re: Cap your fps people (battlenonsense)

Posted: 28 Sep 2019, 15:44
by RealNC
GFresha wrote:Also, what about if the RivaTuner and in-game caps are set to the same value? Which one takes effect?
Both. If the FPS reaches one of the caps, it will be capped by that limiter. The capping of internal limiters vs. that of RTSS happens in very different ways, so there is no conflict. Whichever limiter is set to the lowest cap is the one the game will be limited to.
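
A trivial way to see the "lowest cap wins" point (a sketch with made-up example values, not any particular game's numbers): each limiter independently enforces its own minimum frametime, so the effective frametime is simply the larger of the two.

#include <algorithm>
#include <cstdio>

int main()
{
    const double ingame_cap = 141.0;                           // FPS (example value)
    const double rtss_cap   = 141.0;                           // FPS (example value)
    // Each cap enforces a minimum frametime; the larger frametime (lower FPS) wins.
    const double effective_frametime =
        std::max(1000.0 / ingame_cap, 1000.0 / rtss_cap);      // ms
    std::printf("Effective cap: %.1f FPS (%.2f ms per frame)\n",
                1000.0 / effective_frametime, effective_frametime);
    return 0;
}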

Re: Cap your fps people (battlenonsense)

Posted: 28 Sep 2019, 19:27
by GFresha
jorimt wrote:^ As far as I'm aware, that new setting (both "on" and "ultra" variants) is typically only useful for reducing input lag in GPU-limited/maxed usage situations.

Hey jorimt, from what you know so far about ultra low latency, do you know if it causes input lag if it's turned on in games where GPU usage is not maxed?

For example, Fortnite gives me 40% CPU usage and 40% GPU usage; if I turn on ultra low latency, will it actually do anything, or will it only do harm?

If you can do some sort of input lag test with ultra low latency in a non-GPU-limited scenario, like Fortnite for example, that would be great. I know NVIDIA said we get a 33% input lag improvement when GPU bound, but what about games that are not GPU or CPU bound vs. CPU bound?

I know this is dumb and probably won't work, but I want to enable ultra low latency just in case it can lower my input lag further. However, if my game is not at 99% GPU usage, I'm afraid it might increase the input lag.

Re: Cap your fps people (battlenonsense)

Posted: 28 Sep 2019, 21:35
by jorimt
GFresha wrote:Hey jorimt, from what you know so far about ultra low latency, do you know if it causes input lag if it's turned on in games where GPU usage is not maxed?
When using an FPS limit + "Ultra" Low Latency Mode in non-GPU-bound scenarios vs. an FPS limit + Low Latency Mode "Off" (assuming the FPS limit is the limiting factor for the system FPS at all times), Battle(non)sense's test numbers in the OP video suggest as much (starts at the 6:16 mark):
https://youtu.be/7CKnJ5ujL_Q?t=376

NVIDIA also states (original article, https://www.nvidia.com/en-us/geforce/ne ... dy-driver/):
With the release of our Gamescom Game Ready Driver, we’re introducing a new Ultra-Low Latency Mode that enables ‘just in time’ frame scheduling, submitting frames to be rendered just before the GPU needs them. This further reduces latency by up to 33%
It's possible this "just in time" function could have negative effects in non-GPU-bound situations, and it's probably safer to have Low Latency Mode set to "On" or "Off" in these scenarios instead.

Also, FYI, NVIDIA states:
Our new Low Latency Mode is being released in beta with support for all GPUs in DX9 and DX11 games (in DX12 and Vulkan titles, the game decides when to queue the frame).
Finally, as with the previous standalone MPRF setting, its effectiveness can vary per game, system, and individual configuration (G-SYNC/no-sync, FPS limit/uncapped, etc.). So, as I stated in G-SYNC 101 regarding MPRF, it really still "depends."

For your originally posed scenario (using V-SYNC OFF in Fortnite + an FPS limit above the refresh rate), so long as your system can sustain framerates above your FPS limit 99% of the time, and the FPS limit is preventing your GPU from maxing out, it looks like Low Latency Mode should be set to "Off" (and at most "On" = MPRF "1"), not "Ultra," for the lowest possible input lag.
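
As a side note on the quoted DX12/Vulkan caveat, here is a rough sketch of how a flip-model DXGI application can cap its own frame queue. It assumes a swap chain created with DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT, and is only meant to illustrate why the driver setting defers to the game in those APIs, not how any particular title actually does it:

#include <dxgi1_3.h>
#include <windows.h>
#include <wrl/client.h>

// Sketch only: in DX12/Vulkan-era engines the application decides how deep the
// frame queue is, which is why the driver's Low Latency Mode leaves queueing
// to the game there.
HANDLE CapFrameQueue(const Microsoft::WRL::ComPtr<IDXGISwapChain2>& swapChain)
{
    swapChain->SetMaximumFrameLatency(1);               // allow at most 1 queued frame
    return swapChain->GetFrameLatencyWaitableObject();  // signaled when a new frame may start
}

// Per frame, before sampling input and recording/rendering commands:
//   WaitForSingleObjectEx(waitable, 1000, TRUE);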

Re: Cap your fps people (battlenonsense)

Posted: 30 Sep 2019, 08:46
by Zennon
Hardware Unboxed findings on this.
https://www.youtube.com/watch?v=VtSfjBfp1LA

Re: Cap your fps people (battlenonsense)

Posted: 30 Sep 2019, 09:26
by jorimt
^ Interesting, and, if anything, it further confirms what I've repeatedly stated for a while now: when we're talking about the pre-rendered frames queue as well as FPS limiters' impact on input latency, it can all depend on the individual game, the system running it, and the user's general configuration.

That said, Hardware Unboxed was unfortunately rather vague on the specifics of their testing methodology, and did not confirm whether they were using a 240Hz refresh rate across all tests, nor did they confirm whether they were using the V-SYNC option when using adaptive-sync, or whether their framerate was being sustained above the set FPS limit 100% of the time in each scenario (which is the only way input latency can be further reduced with this method in non-synced, GPU-bound scenarios).

Thus it is difficult for us, as the viewers, to fully confirm whether they were running into any form of sync-induced latency in adaptive-sync scenarios, whether their system was more often than not running below their set FPS limits, or some such other (possibly numerous) oversight(s) during their testing, which could (or could not) have had an impact on their results.

That, and unlike Battle(non)sense's video, which was only testing for GPU-bound input lag in non-synced scenarios, some of their test scenarios strayed a bit off subject (edging into input lag differences possibly caused, in part, by the sheer frametime difference between average framerates), at which point I myself wasn't even certain what exactly they were testing for anymore.

As for the input lag reduction difference between the in-game and RTSS FPS limits in some of their test games, this is expected, and something I've already explained in my article; good in-game limiters can utilize lower-than-set-FPS-limit frametimes at the engine level to reduce input lag even further than usually possible. RTSS can only limit FPS via a set frametime, and thus does not have such an advantage, making it effectively neutral input lag-wise in the majority of (especially non-synced) situations.
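
To illustrate the mechanical difference described above (a conceptual sketch under my own assumptions, not code from any game or from RTSS): the key is where the pacing wait lands relative to input sampling.

#include <chrono>
#include <thread>

using steady = std::chrono::steady_clock;

// Hypothetical stand-ins for real engine work.
void sample_input()        {}
void simulate_and_render() {}
void present()             {}

// External, frametime-based cap (RTSS-style): the wait sits at present time,
// so the input sampled at the top of the frame has already aged by then.
void frame_external_cap(steady::time_point& next, steady::duration frametime)
{
    sample_input();
    simulate_and_render();
    std::this_thread::sleep_until(next);   // pacing happens here
    present();
    next += frametime;
}

// Engine-level cap: the engine can pace BEFORE sampling input, so the presented
// frame is built from fresher input (the advantage described above).
void frame_ingame_cap(steady::time_point& next, steady::duration frametime)
{
    std::this_thread::sleep_until(next);   // pacing happens here instead
    sample_input();
    simulate_and_render();
    present();
    next += frametime;
}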

Re: Cap your fps people (battlenonsense)

Posted: 30 Sep 2019, 10:16
by RealNC
It is of course dependent on the game. However, HU's conclusion that "using RTSS never improves input lag" is blatantly false. It does improve input lag when the benefits of removing the GPU bottleneck outweigh the drawbacks of lower FPS. I've seen it in plenty of games. It is especially true for severe GPU bottlenecks that occur when playing at 4K.

But most importantly, when using gsync and the game has no in-game limiter, using RTSS to prevent the framerate from exceeding the max refresh rate (as opposed to using vsync uncapped for that) seems to always improve input lag. Either by a lot, or by a little, but never makes it worse. I have never seen even a single game where this is not the case. This was not mentioned in the video.

Re: Cap your fps people (battlenonsense)

Posted: 30 Sep 2019, 11:02
by alapsu
Zennon wrote:Hardware Unboxed findings on this.
https://www.youtube.com/watch?v=VtSfjBfp1LA
The testing HWU performed for that video was a mess. Their testing was totally inconsistent (different set of tests for each game) and they managed to convince themselves they'd at least partially refuted Battlenonsense's claims without actually testing any of those claims. Explanation of the mess below:


Recap on what Battlenonsense found:

----

His testing began with Overwatch. That test shows that when you reduce the GPU load via framerate capping, ultra low latency/anti-lag (ULL/AL, and I'll just use ULL for brevity) mode *adds* to input lag (He also found that when GPU-limited, ULL reduces input lag as advertised).

He then uses PUBG and BFV to confirm this finding. To do this, he compares input lag *with ULL off* when GPU-limited vs. when not GPU-limited.

In all cases, Battlenonsense uses in-game FPS capping. He does not use RTSS for his testing. So in addition to the RTSS claims being false, as RealNC explained, they're irrelevant. In other words, HWU is so off-base that they're not even wrong.

Recap of HWU's testing

For Gears 5, HWU tests the following scenarios:

no FPS cap, ULL off
no FPS cap, ULL on
in-game 144 fps cap, ULL off
in-game 144 fps cap, ULL on
RTSS 144 fps cap, ULL off
RTSS 144 fps cap, ULL on

This test shows that in Gears 5:

-an in-game FPS cap reduces input latency
-an RTSS cap does not reduce input latency

This test does not tell us anything about the effect of ULL. Gears 5 runs on DX12, which is not compatible with ULL. This is the only time HWU runs the correct set of tests and actually hits 99% GPU usage, but because it's a DX12 title their results are virtually useless.

HWU then changes things up, testing an appropriate game (BFV) but changing his methodology so that we can learn nothing about Battlenonsense's finding that ULL-on is inferior (in terms of input latency) to ULL-off when not GPU-limited.

no FPS cap, ULL off
no FPS cap, ULL on
no FPS cap, ULL on, future frame rendering off
in-game 120 fps cap, ULL on, future frame rendering off
RTSS 120 fps cap, ULL on, future frame rendering off

These tests do not tell us anything about the effect of ULL (he leaves ULL on for all tests except the first uncapped scenario). They also don't really tell us much about the effect of using frame capping to go from GPU-limited to non-GPU-limited, as he only hits 97% max (this is the upper end of the utilization range where Battlenonsense found the benefits to input latency begin).

For Far Cry 5, HWU tests a "GPU-limited" scenario where GPU utilization is at 92% max. In the voiceover, they even acknowledge that they're not testing a GPU-bound scenario at all. Remember, Battlenonsense's Overwatch testing found the reduction in input latency was achieved by limiting the GPU load to 94-97%. So the 4-5 ms increase in input latency that HWU found when using an FPS cap doesn't actually tell us anything about the effect on input latency when GPU-limited vs. not GPU-limited. And again, since he was never GPU-limited, this tells us nothing about ULL other than providing another confirmation that ULL doesn't reduce input latency when you are not GPU-limited.

HWU's Division 2 testing again tells us nothing about differences between GPU-limited and non-GPU-limited scenarios or ULL (I don't think he specifies, but I think he's testing in DX12, which doesn't support ULL. Either way, ULL is not included in this testing). The uncapped scenario only hits 97% GPU utilization, which is again within the range where Battlenonsense reports you start to get improvements in latency by turning ULL off. All HWU has shown is that when not GPU-limited, higher FPS generally results in less latency.

Metro Exodus testing is the same story as the Division 2 testing. No ULL, as it's DX12, and max GPU utilization is only 95%.

CSGO - he only tests with ULL on. Again, all this shows is that when you leave ULL on, higher fps generally results in less latency. If he had tested ULL on vs ULL off at each frame cap and frame capping method, we could have learned something useful.

Fortnite testing - again, HWU's testing is never GPU-limited (max utilization is 94%). He has yet again shown that when you leave ULL on, higher fps generally results in less latency.

Re: Cap your fps people (battlenonsense)

Posted: 02 Oct 2019, 00:46
by ko4
alapsu wrote:
Zennon wrote:Hardware Unboxed findings on this.
https://www.youtube.com/watch?v=VtSfjBfp1LA
The testing HWU performed for that video was a mess. Their testing was totally inconsistent (different set of tests for each game) and they managed to convince themselves they'd at least partially refuted Battlenonsense's claims without actually testing any of those claims. [...]

Yep, something must be wrong because I can literally feel the input lag. I could feel it on my old rig and on my completely different new rig.
Maybe it's hardware dependent? More testing needs to be done, tbh.
Also, I'm not sure how he's getting around 22ms in CSGO.
Last time I tested my CSGO input lag, I was getting 9-13 ms, and that was with mouse movement.