Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
User avatar
jorimt
Posts: 837
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 23 Jun 2017, 12:04

RealNC wrote: Because the in-game limiter can reduce the impact of frame latency on total input lag. At 100FPS, frame latency is 10ms. That is part of the total input lag. An in-game limiter however can remove some of that 10ms from the total latency, if the game renders at higher FPS.

Case 1: If the game needs 1ms (1000FPS) to render a frame, an in-game limiter's 100 FPS cap can potentially decrease total input lag by up to 9ms.

Case 2: If the game renders a frame in 10ms (100FPS), then a 100FPS in-game cap cannot reduce latency further.

RTSS, on the other hand, always behaves as if the second case is true. Regardless of whether the game renders the frame in 1ms or in 10ms, you get the 10ms frame latency of 100FPS added to the total input lag. That latency is the same as if the game were running uncapped at 100FPS.

If you think about how external, non-predictive frame limiting (like RTSS's) works, it makes sense. The algorithm is quite simple: if the frame time is less than the target frame time (the cap), wait until the target frame time is reached. That means regardless of how long a frame took to render, the result is always the same as if the frame took N ms to render, where N is the target frame time. Thus, input lag is the same as if the game were rendering at the target frame rate.

In other words, RTSS is perfectly "neutral." It neither reduces nor increases latency. The in-game limiter reduces latency. (And NVIDIA's limiter increases latency.)

And, again, that's only "in theory". In practice, RTSS reduces latency in many games due to the frame buffering issue.
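The non-predictive wait loop described in the quote above can be sketched roughly like this (a minimal illustration of the idea only; the function name and 100 FPS cap are assumptions, not RTSS's actual code):

```python
import time

TARGET_FRAME_TIME = 1 / 100  # 100 FPS cap -> 10 ms per frame (illustrative)

def external_limiter_frame(render_frame):
    """One frame through a non-predictive external limiter.

    The limiter only sees the finished frame: however quickly it
    rendered, presentation is padded out to the full target frame time,
    so input sampled before the render is always ~10 ms old on output.
    """
    start = time.perf_counter()
    render_frame()  # game samples input and renders; 1 ms or 10 ms, limiter can't tell
    elapsed = time.perf_counter() - start
    if elapsed < TARGET_FRAME_TIME:
        time.sleep(TARGET_FRAME_TIME - elapsed)  # wait out the remainder of the cap
    return time.perf_counter() - start  # total frame time, always >= the cap
```

Whether the render callback takes 1 ms or 10 ms, the function always returns after at least the full 10 ms budget, which is the "neutral" behavior described above.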
Alright, I think I understand a portion of what you're trying to say now...

In-game Framerate Limiter:
Lets frames render at any speed, meaning a frame can complete faster than the set FPS limit's frametime, but each frame will still be delivered at intervals of the target framerate.

RTSS:
Holds each frametime to the set FPS limit, meaning frametime can never be shorter than the target, so each frame is, at best, scanned in immediately after it is rendered.

This would explain why RTSS has tighter frametimes, since frametime = framerate limit. But if that's the case, I'm not 100% certain why RTSS still delays an entire frame every few frames; e.g. why we're seeing less than 1 frame of delay (+5ms at 144Hz on average).

Unless I'm misunderstanding, this is where you're off, however:
RealNC wrote:Case 1: If the game needs 1ms (1000FPS) to render a frame, an in-game limiter's 100 FPS cap can potentially decrease total input lag by up to 9ms.
If this were the case, even if you were using G-SYNC + V-SYNC, you'd see a tear. The fastest a single frame can be scanned in from top to bottom is dictated by the static scanout time of the panel. So even with V-SYNC OFF, if you have a 100 FPS in-game limit at 144Hz and one frame renders in 1ms, it's still going to be delivered at intervals of 100 frames per second, scan in over 6.9ms, and, with G-SYNC, wait for the previous scanout (also 6.9ms) to complete before displaying.
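The two timing figures in play here fall straight out of the cap and the refresh rate; a trivial sketch (function names are just for illustration):

```python
def frame_interval_ms(fps_cap):
    """Delivery interval between consecutive frames at a given FPS cap."""
    return 1000 / fps_cap

def scanout_ms(refresh_hz):
    """Top-to-bottom scanout time of a panel at its max refresh rate."""
    return 1000 / refresh_hz

print(frame_interval_ms(100))     # 10.0 -> 10 ms between frames at a 100 FPS cap
print(round(scanout_ms(144), 2))  # 6.94 -> ~6.9 ms per scanout at 144Hz
```

So no matter how quickly an individual frame renders, delivery is paced at 10 ms intervals and each scanout still takes ~6.9 ms.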
Author: Blur Busters "G-SYNC 101" Series

Display: Acer Predator XB271HU | OS: Windows 10 Pro | MB: ASUS ROG Maximus X Hero | CPU: i7-8700k | GPU: EVGA GTX 1080 Ti FTW3 | RAM: 32GB G.SKILL TridentZ @3200MHz

User avatar
RealNC
Site Admin
Posts: 2829
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 23 Jun 2017, 12:23

jorimt wrote:
This is where you're off, however:
RealNC wrote:Case 1: If the game needs 1ms (1000FPS) to render a frame, an in-game limiter's 100 FPS cap can potentially decrease total input lag by up to 9ms.
If this were the case, even if you were using G-SYNC + V-SYNC, you'd see a tear. The fastest a single frame can be scanned in from top to bottom is dictated by the static scanout time of the panel. So even with V-SYNC off, if you have a 100 FPS in-game limit at 144Hz and one frame renders in 1ms, it's still going to be delivered at intervals of 100 frames per second, scan in over 6.9ms, and, with G-SYNC, wait for the previous scanout (also 6.9ms) to complete before displaying.
An in-game limiter can wait until the last possible moment to start rendering. That's why it can further reduce latency. The game can do all its other work first (calculate physics, process audio, etc.), but the in-game frame limiter can delay reading input from the player and rendering a frame based on that input. The closer to the target frame time the limiter is able to start the rendering, the more latency is shaved off the total input lag.

RTSS cannot do that, for obvious reasons. Even predictive external limiters are just guessing. An in-game limiter has fine control and can do a better job of delaying the render. An external predictive limiter would delay not just rendering but all the other work the game needs to do along with it; it can delay everything or nothing.

So it's not that the in-game limiter outputs the frame sooner (which would tear, as you said). What it does is place the read-input + render operation much, much closer to the target frame time, so the frame is based on much fresher input. That results in latency lower than the *apparent* frame render time (as seen from the outside), which is not the true render time. If a game runs at 500FPS, that's a frame time of 2ms. However, only part of those 2ms is actual rendering; much of it is physics and all the "other stuff" a game needs to do. From RTSS's view, the render time is 2ms. But the in-game limiter knows better. The render time might be 0.5ms, while 1.5ms is spent on other things. It takes advantage of that to reduce latency below the natural frame latency of 500FPS.
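The "push input sampling and rendering to the last possible moment" idea can be sketched like this (an idealized illustration; the 0.5 ms render budget, the 100 FPS cap, and the function names are assumptions, and real in-game limiters are more sophisticated):

```python
import time

TARGET_FRAME_TIME = 1 / 100  # 100 FPS cap (illustrative)
RENDER_BUDGET = 0.0005       # assumed true render cost of ~0.5 ms (hypothetical)

def in_game_limiter_frame(update_simulation, sample_input, render):
    """One frame of an idealized in-game limiter.

    Non-input work (physics, audio, etc.) runs first; the limiter then
    sleeps until just enough of the frame budget remains to render, so
    input is sampled as late as possible and the presented frame is
    based on fresher input.
    """
    start = time.perf_counter()
    update_simulation()  # the "other stuff" that doesn't need fresh input
    slack = TARGET_FRAME_TIME - (time.perf_counter() - start) - RENDER_BUDGET
    if slack > 0:
        time.sleep(slack)  # defer input sampling to the last possible moment
    return render(sample_input())  # just-in-time input read + render
```

The frame still goes out on the 10 ms cadence, but the input it reflects was sampled only ~0.5 ms before presentation instead of a full frame earlier, which is where the latency advantage over an external limiter comes from.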
Twitter | Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
RealNC
Site Admin
Posts: 2829
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 23 Jun 2017, 12:34

Btw, I should mention here that all of this is based on general knowledge about the matter. I'm not an authority on these things. Someone who actually worked on things that involved these matters should be able to confirm or deny.

Where's Durante when you need him :geek:

User avatar
jorimt
Posts: 837
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 23 Jun 2017, 12:48

Oh, I gotcha, a misunderstanding then; you're talking about the in-game limiter intercepting before render, I was talking about what it does after render.

I'm nowhere near an expert on framerate limiters myself, and I'd like to learn more going forward; an article on the subject isn't out of the question once I've got everything straight.

As for Durante, he's at NeoGaf :P

Speaking of Durante, even in light of your clarification, I'm hesitant to change my current description of RTSS in my article, because Durante actually made a dedicated thread about my article, and made no note of RTSS wording inaccuracies.

User avatar
RealNC
Site Admin
Posts: 2829
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 23 Jun 2017, 13:01

The "RTSS adds up to 1 frame of latency" statement is not wrong. It's completely correct. It's just the wording that can be misinterpreted.

RTSS does add up to 1 frame of latency compared to a (decent) in-game limiter. It's just that the reason for that is that an in-game limiter reduces latency, while RTSS does not.

The statement is completely true when comparing RTSS to in-game limiters. It's only wrong if you don't compare RTSS to anything and only look at what effect RTSS has on an uncapped framerate; in that case, RTSS does not add any latency. With G-SYNC, if your game renders at 141FPS on a 144Hz monitor and you're not using a frame capper, that's considered just fine. You get an input lag of N milliseconds. If your game would render faster, however, you can cap to 141FPS with RTSS. That will give you... N milliseconds. Exactly the same!

So it's just from that point of view that the statement "adds up to 1 frame of latency" can be misinterpreted.

If you ever intend to put these claims to the test, you can always use a "heavy" game with max details. For CS:GO and OW, you'd probably need DSR x4 (for 1440p monitors, that gives you 5K resolution) with everything maxed. I think for CS:GO, MSAAx8 at 5K would cripple even the 1080 Ti and get you inside the G-SYNC range uncapped. Although that would be because the memory bandwidth is 100% saturated, not the GPU itself, and I have no idea if hitting bandwidth limitations can mess with input lag (maxing memory bandwidth does seem to have odd effects on latency).

User avatar
RealNC
Site Admin
Posts: 2829
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 23 Jun 2017, 13:20

jorimt wrote:As for Durante, he's at NeoGaf :P
Ah, crap. I just tried to sign up there, but gmail is blocked.

User avatar
jorimt
Posts: 837
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 23 Jun 2017, 13:25

Well, that's problematic. Do you have an alternate email you could use? The aforementioned thread, for reference:
http://www.neogaf.com/forum/showthread.php?t=1394718

And yes, I completely understand what you're proposing now, and yes, it would be difficult to isolate accurately. I suppose I could test a much more demanding game featuring an in-game limiter, such as Crysis 3, but, as you say, that could skew results depending on what's limiting performance.

Something to think on further, but your "RTSS = N milliseconds of delay at the FPS limit" = "sustained framerate = frametime delay" theory is likely the most plausible conclusion as of now.

kevindd992002
Posts: 10
Joined: 14 Jul 2016, 09:55

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by kevindd992002 » 24 Jun 2017, 12:02

jorimt wrote:
kevindd992002 wrote:Yeah, I figured I'm getting the same framerates when I was using a 1070. Fps_max is set at 999 even before you mentioned it. I'm still using a 2600K CPU so I think this is a case of a CPU bottlenecking the GPU.
Ah, that explains it. You omitted the CPU in your initial post, so I assumed you had a more modern CPU to go along with that brand new GPU.

I only have a 1080, but I do have an i7-4770k overclocked to 4.2GHz, and I can easily achieve 600+ FPS with V-SYNC off in an online CS:GO match.

You should consider a CPU upgrade, as your current CPU will seriously bottleneck your GPU in any game. Even an upgrade to my CPU could see your 1080 Ti pushing up to twice (probably more) the framerate most of your games get now.

Knowing this, if you can only achieve 300 FPS on a 144Hz G-SYNC monitor, if we count middle screen (crosshair-level) updates only, you'd be getting an input latency reduction of roughly 1-3ms with V-SYNC OFF over G-SYNC + V-SYNC with a -3 FPS in-game limit.

Whether that's worth it is up to you, and is down to situation and preference; tearing and stutter-free consistency vs. slightly more responsive input w/tearing and microstutter. Whether you use ULMB would also factor in.
Yes, I have most of my watercooling parts for my new build already, I just have to buy a 7700K, motherboard, RAM, and SSD so that I can start building it :)

I'm not sure if this is off-topic or not, but here it goes. I initially thought it was a CPU bottleneck, but then I checked the maximum thread usage while playing CS:GO @ 1080p with my 1080 Ti. It only reaches around 88%, which means it's not a CPU bottleneck. But then my GPU utilization/load is just effing 30%!

This also happens with GOW4. I get only around 70% GPU usage there. But in Overwatch, everything seems to work correctly (99% GPU load).

All this with VSYNC OFF, of course. Any thoughts?

User avatar
RealNC
Site Admin
Posts: 2829
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 24 Jun 2017, 12:29

Single-core performance is the most important stat for CS:GO. If it maxes out a single core, you've hit the wall.

User avatar
jorimt
Posts: 837
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 24 Jun 2017, 13:14

kevindd992002 wrote:I'm not sure if this is off topic or not but here it goes. I initially thought it was a CPU bottleneck but then I checked the maximum thread usage while playing CSGO @ 1080p with my 1080Ti. It only reaches around 88% which means that it's not a CPU bottleneck. But then my GPU utilization/load is just effing 30%!
RealNC is correct; single-core speed is what's going to make the largest difference in CS:GO. Not to stray off-topic, but have you tried "Multicore Rendering" both on and off in the game settings?
