EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview #2!

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
User avatar
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by nimbulan » 10 Feb 2014, 13:26

Berserker wrote:Multicore rendering has noticeable input lag in all Source engine games. It's very easy to notice the difference by toggling it on and off with high graphical settings. On the absolute lowest settings with an fps config it becomes more tolerable.
For me personally it's even more annoying than oneframethreadlag in UE3.

And also this:
http://renderingpipeline.com/2013/09/me ... t-latency/
HL2 with vsync: 112 msec
HL2 without vsync: 76 msec
HL2 without vsync and drastically reduced image quality: 58 msec
HL2 without vsync, reduced quality and no multicore renderer: 50 msec
It would have been helpful if this guy had given details on his framerate during the tests. I know it's not a direct comparison, but it's interesting to see how much a gaming monitor reduces lag in the Source engine, judging from Chief Blur Buster's input lag testing. Do you know of a website that has done some thorough testing of oneframethreadlag? I've always been curious but have never been able to find any.

User avatar
shadman
Posts: 95
Joined: 19 Dec 2013, 16:39
Location: West Coast

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by shadman » 10 Feb 2014, 15:10

Wow, I never knew there was input lag caused by that setting being on. I'll definitely test this later.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by Chief Blur Buster » 18 Mar 2014, 20:58

Multicore rendering has been pretty good for letting slower systems get more framerate. However, modern systems run older Source Engine games so quickly that multicore rendering can simply be turned off to reduce latency.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
sharknice
Posts: 295
Joined: 23 Dec 2013, 17:16
Location: Minnesota
Contact:

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by sharknice » 18 Mar 2014, 23:09

Chief Blur Buster wrote:Multicore rendering has been pretty good for letting slower systems get more framerate. However, modern systems run older Source Engine games so quickly that multicore rendering can simply be turned off to reduce latency.
Yeah, as far as I can tell, instead of every core being combined to render a single frame faster, each core produces its own frame. Since the processing all happens at once, it can't grab input or any new data that hasn't happened yet. The extra frames rendered by the extra cores are based on interpolation from previous data instead of new data.

So it is only worth using if you can't reach as many frames as your refresh rate and are CPU-bottlenecked.
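
To put rough numbers on that idea (a back-of-the-envelope sketch, not actual Source engine code; the two-deep pipeline and the frame times are assumptions for illustration), an extra in-flight frame means the displayed image was built from input that is roughly one frame interval older:

```python
# Rough model: age of the input behind a displayed frame.
# pipeline_depth = 1 means the frame was rendered from the latest input;
# each additional in-flight frame adds one frame interval of input age.
def input_latency_ms(frame_time_ms: float, pipeline_depth: int) -> float:
    return frame_time_ms * pipeline_depth

for fps in (60, 120, 300):
    frame_time = 1000.0 / fps
    single = input_latency_ms(frame_time, 1)  # single-threaded renderer
    piped = input_latency_ms(frame_time, 2)   # assumed 2-deep multicore pipeline
    print(f"{fps:3d} fps: ~{single:.1f} ms single-threaded vs ~{piped:.1f} ms pipelined")
```

At 300 fps the extra pipelined frame only costs a few milliseconds, which fits the earlier observation that the penalty is far more tolerable at very high framerates.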

User avatar
RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32
Contact:

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by RealNC » 19 Mar 2014, 09:24

Chief Blur Buster wrote:Multicore rendering has been pretty good for letting slower systems get more framerate. However, modern systems run older Source Engine games so quickly that multicore rendering can simply be turned off to reduce latency.
I can't confirm that. Without multicore, framerates in CS:GO fall below 100 very often. On some maps they approach 60 (for example, Assault). With those framerates, the tearing becomes distracting. With multicore enabled (and "-threads 3" in the game's command-line options), FPS stays mostly above 200, which makes tearing invisible (or at least hardly noticeable) with vsync off.

And this is on an i5 2500K @ 4.2GHz with a GTX 780, which I would describe as "modern."

Note: these numbers don't apply to offline mode or an empty online server. Online mode with other people on the server produces real CPU load, bringing framerates down. Yes, if I just load a map in offline mode, I get 600 FPS with multicore and over 200 without, but that's not helpful; online mode is what matters. And these days, the trend for CPU speed increases is to add more cores.
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by nimbulan » 19 Mar 2014, 13:24

RealNC wrote:
Chief Blur Buster wrote:Multicore rendering has been pretty good for letting slower systems get more framerate. However, modern systems run older Source Engine games so quickly that multicore rendering can simply be turned off to reduce latency.
I can't confirm that. Without multicore, framerates in CS:GO fall below 100 very often. On some maps they approach 60 (for example, Assault). With those framerates, the tearing becomes distracting. With multicore enabled (and "-threads 3" in the game's command-line options), FPS stays mostly above 200, which makes tearing invisible (or at least hardly noticeable) with vsync off.

And this is on an i5 2500K @ 4.2GHz with a GTX 780, which I would describe as "modern."

Note: these numbers don't apply to offline mode or an empty online server. Online mode with other people on the server produces real CPU load, bringing framerates down. Yes, if I just load a map in offline mode, I get 600 FPS with multicore and over 200 without, but that's not helpful; online mode is what matters. And these days, the trend for CPU speed increases is to add more cores.
I've experimented with this before too, and it all seems to be the result of the Source engine getting slower and more bloated over time. When TF2 was released (I believe this was before multicore rendering was implemented), I had an Athlon X2 3800+ and a 7800 GT, which would easily handle the game on max settings. Two years later, that same system would run the game on low at an average of maybe 20 fps. My Core 2 Quad Q9550 + GTX 260 would run the game easily on max settings at that point, but only with multicore rendering on, which caused frequent video driver crashes. With multicore rendering off, the framerate dropped below 30 very frequently. The engine update for CS:S caused a pretty large drop in framerate as well, even though the game received no graphical improvements in the process.

So basically after a couple of years and a significant PC upgrade, I had to use multicore rendering to keep the same level of performance I had on older hardware with it off. I really hope Source 2 is more efficient.

YukonTrooper
Posts: 19
Joined: 06 Feb 2014, 23:24

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by YukonTrooper » 19 Apr 2014, 01:46

Any update on the G-Sync max framerate input lag scenario? There's definitely increased input lag in every game I've played if the max framerate is set to the monitor's refresh rate. I've just resigned myself to capping my framerate 10 fps below the refresh rate, and the input lag disappears. There aren't any tradeoffs (everything remains butter smooth), but it would be nice to have a quantitative explanation.

User avatar
RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32
Contact:

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by RealNC » 19 Apr 2014, 09:06

YukonTrooper wrote:it would be nice to have a quantitative explanation.
You mean why it happens? That's already clear. G-Sync makes the monitor follow the GPU instead of the other way around (as it was with V-Sync). However, if the GPU is faster than the monitor, then obviously the monitor can't follow the GPU anymore, so you end up in a situation where the GPU waits for the monitor to catch up again; in other words, V-Sync.

By frame capping the game, you're making sure that the GPU is not going to be faster than the monitor, so the monitor can keep following the GPU.

In the above, "following" means keeping up with new frames from the GPU. Whenever the GPU sends a new frame, the monitor "follows up" by displaying that frame. If the GPU sends frames faster than the monitor can display them, then the GPU needs to wait for the monitor to get ready to display the next frame, which is what V-Sync always did and why input lag appears.
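
As a toy model of that (just a sketch with assumed numbers: a 144 Hz panel, a simple blocking hand-off, and no buffering; not how the driver actually schedules anything), the only question per frame is whether the panel has finished its previous refresh by the time the GPU has the next frame ready:

```python
# Toy model of G-SYNC frame delivery. Timings in milliseconds.
PANEL_MIN_REFRESH_MS = 1000.0 / 144  # fastest the panel can start a new refresh (~6.94 ms)

def average_wait_ms(gpu_frame_ms: float, frames: int = 1000) -> float:
    """Average time per frame the GPU spends blocked waiting on the panel."""
    t = 0.0               # current time
    panel_ready_at = 0.0  # earliest moment the panel can begin a new refresh
    waited = 0.0
    for _ in range(frames):
        t += gpu_frame_ms                    # render the next frame
        wait = max(0.0, panel_ready_at - t)  # stall if the panel is still refreshing
        waited += wait
        t += wait                            # hand the frame off once the panel is ready
        panel_ready_at = t + PANEL_MIN_REFRESH_MS
    return waited / frames

print("capped ~134 fps:", round(average_wait_ms(1000.0 / 134), 2), "ms average wait")   # ~0
print("uncapped ~200 fps:", round(average_wait_ms(1000.0 / 200), 2), "ms average wait")  # V-Sync-like stall
```

Capped below the refresh rate, the panel is always ready first and the wait stays at zero; uncapped, every frame ends up waiting on the panel, which is exactly the V-Sync-style behavior described above.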
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by Chief Blur Buster » 19 Apr 2014, 09:25

YukonTrooper wrote:Any update on the G-Sync max framerate input lag scenario? There's definitely increased input lag in every game I've played if the max framerate is set to the monitor's refresh rate. I've just resigned myself to capping my framerate 10 fps below the refresh rate, and the input lag disappears. There aren't any tradeoffs (everything remains butter smooth), but it would be nice to have a quantitative explanation.
Do you mean why the lag happens at all? Or why it is 10fps below versus 1fps below?

The 10fps-below versus 1fps-below difference is probably because of the G-SYNC poll.
However, the explanation of why the lag happens at all is separate from the explanation of why the lag disappears when you go ~10fps below rather than ~1fps below.

There are some explanations in GSYNC Preview #2:
Why is there less lag in CS:GO at 120fps than 143fps for G-SYNC?

We currently suspect that fps_max 143 is frequently colliding near the G-SYNC frame rate cap, possibly having something to do with NVIDIA's technique of polling the monitor to see whether it is ready for the next refresh. I did hear they are working on eliminating the polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

I did not test other fps_max settings, such as fps_max 130 or fps_max 140, which might get closer to the G-SYNC cap without triggering the G-SYNC capped-out slowdown behavior. Normally, G-SYNC eliminates waiting for the monitor's next refresh interval:

G-SYNC Not Capped Out
Input Read -> Render Frame -> Display Refresh Immediately

When G-SYNC is capped out at maximum refresh rate, the behavior is identical to VSYNC ON, where the game ends up waiting for the refresh.

G-SYNC Capped Out
Input Read -> Render Frame -> Wait For Monitor Refresh Cycle -> Display Refresh
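
To put some numbers on why fps_max 143 collides so easily while lower caps do not (a quick back-of-the-envelope calculation assuming a 144 Hz panel; the cap values are just examples):

```python
# Slack between the frame interval at a given fps_max and the 144 Hz refresh interval.
# The smaller the slack, the more often a frame arrives before the panel can be ready,
# triggering the capped-out (VSYNC ON-like) wait.
REFRESH_HZ = 144
refresh_interval_ms = 1000.0 / REFRESH_HZ  # ~6.94 ms

for cap in (143, 140, 130, 120):
    frame_interval_ms = 1000.0 / cap
    slack_ms = frame_interval_ms - refresh_interval_ms
    print(f"fps_max {cap}: {frame_interval_ms:.2f} ms/frame, slack = {slack_ms:+.2f} ms")
```

At fps_max 143 there is only about 0.05 ms of slack, so ordinary frame-time jitter (or any polling granularity) pushes frames into the capped-out wait, while fps_max 120 leaves roughly 1.4 ms of headroom and rarely collides.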
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: EXCLUSIVE: We measure G-SYNC Input Lag in GSYNC Preview

Post by nimbulan » 19 Apr 2014, 11:41

The video card waiting for the monitor is part of it, but you guys are missing the larger piece of the problem: DirectX's dreaded render-ahead queue. When your framerate is limited by vsync, the game will continue processing data for future frames, building up a queue to be rendered and displayed, and each frame in the queue will increase your input lag by one refresh cycle. The default setting in the NVIDIA drivers is 2 frames, though I recommend setting it to 1. Games may be able to override this setting, though. Unfortunately, you can't disable it entirely, though I used to be able to back when I had a GeForce 7800 years ago.
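
As a rough illustration of the queue's cost (a sketch, not DirectX code; the refresh rate and queue depths are example numbers, and "pre-rendered frames" refers to the NVIDIA driver setting mentioned above):

```python
# When vsync limits the framerate, each frame sitting in the render-ahead queue
# is displayed one full refresh cycle later, adding one refresh interval of input lag.
def queue_lag_ms(refresh_hz: float, pre_rendered_frames: int) -> float:
    return pre_rendered_frames * (1000.0 / refresh_hz)

for depth in (3, 2, 1):
    print(f"{depth} pre-rendered frame(s) at 60 Hz: +{queue_lag_ms(60, depth):.1f} ms input lag")
```

That is why dropping the driver setting from the default of 2 down to 1 helps noticeably whenever the game is actually running into the vsync limit.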

Post Reply