g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Q83Ia7ta » 05 Mar 2014, 21:57


Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Chief Blur Buster » 06 Mar 2014, 02:49

DisplayPort has the potential to transmit refresh cycles faster than DVI and other ports. It's theoretically possible to push a 1080p refresh cycle over a single DisplayPort channel in approximately 5.6 milliseconds (IIRC, 1/177th of a second).

As for their GSYNC input lag tests -- duh! -- old news (and incomplete, too). I already discovered this myself: GSYNC begins to behave like VSYNC ON when the frame rate caps out at maximum. I do not believe the site tested fps_max as a method of reducing input lag.

But as I already wrote, the people at infinite.cz need to point out the frame-capping technique for reducing GSYNC input lag. You can take advantage of GSYNC's low-lag behavior by capping the frame rate below the GSYNC maximum. Using fps_max values below the GSYNC maximum eliminates the driver's wait for VSYNC; instead, the driver delivers the refresh to the monitor immediately.

1. Set your GSYNC monitor to 144Hz
2. Modify your test to run at a maximum frame rate of ~135fps
3. You will observe that input lag falls dramatically.

This is because the input lag is caused by the driver waiting for the previous refresh cycle to finish before delivering the next GSYNC frame -- the same kind of issue as VSYNC ON. But if the monitor has already finished refreshing (e.g. because you software-throttle your framerate via fps_max), the frame is delivered immediately after your input read, without waiting.

Many games have built-in, configurable frame rate capping, and software-based capping eliminates the input lag caused by hardware capping (e.g. frame rate limited by the driver, or by VSYNC ON): with a hardware cap, you do an input read, then an external factor outside the game's control forces input lag upon you before the frame is displayed. But! That's solved by building a frame rate cap into your game. If you are a game developer, you are smart to include an "fps_max" setting like the Source Engine's. As a game developer, you then control the GSYNC input lag via this technique -- it solved the GSYNC input lag in my tests.

That's because game engines typically do a keyboard/mouse read [input lag timer begins], then render and present the frame (e.g. Direct3D Present()), and then the monitor displays the frame [input lag timer ends]. If the driver or monitor blocks when you attempt to display the frame, you've got forced input lag beyond your control. That's what happens when you let GSYNC hit its maximum refresh rate (same issue as VSYNC ON): Present() now waits for the previous refresh to finish first -- that's input lag! As a game developer, you can bring this under your own application's control simply by running at framerates slightly below the currently configured GSYNC maximum refresh rate. Then your frames are delivered immediately (the previous refresh has already fully finished, so there is no waiting at all), with no blockage and no added lag, because you're not hitting the maximum. Average latency becomes equivalent to VSYNC OFF, and you get ultra-low GSYNC latency with the full benefits of GSYNC fluidity and no tearing.
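
To make the lag chain concrete, here is a minimal sketch of such a loop. The helper names are hypothetical placeholders, not from any real engine; only the Direct3D Present() call is the real API.

Code: Select all

// Minimal sketch of a game loop and its input lag window.
// ReadMouseAndKeyboard(), UpdateGameState(), RenderFrame() and
// swapChain are hypothetical placeholders.
while (running)
{
    ReadMouseAndKeyboard();    // [input lag timer begins]
    UpdateGameState();
    RenderFrame();

    // At the GSYNC cap (framerate == max refresh), Present() waits for
    // the previous refresh to finish scanning out -- VSYNC ON-style lag.
    // Capped below max refresh (e.g. fps_max 135 at 144Hz), the monitor
    // is already idle, so scanout begins immediately -- minimal lag.
    swapChain->Present(0, 0);  // frame reaches screen [input lag timer ends]
}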

Therefore, infinite.cz made the mistake of running an incomplete test (only testing at full frame rate). They need to publish benchmark numbers for framerate-capped situations, which they didn't. They should modify their input latency test to also cover software-based frame rate capping, like the frame-capping capabilities found in several game engines. The software application they used with the photodiode needs a configurable fps_max value, and I don't think it has one; as a result, their lag-testing app isn't behaving the same way as a game engine with a built-in fps_max ability. They also need to fully explain their lag-testing technique. fps_max in CS:GO essentially configures the refresh rate of a GSYNC monitor, and using fps_max with GSYNC fixes the input lag that infinite.cz is reporting.

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by RealNC » 06 Mar 2014, 03:25

Is this problem being worked on by NVidia? There are hundreds (more likely thousands) of games out there without an internal frame limiter, and you can be sure that virtually none of them will be updated.

This is a real problem. I don't intend to get a GSync monitor just for future games; I also love my existing games and want to play them with GSync. (Still waiting for the XL2[47]20G -- when the hell is it coming out?)

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Chief Blur Buster » 06 Mar 2014, 03:54

RealNC wrote:Is this problem being worked on by NVidia?
One feature suggestion is a driver-based tail-end frame rate limiter, which I may pass to NVIDIA in the next month. It could reduce input lag during GSYNC cap-out situations in certain games (though not all of them). Basically: a blocking Direct3D Present() that displays immediately but doesn't return until the monitor has finished refreshing. That way, the next input read is fresh, and the next Present() call will begin refreshing immediately (and again not return until the refresh is finished). This is much lower lag than lead-end limiting, where you do an input read, then a Present() that is forced to wait (not displaying the frame -- creating input lag) because the monitor hasn't finished the last refresh. Blocking Present() in this manner may cause games to run at slightly lower frame rates (i.e. slightly below the monitor's maximum), but they would no longer pay the GSYNC cap-out latency penalty.
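
In loop form, the difference between the two limiter placements might look like this (a hypothetical sketch of the behavior, not actual NVIDIA driver code):

Code: Select all

// Lead-end limiting (today's GSYNC cap-out behavior):
ReadInput();   // [lag timer begins]
Render();
Present();     // blocks FIRST (previous refresh still scanning out),
               // THEN displays -- the input is stale by the wait time.

// Tail-end limiting (the suggested driver behavior):
ReadInput();   // [lag timer begins]
Render();
Present();     // displays IMMEDIATELY (monitor already idle), then blocks
               // until this refresh finishes -- so the NEXT ReadInput()
               // happens exactly when the monitor is ready again.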

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by RealNC » 06 Mar 2014, 04:09

Basically, that approach would make it look like the GPU performs slower (a virtual "downgrade") so that you get fewer FPS? That's rather neat!

Hm, this also means that for games that hit the Hz maximum, downclocking the GPU should fix the problem too. But I doubt the driver can be smart enough to decide when to downclock, the hardware is probably not fast enough to change clocks quickly (or with fine enough granularity), and the voltage regulators might die sooner.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Chief Blur Buster » 06 Mar 2014, 05:38

RealNC wrote:Basically, that approach would make it look like the GPU performs slower (a virtual "downgrade") so that you get fewer FPS? That's rather neat!
It may be only <1fps to ~5fps slower than the full maximum frame rate, but in that specific case you're trading a tiny bit of framerate for a very significant input latency decrease. It's not too different from the many CS:GO users who run fps_max 143 when playing. Ideally, the drivers should be adjusted so there's no need for this differential and games can run at the full frame rate, but for now, fps_max 135-138 works great, with virtually the same average latency as VSYNC OFF.

ENiKS
Posts: 6
Joined: 06 Mar 2014, 05:07

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by ENiKS » 06 Mar 2014, 07:50

Greetings to everyone on this forum,

I'm the person who wrote the article in the first post and did the test :)

First of all, regarding the DisplayPort latency being low: it's simply the value I measured. I did it 3 times just to be sure it's the actual value, and I measured a similar value on my Alienware M15x.

Secondly, you're correct that I didn't test with any fps cap. To explain the method: I'm using an old open-source engine in which I implemented a custom screen that goes from black to white, and using the engine's input system I read the inputs from the sync tool. The sync tool is a HID chip that behaves like a gamepad, except it has only one button (start sync), a photo-transistor that measures light and triggers a button press once the light reaches a certain (adjustable) level, and a microphone to measure audio latency. The latency is then calculated from the time the screen logic (not Present()) sends a white rectangle over the whole screen to the time the input system registers the button press caused by the light. The input system runs in a parallel thread to ensure it's independent of framerate, and sends messages to the main loop with a timestamp of when the input was registered. That timestamp is used to calculate the latency.
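
In code terms, the calculation presumably boils down to something like this (a sketch of the timestamp arithmetic only; the names are made up, and the real implementation is ENiKS's engine code plus the HID sync tool):

Code: Select all

#include <chrono>

using Clock = std::chrono::steady_clock;

// Set by the screen logic at the moment the white rectangle is issued.
Clock::time_point whiteFrameIssued;

void OnWhiteFrameIssued()
{
    whiteFrameIssued = Clock::now();
}

// Called from the parallel input thread when the sync tool reports its
// photo-transistor "button press" (screen light reached the threshold).
double LatencyMs(Clock::time_point buttonTimestamp)
{
    return std::chrono::duration<double, std::milli>(
               buttonTimestamp - whiteFrameIssued).count();
}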

Anyway, I did add a framerate cap to my lag-testing utility. Capping at 125fps while running the 144Hz monitor in GSYNC mode, the screen latency was between 16-20ms. That's still not as low as no sync (14ms), but definitely better than VSYNC (39ms). So thanks for letting me know, and I'll update the original article accordingly. All I did was add some waiting cycles after the rendering code, before the next loop begins.
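
"Waiting cycles after the rendering code" amounts to something like the following (my guess at the implementation, using the 125fps figure from the test above):

Code: Select all

#include <chrono>
#include <thread>

// Hold the loop to roughly targetFps by sleeping out the remainder of
// each frame period. Called at the bottom of the render loop, before
// the next iteration begins.
void CapFrameRate(double targetFps)
{
    using namespace std::chrono;
    static auto nextFrame = steady_clock::now();
    nextFrame += duration_cast<steady_clock::duration>(
        duration<double>(1.0 / targetFps));
    std::this_thread::sleep_until(nextFrame);
}

// Usage, e.g. ~125fps on the 144Hz GSYNC monitor:
//   RenderFrame();
//   Present();
//   CapFrameRate(125.0);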

I'm still quite confused, because one scan at 144Hz is 6.9ms (1/144 of a second), so even if Present() waited for a full cycle, that cannot explain the difference, which is more than 20ms. But there are more things about GSYNC that I don't get (why the hell does the module in the monitor have 6GB of DDR3 memory?)

EDIT:
It also comes to mind: what will happen when the game's FPS oscillates around the GSYNC max refresh, like 130-160 depending on where the player looks? Will this introduce stutter once again? Has anyone tested it? (If not, I can rework the frame limiter and try it.) If it does, I would consider it a serious bug.

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Q83Ia7ta » 06 Mar 2014, 10:11

For myself, the most interesting was the last test, with display refresh set at 144Hz, VSYNC off, GSYNC off:
BenQ XL2411T: 28 ms
BenQ XL2420Z: 27 ms
ASUS VG248QE: 24 ms
ASUS VG248QE (with GSYNC): 14 ms
I guess all monitors except the GSYNC one were connected via DVI.

ENiKS
Posts: 6
Joined: 06 Mar 2014, 05:07

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by ENiKS » 06 Mar 2014, 10:59

Correct. All used a Dual Link DVI cable. Unfortunately, I had to return the 2420Z, and the 248QE was used as the donor for the GSYNC mod. I can test DVI vs DP on my home XL2420T. But I mostly credited it to the fact that the GSYNC monitor has no scaler or anything else that could slow the signal (the only feature in the OSD is brightness, which is about the backlight, not the picture itself), though I might be wrong.

I remember from my older tests that VGA was about 2ms faster than DVI.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE

Post by Chief Blur Buster » 06 Mar 2014, 11:20

ENiKS wrote:Anyway, I did add a framerate cap to my lag-testing utility. Capping at 125fps while running the 144Hz monitor in GSYNC mode, the screen latency was between 16-20ms. That's still not as low as no sync (14ms), but definitely better than VSYNC (39ms). So thanks for letting me know, and I'll update the original article accordingly. All I did was add some waiting cycles after the rendering code, before the next loop begins.

I'm still quite confused, because one scan at 144Hz is 6.9ms, so even if Present() waited for a full cycle, that cannot explain the difference, which is more than 20ms. But there are more things about GSYNC that I don't get (why the hell does the module in the monitor have 6GB of DDR3 memory?)
Glad you noticed, and you are welcome!

6GB of DDR3? It's 768MB (unless you meant 768 megaBYTES = 6 gigaBITS).
NVIDIA told me they have that much because they need bandwidth: they run multiple chips in parallel (tri-channel memory bandwidth). Not all of the memory is used. They use three 256MB chips; those are the cheapest of the fast variety, so the chips are bigger than they needed.

And, yes, that's why in my GSYNC lag tests ( http://www.blurbusters.com/gsync/preview2/ -- look at Counter-Strike: GO) I noticed that using a frame rate cap in a game engine immediately eliminated the input lag penalty. Have you read my GSYNC lag test article before? The 16-20ms versus 14ms may be explainable by the GSYNC poll delay (the driver has to ping the monitor to check whether it's still refreshing), which I heard NVIDIA is trying to fix/eliminate; soon, the differential won't exist. Some kind of GSYNC poll mechanism (in the initial version of GSYNC) was mentioned to people at CES 2014, and it was said they were looking for ways to eliminate the "poll" (pinging the monitor).

So GSYNC is very low lag:
-- whenever a game runs below the maximum framerate (e.g. because of demanding graphics)
-- whenever you intentionally cap the framerate via the game engine (e.g. fps_max in the Source Engine)
That way you stay away from the "VSYNC ON"-style lag situation that occurs when the frame rate caps out.

So your tests appear to be consistent with mine.
Congratulations on finding a new, additional method of measuring GSYNC input lag!
I assume you were already aware of my GSYNC lag tests, too?
