Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
jorimt
Posts: 1130
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 31 May 2020, 21:39

hmukos wrote:
31 May 2020, 15:25
Would Min and Max be much closer if the test was made on a native 60Hz monitor with 1/60s cable and panel scanout?
Can't say, as I haven't tested a native 60Hz panel with the same method, but theoretically, yes.

And not to sound like a broken record, but what can be said, again, is that regardless of cable/panel scanout method, higher refresh rates will always have a potential advantage over lower refresh rates at any given point, such as when testing with the "first on-screen reaction" method in V-SYNC off scenarios, as (again) depicted here:

Image

That said, by how much is going to depend on the given displays being tested against each other (even in otherwise like-for-like scenarios), and as you may now understand, any direct comparison between, say, 60Hz and 240Hz is difficult, if not impractical, since the two are physically operating on different time scales, even in the same scenarios.

Bottom line, though: so long as it doesn't compromise overdrive performance, there is no reason not to opt for a display with a faster scanout over a slower one if peak responsiveness is the number one priority. Even if you can't reach 360 FPS on a 360Hz display, for instance, you're still benefiting from the decreased scanout time, including increased delivery speed and a reduction of visible tearing artifacts over lower refresh rate displays with V-SYNC off, regardless of framerate.
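To put rough numbers on the scanout-time benefit (a back-of-the-envelope sketch, not from the thread, assuming back-to-back scanouts with no vertical blanking interval):

```python
# Sketch: full-frame scanout duration at a panel's native scan rate.
# Assumes back-to-back scanouts with no vertical blanking interval.
def scanout_ms(refresh_hz: float) -> float:
    """Time to scan one full frame top-to-bottom, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {scanout_ms(hz):.2f} ms per scanout")
```

So a 360Hz panel delivers each frame's pixels in under 3ms, versus roughly 16.7ms at 60Hz, regardless of the game's framerate.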
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Display: Acer Predator XB271HU OS: Windows 10 Pro MB: ASUS ROG Maximus X Hero CPU: i7-8700k GPU: EVGA GTX 1080 Ti FTW3 RAM: 32GB G.SKILL TridentZ @3200MHz

hmukos
Posts: 25
Joined: 16 Apr 2020, 15:41

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by hmukos » 01 Jun 2020, 08:03

jorimt wrote:
31 May 2020, 21:39
And not to sound like a broken record, but what can be said, again, is that regardless of cable/panel scanout method, higher refresh rates will always have a potential advantage over lower refresh rates at any given point, such as when testing with the "first on-screen reaction" method in V-SYNC off scenarios, as (again) depicted here:

Image
Wait, isn't your picture only about the fixed scan rate case? Where would the advantage in first reaction come from in the XG248 case in Chief's picture below?
Chief Blur Buster wrote:
28 May 2020, 11:45
However, some panels are scanrate multisync, such as the ASUS XG248, which has excellent low 60Hz console lag:

Image
Also, one person I know has a 165Hz/1440p monitor with G-SYNC (Dell S2417DG). I linked this thread to him, and he decided to test his monitor with the 960fps camera on his phone; he measured that when he enabled 60Hz mode, the panel scanout time was 10-11ms (in both the 60Hz G-SYNC on and 60Hz G-SYNC off cases). So it's quite confusing. Is there a particular reason a manufacturer would configure the scaler's scan rate for 60Hz not at the minimum 1/165s, nor at the maximum 1/60s, but somewhere in the middle?
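For reference, the arithmetic (a rough sketch; the 10-11ms figure is the camera measurement above, and the comparison is mine):

```python
# Rough arithmetic on the 960fps phone-camera measurement above.
camera_frame_ms = 1000 / 960        # camera time resolution, ~1.04 ms/frame
measured_ms = (10, 11)              # reported scanout time range
scan_60_ms  = 1000 / 60             # 16.67 ms if truly scanning at 60 Hz
scan_165_ms = 1000 / 165            # 6.06 ms if scanning at the panel's max rate

print(f"camera resolution: {camera_frame_ms:.2f} ms/frame")
print(f"measured {measured_ms[0]}-{measured_ms[1]} ms, "
      f"vs {scan_165_ms:.2f} ms (165 Hz scan) or {scan_60_ms:.2f} ms (60 Hz scan)")
```

I.e., the reading sits between the panel's fastest possible scanout and a true 60Hz scanout, which is exactly what makes it puzzling.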

jorimt
Posts: 1130
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 01 Jun 2020, 09:07

hmukos wrote:
01 Jun 2020, 08:03
Wait, isn't your picture only about the fixed scan rate case? Where would the advantage in first reaction come from in the XG248 case in Chief's picture below?
Because I'm pretty sure it would then look something like this:

Image

While a single 60Hz scanout now occurs across four 240Hz scanout cycles, and is thus lower lag than compressing it into a single 240Hz scanout (one per every four), it still takes a total of 16.6ms to complete; you would only get similar readings between the two when the sample occurs near the top of the scanout (which does happen), but 60Hz still has a higher possible maximum regardless.
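To put rough numbers on that (a simplified model, assuming a linear scanout and ignoring blanking intervals):

```python
# Simplified model: latency of a single photodiode sample as a function of
# its vertical position within one scanout, ignoring blanking intervals.
def sample_latency_ms(vertical_pos: float, scan_hz: float) -> float:
    """vertical_pos in [0, 1]: 0 = top of screen, 1 = bottom."""
    return vertical_pos * 1000.0 / scan_hz

for pos in (0.0, 0.25, 0.5, 1.0):
    print(f"pos {pos:.2f}: {sample_latency_ms(pos, 60):5.2f} ms @60Hz scan, "
          f"{sample_latency_ms(pos, 240):.2f} ms @240Hz scan")
```

Near the top (pos ≈ 0) the two read about the same; at the bottom, the 60Hz scanout maxes out around 16.7ms versus about 4.2ms for the 240Hz scanout.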
hmukos wrote:
01 Jun 2020, 08:03
Also, one person I know has a 165Hz/1440p monitor with G-SYNC (Dell S2417DG). I linked this thread to him, and he decided to test his monitor with the 960fps camera on his phone; he measured that when he enabled 60Hz mode, the panel scanout time was 10-11ms (in both the 60Hz G-SYNC on and 60Hz G-SYNC off cases). So it's quite confusing. Is there a particular reason a manufacturer would configure the scaler's scan rate for 60Hz not at the minimum 1/165s, nor at the maximum 1/60s, but somewhere in the middle?
I'm guessing his phone simply isn't accurate enough to capture an entire scanout down to the ms (too much error margin). It's probably still 16.6ms, otherwise it wouldn't be 60Hz.

Also, with G-SYNC, 60Hz operation in this regard is a non-issue anyway; keep it at 165Hz and limit the FPS to 60, and you get similar behavior to the XG248. I.e., with G-SYNC available, you never want to actually lower your max physical refresh rate to achieve a lower "refresh" rate; use an FPS limit instead.
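The trade-off is easy to see numerically (a sketch with idealized intervals, not measured values):

```python
# Sketch: G-SYNC at max refresh with an FPS cap. Frame *cadence* follows the
# cap, while each delivered frame still scans out at the panel's max rate.
max_refresh_hz, fps_cap = 165, 60

frame_interval_ms = 1000 / fps_cap         # average time between frames
scanout_ms        = 1000 / max_refresh_hz  # time to paint each frame

print(f"one frame every {frame_interval_ms:.1f} ms, "
      f"each scanned out in {scanout_ms:.2f} ms")
```

So you keep the 60 FPS cadence, but each frame hits the screen in roughly 6ms instead of 16.7ms.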

hmukos
Posts: 25
Joined: 16 Apr 2020, 15:41

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by hmukos » 01 Jun 2020, 11:20

jorimt wrote:
01 Jun 2020, 09:07
Because I'm pretty sure it would then look something like this:

pic

While a single 60Hz scanout now occurs across four 240Hz scanout cycles, and is thus lower lag than compressing it into a single 240Hz scanout (one per every four), it still takes a total of 16.6ms to complete; you would only get similar readings between the two when the sample occurs near the top of the scanout (which does happen), but 60Hz still has a higher possible maximum regardless.
As I understood it, the variance was because of the mismatch of cable and panel scanout (3/4 of the time the panel wasn't scanning out anything, and all the inputs that fell in that range added additional delay). Now in this case the panel is always scanning out (panel = cable). You and Chief just said that in this case Min and Max will narrow. But looking at your last pic, it again looks like there should be really big variation.
jorimt wrote:
01 Jun 2020, 09:07
It's probably still 16.6ms, otherwise it wouldn't be 60Hz.
This is a bit confusing. Wasn't Chief explicitly saying that there are many cases where there is a 60Hz mode but a really fast (1/240s) panel scanout?
jorimt wrote:
01 Jun 2020, 09:07
I'm guessing his phone simply isn't accurate enough to capture an entire scanout down to the ms (too much error margin).
His phone was precise enough (true 960fps at low resolution and 200ms max capture time).
jorimt wrote:
01 Jun 2020, 09:07
Also, with G-SYNC, 60Hz operation in this regard is a non-issue anyway; keep it at 165Hz and limit the FPS to 60, and you get similar behavior to the XG248. I.e., with G-SYNC available, you never want to actually lower your max physical refresh rate to achieve a lower "refresh" rate; use an FPS limit instead.
Yes, of course. It was set only for testing purposes.

Chief Blur Buster
Site Admin
Posts: 7712
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 01 Jun 2020, 12:29

jorimt wrote:
01 Jun 2020, 09:07
hmukos wrote:
01 Jun 2020, 08:03
Wait, isn't your picture only about fixed scan rate case? Where would advantage in first reaction come in the XG248 case in Chief's picture below?
Because I'm pretty sure it would then look something like this:

Image
Sync technology matters.
And the lag measurement method matters (single-point photodiode versus first-anywhere measurement).

Corresponding latency for a game stimulus, assuming a full-screen reaction (so first-anywhere is identical):
For VSYNC OFF, it won't be delayed. The new frame will tear earlier at 60Hz, but with a tear slice 1/4th the height of the 240Hz frameslice.

Now imagine the frameslices are color coded. The color code location will determine the location of the lag. The image is vertically stretched in the diagram only to align the scanout positions between 60Hz and 240Hz. Frameslices are 4x taller at 240Hz cable scanout than at 60Hz cable scanout.

To make this easier to imagine, it'll be the exact same aligned frameslice. Instead of grey/dark-grey, imagine a separate color for each frameslice. The color represents the lag of that frameslice, relative to the top edge.

So 4.2ms is only 1/4th down the 60Hz scanout.

So, TL;DR for VSYNC OFF, for a T+3.5ms or T+4.0ms sample, as indicated in red:

From a screen vertical position perspective:
(A) 60Hz visibility will be nearer the top edge of the screen, but as a tiny frameslice
(B) 240Hz visibility will be nearer the bottom edge of the screen, but as a 4x-height frameslice

From a latency perspective:
60Hz and 240Hz frameslice absolute latency is the same (albeit with different vertically-stretched latency gradients)

However, real-world latency to human vision and full-screen cameras is lower because of the 4x-height frameslice delivering more imagery per refresh cycle at 240Hz, despite the same absolute latency. This creates lower "first on-screen reaction" latencies at 240Hz.

The problem is game reactions aren't always global fullscreen changes, which interacts with the fact that not all pixels refresh at the same time, as seen in those videos. Since first-anywhere reactions are sometimes tiny objects, and 60Hz frameslices will be 1/4th the height of 240Hz frameslices at the same framerate, 60Hz will usually miss those reactions happening elsewhere. Therefore, statistically sampled over hundreds and thousands of samples (like G-SYNC 101), the "first-anywhere" reactions will always be lower for 240Hz, even if absolute latency is identical for 60Hz and 240Hz (pixel for pixel), and even for scanrate-multisync panels! The faster scanout spews more pixels per frameslice at higher Hz, increasing the statistical likelihood that a game stimulus is displayed as photons sooner to human eyeballs.
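That statistical effect can be sketched with a toy Monte Carlo model (my simplification, not the G-SYNC 101 rig: a tiny stimulus at a random vertical position and random time, VSYNC OFF, back-to-back scanouts, zero render/pipeline lag assumed):

```python
import random

def mean_first_anywhere_ms(scan_hz, fps, trials=200_000, seed=1):
    """Mean time until the scanout beam first paints a tiny stimulus pixel
    with a post-stimulus frame. Toy model: random stimulus time and vertical
    position, VSYNC OFF, back-to-back scanouts, no render/pipeline lag."""
    rng = random.Random(seed)
    t_scan, t_frame = 1000.0 / scan_hz, 1000.0 / fps  # both in ms
    total = 0.0
    for _ in range(trials):
        wait_for_frame = rng.random() * t_frame  # until the next rendered frame
        beam_phase     = rng.random() * t_scan   # until the beam next crosses the pixel
        # Beam crossings of the pixel repeat every t_scan; take the first
        # crossing at/after the frame containing the reaction:
        extra = (beam_phase - wait_for_frame) % t_scan
        total += wait_for_frame + extra
    return total / trials

for scan_hz in (60, 240):
    print(f"{scan_hz:>3} Hz scanout @ 240 fps: "
          f"~{mean_first_anywhere_ms(scan_hz, 240):.1f} ms mean first-anywhere")
```

Under this model the mean works out to roughly half a frametime plus half a scanout: about 10.4ms for a 60Hz scan versus about 4.2ms for a 240Hz scan at the same 240fps, matching the statistical advantage described above.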

To throw another complication into matters, frameslices are individual latency gradients, so 2000fps at consistent frametimes = a latency gradient of 1/2000sec from the top edge to the bottom edge of each frameslice.
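I.e., the gradient per frameslice is simply one frametime (a trivial sketch; function name is mine):

```python
# Each frameslice is its own top-to-bottom latency gradient of one frametime.
def frameslice_gradient_ms(fps: float) -> float:
    """Top-to-bottom latency gradient within one frameslice, in milliseconds."""
    return 1000.0 / fps

for fps in (240, 1000, 2000):
    print(f"{fps:>4} fps -> {frameslice_gradient_ms(fps):.2f} ms gradient per frameslice")
```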
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

jorimt
Posts: 1130
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 01 Jun 2020, 12:40

Chief Blur Buster wrote:
01 Jun 2020, 12:29
Sync technology matters.

For G-SYNC and VSYNC ON, yep!

For VSYNC OFF, nope.

(It'll be the same aligned frameslice)
To clarify, I didn't mean the same sample in both instances (e.g. as if they were simultaneously running side by side); I meant the possible difference in positional appearance of a single sample (regardless of cable/panel method) when measuring with the "first on-screen reaction" method inside a single scanout cycle, respective of each scanout's max refresh rate:

Image

In other words, I'm talking about the possible maximum reading differences between separate occurrences of a single sample per individual scanout cycle, not about the two scanouts being theoretically linked/synced with the same sample placement. E.g. a single sample within a single scanout cycle can appear anywhere between 0 - 16.6ms @60Hz and 0 - 4.2ms @240Hz.

Chief Blur Buster
Site Admin
Posts: 7712
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 01 Jun 2020, 12:47

Yes, different contexts -- I've edited my post. Can you reread it and let me know if I've adequately clarified?

Lag measurement methodology is actually a tricky booby-trapped box, where latencies are both same AND different simultaneously.

jorimt
Posts: 1130
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 01 Jun 2020, 13:41

Chief Blur Buster wrote:
01 Jun 2020, 12:47
Can you reread my post and let me know if I've adequately clarified?
Yup, looks good to me.
Chief Blur Buster wrote:
01 Jun 2020, 12:47
Lag measurement methodology is actually a tricky booby-trapped box, where latencies are both same AND different simultaneously.
Indeed ;)

hmukos
Posts: 25
Joined: 16 Apr 2020, 15:41

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by hmukos » 01 Jun 2020, 13:57

I thought that "sample" meant the time at which the frame was updated with a global reaction. But by "sample" you meant that the reaction was only at that vertical position of the frame, right? If so, then I understand what you both were talking about. If the change happens only in a tiny portion of the frame, then indeed at 60Hz the first reaction will be slower on average, and I see why. Thanks, jorimt, Chief!

jorimt
Posts: 1130
Joined: 04 Nov 2016, 10:44

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 01 Jun 2020, 15:12

hmukos wrote:
01 Jun 2020, 13:57
I thought that "sample" meant time at which frame was updated with global reaction. But you meant by "sample" that the reaction was only at that vertical position of the frame, right?
A single "sample" with the "first on-screen reaction" method is considered the "first" discernible movement of the "first" frame update that appears at any given vertical position screen-wide, due directly to the last user input. The remaining duration of that frame update (and/or any further frame updates) is ignored until the screen becomes "static" again; repeat.

This is why it was so important to pick a static scene and keep it static (no idle character animations, foreground activity, etc.), to be able to spot these changes (especially with V-SYNC off) without any possible false positives.

The goal is simply to see how quick the click-to-on-screen reaction is. It's actually a pretty harsh method, and a worst-case scenario for all scenarios tested, but the most accurate in this context.

When testing with any syncing method, the first vertical update is always going to appear at the very top of the screen (which is easy to spot), whereas when testing with V-SYNC off, the first vertical update can appear anywhere on the vertical axis (and often does), which can be much harder to spot, especially where the tear slices become very thin at certain framerate/refresh rate ratios.
