
Re: Why do Pro gamers not use G-Sync?

Posted: 13 Aug 2020, 14:54
by jorimt
speancer wrote:
13 Aug 2020, 14:15
You can't deny that no sync is superior overall, with the least possible lag that can be achieved. There is a small difference between G-SYNC and no sync with no fps limit.
I don't think I denied that in any of my previous posts here. I was only referring to G-SYNC vs. no sync with framerates inside the refresh rate.

No sync is indeed "superior" to G-SYNC in raw input lag with framerates well above the refresh rate, even if not by very much at 240Hz+ (and my graphs are generous to no sync, since they count first reaction instead of middle screen, where the differences would be more equalized). But the gap between it and G-SYNC closes as the max refresh rate increases, due to the ever-shrinking duration of individual scanout cycles.

I've mentioned this many times before, but once we hit 1000Hz, there will be virtually no difference between G-SYNC and no sync, at which point you really won't need any syncing method to prevent visible tearing artifacts.
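
As a back-of-the-envelope illustration (my own sketch, not a measurement: it assumes the average VRR-versus-no-sync penalty is half of one scanout cycle, i.e. 0.5/maxHz, a model elaborated later in this thread):

# Back-of-envelope sketch (assumed model, not measured data): treat the
# average VRR latency penalty versus no sync as half a scanout cycle,
# i.e. 0.5 / maxHz, and watch it shrink as the refresh rate climbs.
for max_hz in (144, 240, 360, 1000):
    penalty_ms = 0.5 / max_hz * 1000
    print(f"{max_hz:>4} Hz: ~{penalty_ms:.2f} ms average scanout penalty")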

G-SYNC is ultimately a tearing-prevention stopgap for currently achievable framerate/refresh rate ratios, much like strobing is for motion clarity. Both will ultimately be circumvented and replaced by ultra-high refresh rates + framerates (be it via frame amplification techniques and/or native performance) in the future.

Re: Why do Pro gamers not use G-Sync?

Posted: 13 Aug 2020, 22:06
by speancer
jorimt wrote:
13 Aug 2020, 14:54
speancer wrote:
13 Aug 2020, 14:15
You can't deny that no sync is superior overall, with the least possible lag that can be achieved. There is a small difference between G-SYNC and no sync with no fps limit.
I don't think I denied that in any of my previous posts here. I was only referring to G-SYNC vs. no sync with framerates inside the refresh rate.
I've never said you denied that; I just meant it's rather undeniable in general :P
Chief Blur Buster wrote:
25 Apr 2017, 19:28
Personally, I have no love for VSYNC OFF stutter-wise but it reigns supreme in input lag. Until we have 1000Hz gaming monitors, VSYNC OFF probably is still going to continue to reign supreme in PC-gaming eSports.

Re: Why do Pro gamers not use G-Sync?

Posted: 13 Aug 2020, 22:24
by jorimt
speancer wrote:
13 Aug 2020, 22:06
I've never said you denied that; I just meant it's rather undeniable in general :P
Gotcha, and yup, I said as much in my original article over three years ago...

"G-SYNC 101: G-SYNC vs. V-SYNC OFF":
https://blurbusters.com/gsync/gsync101- ... ettings/9/
So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more.

Re: Why do Pro gamers not use G-Sync?

Posted: 20 Aug 2020, 07:22
by Faskill
Is there any use in using GSync when playing at fps significantly below the monitor frequency?
For example, when playing at 150fps on a 240Hz monitor. These are only my first-hand impressions, but I feel like my monitor is more reactive when it's not trying to match its frequency to my lower fps count (G-SYNC off).

Re: Why do Pro gamers not use G-Sync?

Posted: 20 Aug 2020, 08:33
by jorimt
Faskill wrote:
20 Aug 2020, 07:22
Is there any use in using GSync when playing at fps significantly below the monitor frequency?
For example, when playing at 150fps on a 240Hz monitor.
That's G-SYNC's primary intended use. It's what it was originally created for.
Faskill wrote:
20 Aug 2020, 07:22
These are only my first-hand impressions, but I feel like my monitor is more reactive when it's not trying to match its frequency to my lower fps count (G-SYNC off).
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.

Re: Why do Pro gamers not use G-Sync?

Posted: 14 Dec 2020, 20:31
by pellson
jorimt wrote:
20 Aug 2020, 08:33
Faskill wrote:
20 Aug 2020, 07:22
Is there any use in using GSync when playing at fps significantly below the monitor frequency?
For example, when playing at 150fps on a 240Hz monitor.
That's G-SYNC's primary intended use. It's what it was originally created for.
Faskill wrote:
20 Aug 2020, 07:22
These are only my first-hand impressions, but I feel like my monitor is more reactive when it's not trying to match its frequency to my lower fps count (G-SYNC off).
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.
This is not true.
I have the VG27AQ and it 100% has an extremely small increase in input lag just from enabling VRR in the OSD. No, I don't want to hear "that's because you have to cap fps beneath max, blah blah"; I already know all this stuff about how vsync works.

I can instantly feel my mouse get a bit more responsive in Windows by disabling it in the OSD. It's so small most won't notice, I guess, but it drove me insane coming from the VG258QE; maybe it's 3ms, but noticeable for an OCD guy like me.

Also, RTINGS mentions this in their review of the VG27AQL1.

VRR enabled: 5.7 ms
VRR disabled: 3.3 ms

Finally, after months of ONLY reading comments like "there is no difference, you are hitting the vsync roof, cap your fps, blah blah," I have found someone who agrees with me.

Sorry for the bad English, and also for writing on mobile.

Re: Why do Pro gamers not use G-Sync?

Posted: 14 Dec 2020, 22:06
by Chief Blur Buster
jorimt wrote:
20 Aug 2020, 08:33
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.
pellson wrote:
14 Dec 2020, 20:31
I have the VG27AQ and it 100% has an extremely small increase in input lag just from enabling VRR in the OSD. No, I don't want to hear "that's because you have to cap fps beneath max, blah blah"; I already know all this stuff about how vsync works.
So depending on interpretation/language barrier, both YOU *and* Jorim are right. ;)

Terminologically, there isn't much difference between the two positions. It's a matter of interpretation, given the English language (and its strange nuances), and confusion can arise when the writer's first language isn't English.

I think there's a misunderstanding. "When it tears, there's your one and only advantage" -- that advantage is the lower latency jorim is agreeing with you on, as a tradeoff of the tearing.

There is a slight sync-technology-related average absolute latency penalty for VRR versus VSYNC OFF corresponding to one scanout halftime (e.g. a 0.5/155 sec latency penalty on a 155Hz display), given randomized tearline placement along the scanout.

The whole screen is a vertical latency gradient that is typically [0...1/maxHz] on most gaming monitors, at least the ones that have realtime synchronization between cable scanout and panel scanout.

For esports-friendly VRR, I highly recommend higher Hz such as 240Hz or 360Hz, since the shrinking scanout halftime (0.5/360) makes the VRR-induced scanout latency penalty almost negligible.

That said, VRR is the world's lowest-lag "non-VSYNC-OFF" sync technology, generally lower lag than everything other than VSYNC OFF.

Not all pixels on a display panel refresh at the same time. Both VSYNC OFF and VRR have the same latency for scanline #001 (the top edge for VRR, and the first scanline of a frameslice for VSYNC OFF). During VSYNC OFF, each frameslice is a latency gradient of [0...frametime] milliseconds, while for any other sync technology, the whole screen surface is a [0...1/maxHz] latency gradient. So the beginning of the latency gradient is identical for both (a +0 adder), but with VSYNC OFF that +0 point can land closer to the center of the screen, thanks to a tearline occurring right above the crosshairs, for example. That's the advantage that tearing gives you.
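
To make that concrete, here is a minimal Python model of the geometry just described (a hypothetical sketch, not Blur Busters measurement code; the parameter names and the tearline-at-top assumption are mine):

# Minimal model of the two latency gradients described above.
# Hypothetical sketch: assumes the panel sweeps top-to-bottom in
# exactly 1/max_hz and, for VSYNC OFF, that a frameslice boundary
# sits at the very top of the screen.
def latency_gradient(sync, max_hz, fps, samples=5):
    """Latency (ms) at evenly spaced points from screen top to bottom."""
    scanout = 1000.0 / max_hz   # one full top-to-bottom sweep, in ms
    frametime = 1000.0 / fps    # time between new frames, in ms
    points = [i / (samples - 1) for i in range(samples)]
    if sync == "VRR":
        # Whole screen is one [0...1/maxHz] gradient.
        return [p * scanout for p in points]
    # VSYNC OFF: each frameslice restarts the gradient at its tearline,
    # so latency within a slice spans [0...frametime].
    slice_frac = frametime / scanout  # slice height as screen fraction
    return [(p % slice_frac) * scanout for p in points]

print(latency_gradient("VRR", 144, 100))        # 0 ... ~6.9 ms gradient
print(latency_gradient("VSYNC OFF", 144, 400))  # each slice 0 ... 2.5 ms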

However, given that over 180 countries read Blur Busters, many readers' languages differ, and not everyone realizes that latency is not a single number, it's true we sometimes need to give a bit more detailed answers nowadays. Devices such as LDAT and other testers often don't account for the latency gradient effect, since single-pixel photodiode tests aren't good at measuring the latency gradient of a random VSYNC OFF frameslice, but it is very visible in things like high speed videos (of Slo Mo Guys fame) or when putting multiple photodiodes simultaneously all over the screen.

First, screens refresh from top-to-bottom, as seen in high speed videos, which creates interesting behaviours for GSYNC versus VSYNC OFF.

[Image: Example of VSYNC OFF sub-divided latency gradients]

[Image: Scanout: GSYNC 100fps at 144Hz]

[Image: Scanout: VSYNC OFF 100fps at 144Hz]

As you can see, the latency gradients are different: [0...1/maxHz] for VRR versus [0...frametime] for VSYNC OFF. So the minimum latency adder (+0) is identical for both (Jorim is correct), but the maximum is different (pellson is correct).
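
To attach rough numbers (my arithmetic, using the 100fps-at-144Hz scenario from the diagrams above): the VRR gradient spans 0 to 1/144 ≈ 6.9 ms across the whole screen, while each VSYNC OFF frameslice spans 0 to 1/100 = 10 ms of frametime; at an esports-style 400fps, a frameslice would instead span only 0 to 2.5 ms. In every case the gradient begins at the same +0 adder, which is exactly the min-identical/max-different point being made here.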

Remember: Not All Pixels Refresh At The Same Time!!
Remember: Not All Pixels Are The Same Latency!!

pellson wrote:
14 Dec 2020, 20:31
Sorry for the bad English, and also for writing on mobile.
Many forum members here are not native English speakers.

I'm kind of a stickler about terminology around here to minimize disagreements, given the international nature of our audience, since latency is not a single number, nor is it the same for every pixel on the screen surface.

VSYNC OFF is more responsive for esports use because of its lower average latency (the midpoint of MIN...MAX), at the cost of motion fluidity / tearing. For average absolute latency, VSYNC OFF is hard to beat (especially great in games such as CS:GO), while G-SYNC (at extreme refresh rates) can still give an advantage in games that are stutter-prone enough to interfere with aiming.

Re: Why do Pro gamers not use G-Sync?

Posted: 16 Dec 2020, 15:27
by pellson
Chief Blur Buster wrote:
14 Dec 2020, 22:06
jorimt wrote:
20 Aug 2020, 08:33
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.
pellson wrote:
14 Dec 2020, 20:31
I have the VG27AQ and it 100% has an extremely small increase in input lag just from enabling VRR in the OSD. No, I don't want to hear "that's because you have to cap fps beneath max, blah blah"; I already know all this stuff about how vsync works.
So depending on interpretation/language barrier, both YOU *and* Jorim are right. ;)

Terminologically, there isn't much difference between the two positions. It's a matter of interpretation, given the English language (and its strange nuances), and confusion can arise when the writer's first language isn't English.

I think there's a misunderstanding. "When it tears, there's your one and only advantage" -- that advantage is the lower latency jorim is agreeing with you on, as a tradeoff of the tearing.

There is a slight sync-technology-related average absolute latency penalty for VRR versus VSYNC OFF corresponding to one scanout halftime (e.g. a 0.5/155 sec latency penalty on a 155Hz display), given randomized tearline placement along the scanout.

The whole screen is a vertical latency gradient that is typically [0...1/maxHz] on most gaming monitors, at least the ones that have realtime synchronization between cable scanout and panel scanout.

For esports-friendly VRR, I highly recommend higher Hz such as 240Hz or 360Hz, since the shrinking scanout halftime (0.5/360) makes the VRR-induced scanout latency penalty almost negligible.

That said, VRR is the world's lowest-lag "non-VSYNC-OFF" sync technology, generally lower lag than everything other than VSYNC OFF.

Not all pixels on a display panel refresh at the same time. Both VSYNC OFF and VRR have the same latency for scanline #001 (the top edge for VRR, and the first scanline of a frameslice for VSYNC OFF). During VSYNC OFF, each frameslice is a latency gradient of [0...frametime] milliseconds, while for any other sync technology, the whole screen surface is a [0...1/maxHz] latency gradient. So the beginning of the latency gradient is identical for both (a +0 adder), but with VSYNC OFF that +0 point can land closer to the center of the screen, thanks to a tearline occurring right above the crosshairs, for example. That's the advantage that tearing gives you.

However, given that over 180 countries read Blur Busters, many readers' languages differ, and not everyone realizes that latency is not a single number, it's true we sometimes need to give a bit more detailed answers nowadays. Devices such as LDAT and other testers often don't account for the latency gradient effect, since single-pixel photodiode tests aren't good at measuring the latency gradient of a random VSYNC OFF frameslice, but it is very visible in things like high speed videos (of Slo Mo Guys fame) or when putting multiple photodiodes simultaneously all over the screen.
Thanks for a well-explained post ❤️
But I'm not talking about when VRR is engaged (fps below max Hz). I run all my games at fps well above the max Hz of my monitor, so VRR should be disengaged at all times. But still, I notice a very, very small difference, but enough, when using the cursor in Windows or playing CS:GO.

I hope you understand what I mean; this mostly messes with me in CS:GO, where I have 400fps and have to play competitively.

In single-player games I don't care that much about such a small input lag; hell, I play those with vsync :)

Sorry, it's hard to explain, but I'm talking about all scenarios with vsync off and 400fps; the only difference is that VRR is disabled in the monitor OSD.

Re: Why do Pro gamers not use G-Sync?

Posted: 16 Dec 2020, 15:52
by jorimt
pellson wrote:
16 Dec 2020, 15:27
I'm talking about all scenarios with vsync off and 400fps; the only difference is that VRR is disabled in the monitor OSD.
Then when you replied to me with "This is not true," you were misunderstanding my previous posts, because I was referring to G-SYNC mode on/off within the refresh rate, and more specifically, to native G-SYNC monitors.

On native G-SYNC monitors, the module replaces the panel scaler, so whether G-SYNC mode is on or off, the signal goes through the same component.

With G-SYNC Compatible FreeSync monitors, however, there is no module, just the panel scaler, and VRR implementation is predominantly handled by the display driver itself on the GPU side instead.

As such, it's theoretically possible there could be an input lag difference with VRR on/off (unrelated to sync-induced lag) on certain FreeSync displays; it just hasn't been formally or thoroughly tested.

Re: Why do Pro gamers not use G-Sync?

Posted: 16 Dec 2020, 19:03
by pellson
jorimt wrote:
16 Dec 2020, 15:52
pellson wrote:
16 Dec 2020, 15:27
I'm talking about all scenarios with vsync off and 400fps; the only difference is that VRR is disabled in the monitor OSD.
Then when you replied to me with "This is not true," you were misunderstanding my previous posts, because I was referring to G-SYNC mode on/off within the refresh rate, and more specifically, to native G-SYNC monitors.

On native G-SYNC monitors, the module replaces the panel scaler, so whether G-SYNC mode is on or off, the signal goes through the same component.

With G-SYNC Compatible FreeSync monitors, however, there is no module, just the panel scaler, and VRR implementation is predominantly handled by the display driver itself on the GPU side instead.

As such, it's theoretically possible there could be an input lag difference with VRR on/off (unrelated to sync-induced lag) on certain FreeSync displays; it just hasn't been formally or thoroughly tested.
You're right, I'm sorry.
I'm using the VG27AQ, which uses FreeSync.
I'm a little curious why nobody seems to acknowledge this phenomenon. This thread was the closest I found after hours of searching. Two of my friends couldn't feel a difference at all, but I certainly feel it's a bit sloppier in fast FPS games with VRR on in the OSD.

Sorry for coming across as an asshole; I just wanted to be understood after all the people telling me "just turn off Vsync" when that's not even what I'm talking about.