Why do Pro gamers not use G-Sync?

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Why do Pro gamers not use G-Sync?

Post by jorimt » 13 Aug 2020, 14:54

speancer wrote:
13 Aug 2020, 14:15
You can't deny that no sync is superior overall, with the least possible lag that can be achieved. There is a small difference between G-SYNC and no sync with no fps limit.
I don't think I denied that in any of my previous posts here. I was only referring to G-SYNC vs. no sync with framerates inside the refresh rate.

No sync is indeed "superior" to G-SYNC in raw input lag with framerates well above the refresh rate (even if not by much at 240Hz+, and my graphs are generous to no sync because they count first reaction instead of middle-screen, where the differences would be more equalized), but the gap between it and G-SYNC closes as the max refresh rate increases, due to the ever-shrinking duration of individual scanout cycles.

I've mentioned this many times before, but once we hit 1000Hz, there will be virtually no difference between G-SYNC and no sync, at which point you really won't need any syncing method to prevent visible tearing artifacts.
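To put rough numbers on that (a back-of-envelope sketch of my own, not a measurement): a full top-to-bottom scanout lasts 1/maxHz, so the average extra latency a whole-screen synced scanout can add shrinks in proportion as the refresh rate rises.

Code:

# Back-of-envelope sketch (illustration only; assumes scanout time = 1/maxHz
# and takes the midpoint of the [0...1/maxHz] gradient as the average penalty).
for max_hz in (144, 240, 360, 1000):
    scanout_ms = 1000.0 / max_hz      # one full top-to-bottom scanout
    avg_penalty_ms = scanout_ms / 2   # midpoint of the latency gradient
    print(f"{max_hz:>4} Hz: scanout {scanout_ms:5.2f} ms, avg penalty ~{avg_penalty_ms:4.2f} ms")

At 1000Hz the gradient midpoint is only ~0.5 ms, which is why the difference becomes effectively imperceptible.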

G-SYNC is ultimately a tearing-prevention stopgap for currently achievable framerate/refresh rate ratios, much as strobing is for motion clarity. Both will ultimately be circumvented and replaced by ultra-high refresh rates + framerates (be it via frame amplification techniques and/or native performance) in the future.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

speancer
Posts: 241
Joined: 03 May 2020, 04:26
Location: EU

Re: Why do Pro gamers not use G-Sync?

Post by speancer » 13 Aug 2020, 22:06

jorimt wrote:
13 Aug 2020, 14:54
[...]
I don't think I denied that in any of my previous posts here. I was only referring to G-SYNC vs. no sync with framerates inside the refresh rate.
I never said you denied that; I just meant it's rather undeniable in general :P
Chief Blur Buster wrote:
25 Apr 2017, 19:28
Personally, I have no love for VSYNC OFF stutter-wise but it reigns supreme in input lag. Until we have 1000Hz gaming monitors, VSYNC OFF probably is still going to continue to reign supreme in PC-gaming eSports.
Main display (TV/PC monitor): LG 42C21LA (4K 120 Hz OLED / WBE panel)
Tested displays: ASUS VG259QM/VG279QM [favourite LCD FPS display] (280 Hz IPS) • Zowie XL2546K/XL2540K/XL2546 (240 Hz TN DyAc) • Dell S3222DGM [favourite LCD display for the best blacks, contrast and panel uniformity] (165 Hz VA) • Dell Alienware AW2521HFLA (240 Hz IPS) • HP Omen X 25f (240 Hz TN) • MSI MAG251RX (240 Hz IPS) • Gigabyte M27Q (170 Hz IPS) • Acer Predator XB273X (240 Hz IPS G-SYNC) • Acer Predator XB271HU (165 Hz IPS G-SYNC) • Acer Nitro XV272UKV (170 Hz IPS) • Acer Nitro XV252QF (390 Hz IPS) • LG 27GN800 (144 Hz IPS) • LG 27GL850 (144 Hz nanoIPS) • LG 27GP850 (180 Hz nanoIPS) • Samsung Odyssey G7 (240 Hz VA)

OS: Windows 11 Pro GPU: Palit GeForce RTX 4090 GameRock OC CPU: AMD Ryzen 7 7800X3D + be quiet! Dark Rock Pro 4 + Arctic MX-6 RAM: 32GB (2x16GB dual channel) DDR5 Kingston Fury Beast Black 6000 MHz CL30 (fully optimized primary and secondary timings by Buildzoid for SK Hynix die on AM5 platform) PSU: Corsair RM1200x SHIFT 1200W (ATX 3.0, PCIe 5.0 12VHPWR 600W) SSD1: Kingston KC3000 1TB NVMe PCIe 4.0 x4 SSD2: Corsair Force MP510 960GB PCIe 3.0 x4 MB: ASUS ROG STRIX X670E-A GAMING WIFI (GPU PCIe 5.0 x16, NVMe PCIe 5.0 x4) CASE: be quiet! Silent Base 802 Window White CASE FANS: be quiet! Silent Wings 4 140mm PWM (3x front, 1x rear, 1x top rear, positive pressure) MOUSE: Logitech G PRO X Superlight (white) Lightspeed wireless MOUSEPAD: ARTISAN FX HIEN (wine red, soft, XL) KEYBOARD: Logitech G915 TKL (white, GL Tactile) Lightspeed wireless HEADPHONES: Sennheiser Momentum 4 Wireless (white) 24-bit 96 KHz + Sennheiser BTD600 Bluetooth 5.2 aptX Adaptive CHAIR: Herman Miller Aeron (graphite, fully loaded, size C)

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Why do Pro gamers not use G-Sync?

Post by jorimt » 13 Aug 2020, 22:24

speancer wrote:
13 Aug 2020, 22:06
I never said you denied that; I just meant it's rather undeniable in general :P
Gotcha, and yup, I said as much in my original article over three years ago...

"G-SYNC 101: G-SYNC vs. V-SYNC OFF":
https://blurbusters.com/gsync/gsync101- ... ettings/9/
So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more.
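As a rough illustration of why that 5x figure matters (hypothetical numbers of my own, not from the article itself): with VSYNC OFF each frameslice spans a [0...frametime] latency gradient, so at 5x the refresh rate the average frameslice latency is about one fifth of a synced scanout's average.

Code:

# Hypothetical comparison for the "5x or more" rule of thumb (illustration only).
# VSYNC OFF: each frameslice spans a [0...frametime] latency gradient.
# Synced: the whole screen spans a [0...1/maxHz] latency gradient.
max_hz = 240
fps = 5 * max_hz                        # sustained framerate at 5x refresh
vsync_off_avg_ms = (1000.0 / fps) / 2   # gradient midpoint: ~0.42 ms
synced_avg_ms = (1000.0 / max_hz) / 2   # gradient midpoint: ~2.08 ms
print(f"VSYNC OFF ~{vsync_off_avg_ms:.2f} ms vs synced ~{synced_avg_ms:.2f} ms average")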

Faskill
Posts: 10
Joined: 19 Aug 2020, 13:53

Re: Why do Pro gamers not use G-Sync?

Post by Faskill » 20 Aug 2020, 07:22

Is there any use in using G-SYNC when playing at fps significantly below the monitor's refresh rate?
For example, when playing at 150fps on a 240Hz monitor. This is only my first-hand impression, but I feel like my monitor is more reactive when it's not trying to match its refresh rate to my lower fps count (G-SYNC off)

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Why do Pro gamers not use G-Sync?

Post by jorimt » 20 Aug 2020, 08:33

Faskill wrote:
20 Aug 2020, 07:22
Is there any use in using G-SYNC when playing at fps significantly below the monitor's refresh rate?
For example, when playing at 150fps on a 240Hz monitor.
That's G-SYNC's primary intended use; it's what it was originally created for.
Faskill wrote:
20 Aug 2020, 07:22
This is only my first-hand impression, but I feel like my monitor is more reactive when it's not trying to match its refresh rate to my lower fps count (G-SYNC off)
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.

pellson
Posts: 4
Joined: 14 Dec 2020, 20:23

Re: Why do Pro gamers not use G-Sync?

Post by pellson » 14 Dec 2020, 20:31

jorimt wrote:
20 Aug 2020, 08:33
[...]
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.
This is not true.
I have the VG27AQ and it 100% has an extremely small increase in input lag just by enabling VRR in the OSD. No, I don't want to hear "that's because you have to cap fps beneath max, bla bla"; I already know all this stuff about how vsync works.

I can instantly feel my mouse get a bit more responsive in Windows by disabling it in the OSD. It's so small most won't notice, I guess, but it drove me insane coming from the VG258QE; maybe it's 3ms, but noticeable for an OCD guy like me.

Also, RTINGS mentions this in their review of the VG27AQL1:

VRR enabled: 5.7 ms
VRR disabled: 3.3 ms

Finally, after months of ONLY reading comments like "there is no difference, you are hitting the vsync roof, cap your fps, bla bla," I have found someone who agrees with me.

Sorry for bad English, and also writing on mobile.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Why do Pro gamers not use G-Sync?

Post by Chief Blur Buster » 14 Dec 2020, 22:06

jorimt wrote:
20 Aug 2020, 08:33
Your only "reactive" advantage over G-SYNC with G-SYNC off in that range is the visible tearing; when it tears, there's your one and only "advantage." Otherwise, there's no difference between the two performance/input lag-wise during normal operation.
pellson wrote:
14 Dec 2020, 20:31
I have the VG27AQ and it 100% has an extremely small increase in input lag just by enabling VRR in the OSD. No, I don't want to hear "that's because you have to cap fps beneath max, bla bla"; I already know all this stuff about how vsync works.
So depending on interpretation/language barrier, both YOU *and* jorimt are right. ;)

Terminologically, language-barrier-wise, there isn't much difference between the two positions. It's a matter of interpretation, given the English language (and its strange nuances), and confusion can arise when the writer's first language isn't English.

I think there's a misunderstanding. "When it tears, there's your one and only advantage" -- that lower latency is exactly what jorimt is agreeing with you on, as a tradeoff of the tearing.

There is a slight sync-technology-related average absolute latency penalty for VRR versus VSYNC OFF, corresponding to one scanout halftime (e.g. a 0.5/165 sec penalty on a 165Hz display), given randomized tearline placement along the scanout.

The whole screen is a vertical latency gradient that is typically [0...1/maxHz] on most gaming monitors, at least the ones that have realtime synchronization between cable scanout and panel scanout.

For esports-friendly VRR, I highly recommend higher Hz such as 240Hz or 360Hz, since the scanout halftime (0.5/360) begins to create an almost-negligible VRR-induced scanout latency penalty.

That said, VRR is the world's lowest-latency "non-VSYNC-OFF" sync technology, generally lower lag than everything else other than VSYNC OFF.

Not all pixels on a display panel refresh at the same time. Both VSYNC OFF and VRR have the same latency for scanline #001 (the top edge for VRR, and the first scanline of a frameslice for VSYNC OFF). During VSYNC OFF, each frameslice is a latency gradient of [0...frametime] milliseconds, while for any other sync technology, the whole screen surface is a [0...1/maxHz] latency gradient. So the beginning of the latency gradient is identical for both (a +0 adder), but with VSYNC OFF that +0 point can land closer to the center of the screen, thanks to a tearline occurring right above the crosshairs, for example. That's the advantage that tearing gives you.
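As a toy model of those two gradients (my own sketch; it simply restates the [0...1/maxHz] vs. [0...frametime] behaviour above, assuming a uniform scanout, evenly spaced tearlines, and framerates above the refresh rate):

Code:

# Toy model of per-scanline latency adders (illustration only).
# y = vertical screen position: 0.0 = top scanline, 1.0 = bottom scanline.

def synced_adder_ms(y, max_hz):
    # VRR / any synced mode: one whole-screen [0...1/maxHz] gradient.
    return y * 1000.0 / max_hz

def vsync_off_adder_ms(y, fps, max_hz):
    # VSYNC OFF with fps > maxHz: the gradient resets at every tearline;
    # each frameslice covers (maxHz / fps) of the screen height.
    slice_frac = max_hz / fps
    pos_in_slice = (y % slice_frac) / slice_frac
    return pos_in_slice * 1000.0 / fps  # [0...frametime] within each slice

for y in (0.0, 0.25, 0.5, 0.75, 0.99):
    print(f"y={y:4.2f}  synced {synced_adder_ms(y, 144):4.2f} ms  "
          f"VSYNC OFF {vsync_off_adder_ms(y, 500, 144):4.2f} ms")

The minimum adder (+0) is the same for both; only the shape and maximum of the gradient differ.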

However, given that over 180 countries read Blur Busters, many readers' first language isn't English, and not everyone realizes that latency is not a single number, it's true we sometimes need to give a bit more detailed answers nowadays. Devices such as LDAT and other testers often don't account for the latency-gradient effect, as single-pixel photodiode tests aren't good at measuring the latency gradient of a random VSYNC OFF frameslice, but it is very visible in things like high speed videos (of Slo Mo Guys fame) or when putting multiple photodiodes simultaneously all over the screen.

First, screens refresh from top-to-bottom, as seen in high speed videos, which creates interesting behaviours for GSYNC versus VSYNC OFF.

[Image: Example of VSYNC OFF sub-divided latency gradients]

[Image: Scanout of G-SYNC 100fps at 144Hz]

[Image: Scanout of VSYNC OFF 100fps at 144Hz]

As you can see, the latency gradients are different: [0...1/maxHz] for VRR versus [0...frametime] for VSYNC OFF. So the minimum latency adder (+0) is identical for both (jorimt is correct), but the maximum is different (pellson is correct).

Remember: Not All Pixels Refresh At The Same Time!!
Remember: Not All Pixels Are The Same Latency!!

pellson wrote:
14 Dec 2020, 20:31
Sorry for bad English, and also writing on mobile.
Many forum members here are not English-language-first.

I'm kind of a stickler about terminology around here to minimize disagreements, given the international nature of our audience, since latency is not a single number, nor is it the same for every pixel on the screen surface.

VSYNC OFF is more responsive for esports use because of its lower average latency (the midpoint of MIN...MAX) at the cost of motion fluidity / tearing. For average absolute latency, VSYNC OFF is hard to beat (especially great in games such as CS:GO), while G-SYNC (at extreme refresh rates) can still give an advantage in games that are stutter-prone enough to interfere with aiming.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


pellson
Posts: 4
Joined: 14 Dec 2020, 20:23

Re: Why do Pro gamers not use G-Sync?

Post by pellson » 16 Dec 2020, 15:27

Chief Blur Buster wrote:
14 Dec 2020, 22:06
So depending on interpretation/language barrier, both YOU *and* jorimt are right. ;)
[...]
Thanks for a well-explained post ❤️
But I'm not talking about when VRR is engaged (fps below max Hz). I run all my games at fps well above the max Hz of my monitor, so VRR should be disengaged at all times. But still, I notice a very, very small difference, but enough, when using the cursor in Windows or playing CS:GO.

I hope you understand what I mean; this mostly messes with me in CS:GO, where I have 400fps and have to play competitively.

In single-player games I wouldn't care that much about such a small input lag; hell, I play them with vsync :)

Sorry, it's hard to explain, but I'm talking about all scenarios with vsync off and 400fps, where the only difference is that VRR is disabled in the monitor OSD.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Why do Pro gamers not use G-Sync?

Post by jorimt » 16 Dec 2020, 15:52

pellson wrote:
16 Dec 2020, 15:27
I'm talking about all scenarios with vsync off and 400fps, where the only difference is that VRR is disabled in the monitor OSD.
Then when you replied to me "This is not true," you misunderstood my previous posts, because I was referring to G-SYNC mode on/off within the refresh rate, and more specifically, to native G-SYNC monitors.

On native G-SYNC monitors, the module replaces the panel scaler, so whether G-SYNC mode is on or off, the signal goes through the same component.

With G-SYNC Compatible FreeSync monitors, however, there is no module, just the panel scaler, and VRR implementation is predominantly handled by the display driver itself on the GPU side instead.

As such, it's theoretically possible there could be an input lag difference with VRR on/off (unrelated to sync-induced lag) on certain FreeSync displays; it just hasn't been formally or thoroughly tested.

pellson
Posts: 4
Joined: 14 Dec 2020, 20:23

Re: Why do Pro gamers not use G-Sync?

Post by pellson » 16 Dec 2020, 19:03

jorimt wrote:
16 Dec 2020, 15:52
[...]
As such, it's theoretically possible there could be an input lag difference with VRR on/off (unrelated to sync-induced lag) on certain FreeSync displays; it just hasn't been formally or thoroughly tested.
You're right, I'm sorry.
I'm using the VG27AQ, which uses FreeSync.
I'm a little curious why nobody seems to acknowledge this phenomenon. This thread was the closest I found after hours of searching. Two of my friends couldn't feel a difference at all, but I certainly feel it's a bit sloppier in fast FPS games with VRR on in the OSD.

Sorry for coming across as an asshole; I just wanted to be understood, after all the people telling me "just turn off Vsync" when that's not even what I'm talking about.
