[Thread Superseded] G-Sync 101 w/Chart (WIP)

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
Locked
User avatar
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: G-Sync 101 w/Chart (WIP)

Post by Chief Blur Buster » 30 Mar 2017, 11:36

kurtextrem wrote:imo your benchmarks show a (very tiny) difference between gsync + vsync off/on. If you can live with the tearing at the very bottom, why not take it?
At the bottom of the screen there's nothing too important where tearing would influence your aim.

This is also why I use Virtual Reality pre-rendered frames: 1 (not sure if it even does anything when not using VR stuff, but whatever), Maximum pre-rendered frames: 1, and Preferred refresh rate: Highest available. Every frame counts; FPS stability is up to your system (close expensive programs while gaming, use Bitsum's Highest Performance plan, and so on)
Yes.... At 144fps@144Hz, the difference is much smaller.

Still.... It's better to have a consistent lag than a lag that suddenly varies up/down -- varying lag throws off your aiming.

But there are also monitors that can only do up to 60fps G-SYNC (e.g. 4K 60Hz G-SYNC monitors or 100Hz G-SYNC ultrawides), so the lag increase is bigger. Some people feel it much more than others.
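To put rough numbers on that (my own back-of-envelope arithmetic, not measured figures): the length of one refresh cycle sets the scale of any sync-induced wait, so a 60Hz G-SYNC monitor pays more than a 144Hz one.

```python
# Back-of-envelope: one refresh cycle is the worst-case scale of any
# sync-induced wait, so lower-Hz G-SYNC monitors pay a bigger penalty.
def refresh_cycle_ms(refresh_hz):
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 144, 240):
    print(f"{hz:>3} Hz -> {refresh_cycle_ms(hz):.1f} ms per refresh cycle")
```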

---

On another topic -- non-G-SYNC -- FPS stability is extremely important too if you use strobe modes (e.g. ULMB) -- strobing pretty much needs VR-quality frame rate stability to look really good, as ULMB makes microstutters more visible than even non-G-SYNC, non-ULMB operation.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 30 Mar 2017, 11:47

Regarding the conversation taking place over the past couple of pages...

Again, G-Sync + v-sync off is basically a really fancy adaptive sync. And while there are advantages to it over plain old v-sync off (e.g. a game where your average fps is above the maximum refresh of the monitor, but sometimes drops), it's still gonna tear.

For those who are fully aware of this, and want it anyway, great. But for the less informed, who expect it to be a tear-free experience like G-Sync + v-sync on, the G-Sync + v-sync off option has triggered massive confusion, and has fueled claims (in less informed circles) ever since that G-Sync is broken.

Bottom line, G-Sync's aim is to display a single frame at a time, and to accomplish this, it sometimes must add a little delay (1-3 ms on average in my testing, if you only count bottom-of-screen updates; middle screen delivery is identical across scenarios) to some frames in order to phase delivery to exactly the height of the given screen. If it doesn't, you get simultaneous delivery of more than one frame, thus tearing.
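That phasing can be sketched roughly like this (a hypothetical simplification of my own; the function and numbers are illustrative, not NVIDIA's actual module logic):

```python
# Hypothetical sketch: if a new frame completes while the previous one is
# still scanning out, it must be held until scanout finishes, or two
# frames would share the screen (a tear line).
def phasing_delay_ms(frame_ready_ms, scanout_start_ms, scanout_ms):
    """Delay added so a frame starts only after the previous scanout ends."""
    scanout_end = scanout_start_ms + scanout_ms
    return max(0.0, scanout_end - frame_ready_ms)

# 144Hz panel: scanout takes ~6.9ms. A frame finishing 5ms into the
# previous scanout gets held ~1.9ms; one finishing after it gets 0.
print(round(phasing_delay_ms(5.0, 0.0, 6.9), 1))  # 1.9
print(phasing_delay_ms(8.0, 0.0, 6.9))            # 0.0
```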

G-Sync + v-sync on is the only scenario that offers a 100% tear-free experience, and for the majority of gamers, that 1-3 extra ms on average (again, counting bottom-of-screen updates only) is worth it for a (sync-induced) stutter/tear-free experience.

For competitive gamers who want to reduce latency by 1-3ms, and can sustain frames above the max refresh of the display at all times, no syncing is still the way to go.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

akirru
Posts: 7
Joined: 15 Mar 2017, 08:23

Re: G-Sync 101 w/Chart (WIP)

Post by akirru » 30 Mar 2017, 19:04

Thanks for such an informative post, jorimt ... If only Nvidia could be so clear with their choice of words, then we wouldn't be in this predicament :)

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 31 Mar 2017, 16:52

Thanks akirru, I'll be aggregating some of the information in my previous posts on this thread for use in my ongoing "G-Sync 101" article on blurbusters.com (http://www.blurbusters.com/gsync/gsync101/). Stay tuned...

MT_
Posts: 113
Joined: 17 Jan 2017, 15:39

Re: G-Sync 101 w/Chart (WIP)

Post by MT_ » 01 Apr 2017, 10:51

kurtextrem wrote:imo your benchmarks show a (very tiny) difference between gsync + vsync off/on. If you can live with the tearing at the very bottom, why not take it?
At the bottom of the screen there's nothing too important where tearing would influence your aim.

This is also why I use Virtual Reality pre-rendered frames: 1 (not sure if it even does anything when not using VR stuff, but whatever), Maximum pre-rendered frames: 1, and Preferred refresh rate: Highest available. Every frame counts; FPS stability is up to your system (close expensive programs while gaming, use Bitsum's Highest Performance plan, and so on)
Same here, highest clock speeds and such will definitely keep the tearing more stable, as there is no ramp-up time for the processor, and no parked cores.

But the Bitsum Highest Performance plan, and even the standard High Performance profile, only keeps your clock at maximum at all times if Windows has exclusive control over SpeedStep; if you have C1E and other power states active, it really does nothing.

I actually found (on my Kaby Lake) that disabling everything except C1E was way more reliable at keeping turbo boost at max clock in games at all times. It jumps from 800MHz to 3800MHz and instantly to 4900MHz ;)

I think the additional thing the Bitsum plan does is disable core parking, but I'm pretty sure OS core parking only parks HT virtual cores (assuming you have those), while hardware C-states can disable real cores and cut voltage.
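For reference, the Windows side of this can be inspected from an admin prompt (a sketch; the High Performance GUID below is the stock value as I recall it -- verify it with `/list` on your own machine):

```shell
# Show which power plan is currently active
powercfg /getactivescheme

# List all installed plans with their GUIDs
powercfg /list

# Switch to the stock High Performance plan (stock GUID, verify locally)
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c
```

Note that C1E and the deeper C-states themselves are typically toggled in the BIOS/UEFI, not here -- which is the point above: the power plan alone doesn't override them.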
LTSC 21H2 Post-install Script
https://github.com/Marctraider/LiveScript-LTSC-21H2

System: MSI Z390 MEG Ace - 2080 Super (300W mod) - 9900K 5GHz Fixed Core (De-lid) - 32GB DDR3-3733-CL18 - Xonar Essence STX II

User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: G-Sync 101 w/Chart (WIP)

Post by RealNC » 01 Apr 2017, 11:01

The difference is within margin of error. In this case, "margin of error" means the average input lag cost is 0.x milliseconds, since 1ms is the camera's resolution (1000FPS), and it can't be measured accurately. All we can say is that it's an average of 0.x ms, but we can place no actual value on that "x".

From a statistical analysis perspective, "margin of error" means that even something very small can show up as very big. Even if the actual value is 0.3, it can show up as 1.0 in the results if your resolution is 1.0.
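A toy illustration of that quantization (my own hypothetical numbers, not data from the actual tests): a 1000FPS camera counts time in whole 1ms frames, so the same 0.3ms delay can register as either 0ms or 1ms depending on where it falls relative to a frame boundary.

```python
import math

# Toy model: a 1000FPS camera quantizes time into 1ms frames, so a true
# sub-millisecond delay shows up as either 0 or 1 whole frame depending
# on its alignment with the frame boundaries.
def measured_frames(event_time_ms, response_delay_ms, resolution_ms=1.0):
    """Delay in whole camera frames between event and response."""
    start_frame = math.floor(event_time_ms / resolution_ms)
    end_frame = math.floor((event_time_ms + response_delay_ms) / resolution_ms)
    return end_frame - start_frame

# True delay is 0.3ms in both cases:
print(measured_frames(0.2, 0.3))  # 0 (0.2 -> 0.5 stays within frame 0)
print(measured_frames(0.8, 0.3))  # 1 (0.8 -> 1.1 crosses a frame boundary)
```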

In any event, it's safe to assume that it's not an amount of input lag that would actually matter.

Also, even if it is 1 "whole millisecond", it's still not important. 1ms makes no difference whatsoever to anything.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: G-Sync 101 w/Chart (WIP)

Post by Chief Blur Buster » 01 Apr 2017, 11:46

RealNC wrote:Also, even if it is 1 "whole millisecond", it's still not important. 1ms makes no difference whatsoever to anything.
Although mostly meaningless, it still makes an extremely tiny difference in the Olympics "cross-the-finish-line" sense. It's still possible for a 0.5ms difference to decide who crosses the Olympics finish line first.

Now the gaming equivalent of the Olympics: given an ultra-low-jitter, ultra-high-performance system, and well-matched reaction times in the top eSports leagues (human reaction time benchmarks of 150ms vs 151ms vs 152.5ms vs 153ms, etc.) -- such a narrow spread of reaction times on well-optimized systems (GTX Titans on 1080p 240Hz on older engines like CS:GO) -- a difference of 0.5ms can make a statistical difference in "draw" situations (shooting each other at the same time). It's often lost in all the statistical noise of competition gameplay tactics and skills, and of chance appearances of refresh cycles before each other (due to the 4.1ms granularity of 240Hz and the 8.3ms granularity of 120Hz) -- and scientific studies have only focused on amateur competition gamers rather than the paid professional leagues. However, mathematically, 0.5ms can still give you an advantage at the world-championship level due to the "cross-the-finish-line" effect in same-draw situations.
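A toy model of that "cross-the-finish-line" effect (my own Monte Carlo sketch with made-up reaction-time numbers, not a study): give two equally skilled players normally distributed reaction times, hand one of them an extra 0.5ms of system lag, and the win rate across many "draw" duels shifts measurably off 50/50.

```python
import random

# Toy Monte Carlo: two equally skilled players, reaction times
# ~N(150ms, 10ms); player B's system adds 0.5ms more lag than player A's.
def duel_win_rate(extra_lag_ms=0.5, trials=200_000, seed=1):
    """Fraction of duels won by the player with the lower-lag system."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        a = rng.gauss(150.0, 10.0)                 # player A total time
        b = rng.gauss(150.0, 10.0) + extra_lag_ms  # player B, 0.5ms slower
        if a < b:
            wins += 1
    return wins / trials

print(f"Player A wins {duel_win_rate():.1%} of duels")  # slightly over 50%
```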

While milliseconds definitely don't matter around here for players like you and me -- the point is we can't make an eSports player hate Blur Busters by telling them milliseconds don't matter. Even if you can't feel the millisecond, you can still win by that margin. (So beware, Blur Busters fans who attend eSports leagues: shout "milliseconds don't matter" at a world champion eSports player and you're going to get (perhaps rightfully) attacked back by them... because milliseconds do matter statistically over there, as explained above, regardless of opinion.)

But yes, we are hitting real points of diminishing returns.

User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: G-Sync 101 w/Chart (WIP)

Post by RealNC » 01 Apr 2017, 12:15

Professional-level "1ms counts" people won't be using G-Sync though. At LAN tournaments they have PCs that reliably push upwards of 400FPS in CS:GO. When there's a tournament where players are given machines that can't push upwards of 300FPS, you usually see a sh*tstorm about it on their Twitter accounts and/or Reddit.

As a side note, the 0.5ms difference doesn't count in "draw" situations, unless it falls in the corner case where that 0.5ms got you into the previous tick of the server's main loop. For CS:GO, that's 128 ticks per second, a granularity of 7.8ms. If your input is within that 7.8ms window of another player's, then firing first doesn't actually count. If you fire at the 2ms mark and your opponent at the 6ms mark, you can still die even though you fired earlier. This is not something that can be time-stamped, since the server can only rely on its own clock. The server uses arbitrary rules to decide how to resolve "draws." It's suspected that it's a fixed list of players, with priority given to whoever is at the top of the list.
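The tick arithmetic above can be sketched like this (the fixed-tick bucketing is my simplification of what the server actually does):

```python
# A 128-tick server groups all input arriving in the same ~7.8ms window
# into one tick, so firing 4ms earlier within a tick gives no timestamp
# advantage at all.
TICK_RATE = 128
TICK_MS = 1000.0 / TICK_RATE  # ~7.8125 ms

def tick_index(time_ms):
    """Which server tick an input lands in."""
    return int(time_ms // TICK_MS)

print(f"tick length: {TICK_MS:.4f} ms")
print(tick_index(2.0), tick_index(6.0))  # 0 0 -- same tick, a "draw"
print(tick_index(6.0), tick_index(9.0))  # 0 1 -- different ticks
```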

A very interesting (well, IMO) video analyzing this:

https://www.youtube.com/watch?v=_oWNIifrpYU

Not that any of the above matters, and it's slightly off-topic, but it's fun to talk about :)

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 01 Apr 2017, 13:52

No doubt the lower the input latency on the client side, the better.

But, as RealNC already mentioned, I think the ultimate deciding factor in online multiplayer is how accurate a given game's server lag compensation is. Some are better than others. I've seen video tests for certain games *cough* COD *cough* that make you wonder whether the player who technically shoots "first" even wins the battle in most cases.

I'm sure it varies.

cokeguy223
Posts: 2
Joined: 01 Apr 2017, 14:52

Re: G-Sync 101 w/Chart (WIP)

Post by cokeguy223 » 01 Apr 2017, 14:57

Hey guys I appreciate the article.

I have a 1080 TI + PG279Q that I just purchased.

I purchased it for a game I mainly play called Black Desert Online. This game is wildly unoptimized and framerates can go from 0 to 200+ at any given moment. I was wondering if you guys had any recommended settings for a game like this? I've tried Vsync ON, OFF, Fast. All of them produce quite a bit of microstuttering. I'm not feeling the benefits of G-Sync at all.

Locked