Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

[Thread Superseded] G-Sync 101 w/Chart (WIP)

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutters, tearing, and reduces input lag.

Re: G-Sync 101 w/Chart (WIP)

Postby Sparky » 07 May 2017, 03:38

kurtextrem wrote:The setting was discussed a few pages back, setting it to 1 introduces uneven frame times.

Reducing the amount of buffering between the game engine and the GPU does that, it's the price you pay for lower latency.
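That buffering/latency tradeoff can be put in rough numbers. A toy model (an illustration only, not how the NVIDIA driver actually schedules frames): each frame sitting in the pre-render queue delays the display of your latest input by roughly one frame time.

```python
# Toy model: extra input lag contributed by the pre-render queue.
# Assumes the queue is full and each queued frame costs one frame time.
def queue_latency_ms(queued_frames: int, fps: float) -> float:
    frame_time_ms = 1000.0 / fps
    return queued_frames * frame_time_ms

# At 60 FPS, going from 1 to 3 pre-rendered frames adds ~33 ms of lag
print(round(queue_latency_ms(1, 60.0), 1))  # 16.7
print(round(queue_latency_ms(3, 60.0), 1))  # 50.0
```

The assumption that the queue is always full only holds when the game is GPU-bound; that caveat is exactly what the posts below argue about.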
Sparky
 
Posts: 530
Joined: 15 Jan 2014, 02:29

Re: G-Sync 101 w/Chart (WIP)

Postby RealNC » 07 May 2017, 08:16

No uneven frame times here. If I don't set it to 1, I get higher input lag with g-sync. I've set it to 1 in the global profile.

It affects all cases and all frame limiters. If FPS is not actively held at the cap, input lag spikes when not using 1. In other words, while you're capped (100FPS at a 100FPS cap), there's no lag. Once the game dips to 96FPS, there's lag.

So setting this to 1 globally is highly recommended, otherwise you're going to get input lag spikes at random.
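RealNC's capped-vs-uncapped observation can be sketched as a toy model (made-up numbers for illustration, not measured driver behavior): while a frame limiter holds the framerate, the CPU waits each frame and the pre-render queue stays empty; once the game becomes GPU-bound below the cap, the queue fills up to the MPRF value.

```python
def input_lag_frames(gpu_bound: bool, mprf: int) -> int:
    # Toy model: 1 frame of baseline render/scan-out latency, plus
    # the pre-render queue, which only fills when the GPU is the bottleneck.
    queue_depth = mprf if gpu_bound else 0
    return 1 + queue_depth

# Holding a 100 FPS cap at 100 FPS: the MPRF setting doesn't matter
print(input_lag_frames(False, 1), input_lag_frames(False, 8))  # 1 1
# Dipping to 96 FPS (GPU-bound): MPRF 8 suddenly queues 7 extra frames
print(input_lag_frames(True, 1), input_lag_frames(True, 8))    # 2 9
```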
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
RealNC
 
Posts: 1234
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Postby kurtextrem » 07 May 2017, 11:16

The post I was referring to: viewtopic.php?f=5&t=3073&start=80#p24438
Asus PG248Q - Competitive Gamer
kurtextrem
 
Posts: 22
Joined: 05 Mar 2017, 03:35
Location: Munich, Germany

Re: G-Sync 101 w/Chart (WIP)

Postby RealNC » 07 May 2017, 12:40

I cannot confirm that finding. Setting it to 1 does not introduce uneven frame times for me.

Maybe if we could find a specific game that seems to behave badly, we could do some tests. But I have not found a game yet where 1 has any ill effects. The reverse is actually true; not setting it to 1 results in input lag when the game falls below the frame rate cap. So it goes from "snappy mouse" to "floaty mouse" during low-FPS situations, which is quite annoying. Having high variance in input lag is very noticeable for me.
RealNC
 
Posts: 1234
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Postby RealNC » 07 May 2017, 14:46

Not sure if mentioned, but BNS tested laptop vs desktop g-sync, and found that on laptops, g-sync increases input lag very slightly (3ms maximum):

https://www.youtube.com/watch?v=ADYzuMe17q8
RealNC
 
Posts: 1234
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Postby jorimt » 07 May 2017, 15:01

Regarding Max pre-rendered frames "1": ugh, this is why I've been hesitant to test it; as just the past few posts in this thread make evident, it simply depends. CPU, GPU, game, sustained framerate, you name it, it varies. But again, in my experience, it will make the CPU work harder, and that will cause more frametime spikes, at least in games prone to those instances in the first place (mostly open-world games with asset streaming systems).

And the setting isn't even directly connected to sync-induced input latency; it's a predominantly CPU-dependent setting. And since I'm testing for the input latency difference between syncing methods, it simply doesn't factor in there.

I'm currently counting through 2880 samples, which will account for only a portion of the samples featured in my upcoming input latency article. However, if I can fit in a brief test of that setting, I will, for at least one refresh rate.

@RealNC, The Witcher 3 has an asset streaming system that is prone to creating frequent frametime spikes (at least on initial load), and I've found Max pre-rendered frames "1" causes more frequent and longer spikes in the same areas than when it is set to default. Mileage may vary with all that, though.
Blur Busters Contributor - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
jorimt
 
Posts: 394
Joined: 04 Nov 2016, 10:44

Re: G-Sync 101 w/Chart (WIP)

Postby RealNC » 07 May 2017, 15:11

jorimt wrote:@RealNC, The Witcher 3 has an asset streaming system that is prone to creating frequent frametime spikes (at least on initial load), and I've found Max pre-rendered frames "1" causes more frequent and longer spikes in the same areas than when it is set to default. Mileage may vary with all that, though.

I have logged about 200 hours of W3 playtime :-)

No issues to report with MPRF 1. I have observed no difference in FPS fluctuations when using 1 vs 8. The only difference is the huge input lag of using 8.

But even if there were FPS fluctuations, if you think about it, frametime spikes are preferable to input lag increase when using g-sync. G-sync makes FPS fluctuations a non-issue. It can't help when input lag suddenly doubles though.

Btw, making the GPU work harder is a good thing. If it doesn't work at its maximum, then that means a bottleneck exists somewhere.
RealNC
 
Posts: 1234
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Postby jorimt » 07 May 2017, 15:20

I've got about 250 hours in it myself ;)

No, it's true that input response is zero during frametime spikes, so your character can't respond to your input anyway, but an increase in frametime spikes is worth considering if the system or game in question causes more of them than it would with MPRF at the default setting.

That setting isn't an easy "yes" or "no" answer, and the only reason I recommend default over "1" is so that the layman doesn't get frustrated at the "!@%$ performance" when he tries to max out a current triple-A game with MPRF "1" on a toaster. But then again, he'd have more problems than what MPRF is set to in that case :P
jorimt
 
Posts: 394
Joined: 04 Nov 2016, 10:44

Re: G-Sync 101 w/Chart (WIP)

Postby RealNC » 07 May 2017, 15:36

Hm, here's a thing I forgot to mention, since it's been a while; this is from my "good ol' vsync days", before I had a g-sync monitor.

MPRF@default would sometimes result in "uneven stutter" in some games. For example, imagine the game running at 60FPS for 300ms, then at 40FPS for 300ms, then 60, then 50, then 60 again, then 30, etc. The way I could tell is that once you get that "sweet 60FPS@60Hz vsync lock", you immediately know. Even if that happens for only a fraction of a second, you can tell.

Switching to MPRF 1 would result in "even stutter". That is, 40FPS sustained (or 50, or whatever). Whatever FPS the PC was able to produce at any given time, that would be what I'd get. Setting MPRF to anything higher than 1 seemed to make the driver or the game try to compensate for the low framerate, and the result was visually bad.

Now, I don't know if this is subjective or not, but to me, the "even stutter" was preferable to this "severe hiccup stutter" I was getting with MPRF@default. Games would seem to "buffer up" a couple frames, play them smoothly, then hiccup as the buffers got empty, rinse and repeat. MPRF@1 would give me what I call "good stutter" (an oxymoron, I know), where at least things were consistent, not a mix of "stutter-smooth-stutter-smooth".

I suppose my MPRF@1 experience would on paper look like sustained frame time spiking, while MPRF@default would make the spikes seem shorter. But in reality, the former had a much less offensive visual appearance than the latter. If the GPU or CPU (whatever the bottleneck was) isn't able to sustain a high frame rate, the delivery method of the frames it CAN sustain matters. And that delivery is visually less unpleasant (at least for me) at MPRF@1 compared to MPRF@default. In a 40FPS situation, I prefer getting 40 frames evenly distributed across 1000ms (MPRF 1) vs getting 30 frames during the first 500ms (so that's 60FPS for half a second) and then getting 10 frames for the next 200ms, and then whatever's left for the last 300ms.
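Using RealNC's own 40FPS figures, the two delivery patterns can be written out as frametime lists (a quick sketch; the empty 300ms remainder shows up as one long stall on the last frametime):

```python
import statistics

# 40 frames spread evenly over 1000 ms: every frametime is 25 ms
even = [1000.0 / 40] * 40

# 30 frames in the first 500 ms, 10 frames in the next 200 ms,
# then nothing for the final 300 ms (the stall lands on the last frametime)
bursty = [500.0 / 30] * 30 + [200.0 / 10] * 9 + [200.0 / 10 + 300.0]

print(sum(even), sum(bursty))               # both cover the full 1000 ms
print(max(even), max(bursty))               # worst frametime: 25 ms vs 320 ms
print(round(statistics.pstdev(bursty), 1))  # large spread vs 0.0 for `even`
```

Same average framerate either way, but the worst-case frametime and the frametime variance are far worse in the bursty pattern, which matches the "stutter-smooth-stutter-smooth" description above.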

The visual appearance of MPRF 1 during low FPS situations is less offensive to my eyes compared to MPRF@default.

However, all of the above was without g-sync. Just vsync. I'm not sure how g-sync changes the situation here.
RealNC
 
Posts: 1234
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Postby lexlazootin » 08 May 2017, 05:46

Just noticed that if you force V-Sync off in NVCP, you get tearing from the top of the screen if your fps goes below double framerate mode (45fps~).

Found this out speedrunning HL; when I set my fps to 5 it would tear all over :P
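The "double framerate mode" mentioned above is the frame-multiplying (low framerate compensation) region. A rough sketch of the idea (an illustration with assumed VRR-range numbers, not NVIDIA's actual logic): below the panel's minimum VRR refresh, each frame is repeated enough times to bring the effective refresh back into range.

```python
def repeated_refresh_hz(fps: float, vrr_min_hz: float = 30.0) -> float:
    # Toy sketch of frame doubling/multiplying: repeat each frame until
    # the effective refresh rate is back inside the VRR window.
    multiplier = 1
    while fps * multiplier < vrr_min_hz:
        multiplier += 1
    return fps * multiplier

print(repeated_refresh_hz(45.0))  # 45.0 -> already in range, shown once
print(repeated_refresh_hz(5.0))   # 30.0 -> each frame shown 6 times
```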
lexlazootin
 
Posts: 909
Joined: 16 Dec 2014, 02:57
