Re: G-Sync 101 w/Chart (WIP)
Posted: 29 Mar 2017, 23:15
by Haste
RealNC wrote:
akirru wrote: So would you say for the most part, unless you are some sort of pro gamer ... it might be best to go with G-sync + Vsync on and no cap?
If you are a pro gamer, you don't use g-sync.
It does not make sense to disable vsync when using g-sync. I don't even know why people wanted this in the first place.
IMO, it's because people are generally completely clueless and don't know what they actually want.

Why? What potential detrimental effect would gsync have for competitive gamers, if vsync is off anyway?

Re: G-Sync 101 w/Chart (WIP)
Posted: 29 Mar 2017, 23:45
by RealNC
Haste wrote: Why? What potential detrimental effect would gsync have for competitive gamers, if vsync is off anyway?

It means it doesn't get activated. Your average pro player plays at about 300FPS, and there are no 300Hz monitors, so G-Sync is always inactive.
It doesn't make sense to enable G-Sync in that case. Why enable it if you don't want it? If you wanted it, you'd be capping your frame rate to about 3FPS below your Hz, and for that you need vsync "on" in NVCP to avoid bottom tearing. Since you don't cap your FPS, you DON'T want G-Sync. So why enable it in the first place?
Again, makes zero sense to me.

Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 01:23
by Haste
For whenever the frametime goes above 1/refresh rate.
Can (does) happen even in games that have 300+ average fps.
I see it as a way to slightly reduce the negative impact of those occurrences. Kind of a safety feature.
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 01:46
by RealNC
Haste wrote: For whenever the frametime goes above 1/refresh rate.
Can (does) happen even in games that have 300+ average fps.
Why would you want g-sync in that case? Since the majority of the time you exceed the refresh rate and run vsync off, you get tearing. Why would you care about tearing for the case where your FPS falls below refresh? The only thing that changes with FPS>Hz vs FPS<Hz is the direction the tearline is moving. For example, assuming 100Hz, 110FPS has the same amount of tearing as 90FPS. The only difference is that in the 110FPS case the tearline is moving upwards, while in the 90FPS case it's moving downwards. Other than that, everything looks pretty much the same, so I have a hard time understanding why you would want the 90FPS case being synced and the 110FPS case being unsynced...
Also, letting gsync activate when FPS<Hz means a bit of input lag kicks in, which makes things less consistent. (G-Sync adds only a *little* input lag, but it's not zero!)
IMO, people who think G-Sync + vsync off is useful didn't think things through. Which I suspect is also why NVidia didn't initially even allow for that.
I've been playing CS:GO since it came out (about 1700 hours total playtime), and I can say the most important thing is consistency. It's the only game I play with g-sync off and vsync off. If instead I use gsync on + vsync off and the game drops below 165FPS (the monitor is 165Hz), it throws my aim off. It happens rarely, but when it does, I either overshoot or undershoot my flicks. If, on the other hand, I use G-Sync on + vsync on and "fps_max 160", it's better, because it's consistent.
PS:
Or I might be missing something fundamental here... Can't be ruled out entirely

Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 01:58
by Chief Blur Buster
With framebuffering, there's the potential for a sudden-lag-increase effect whenever the GSYNC maximum rate is hit -- for low-refresh-rate GSYNC (e.g. 4K 60Hz) the sudden lag increase is one or two frames (usually one, but I've heard two) -- e.g. +17ms or +33ms. Using VSYNC OFF solves this, and prevents sudden input-lag modulations when transitioning between below-max GSYNC and maxed-out GSYNC without an in-game framerate cap. But it somewhat defeats the stutter-elimination purpose of GSYNC, as VSYNC OFF microstutters suddenly reappear once you hit the maximum GSYNC rate.
User choice probably won out here, I would guess.
That said, if you're able to use an in-game framerate cap (usually the best kind!), then you never hit the max rate and the VSYNC ON/OFF choice hardly matters. An in-game framerate cap slightly below the maximum VRR rate, on either FreeSync or GSYNC, is usually the best approach -- we were the first website in the world to lag-test GSYNC, and the first to discover that in-game frame-capping below the GSYNC max rate avoids the sudden lag increase when hitting the GSYNC limit.
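The cap-below-max advice above boils down to simple arithmetic. Here is an illustrative sketch (the function name and the ~3FPS headroom default are mine, taken from the posts in this thread, not from any official tool):

```python
def suggested_fps_cap(max_refresh_hz, headroom_fps=3):
    """Frame cap that keeps a VRR display (G-SYNC/FreeSync) inside its
    variable range, so the max-rate sudden lag increase never kicks in."""
    return max_refresh_hz - headroom_fps

# 165Hz G-SYNC monitor -> cap around 162 FPS
print(suggested_fps_cap(165))  # 162

# On a 60Hz panel, hitting the max rate can add one or two full frames
# of lag: 1/60 s and 2/60 s, matching the +17ms / +33ms figures above.
print(round(1000 / 60, 1), round(2000 / 60, 1))  # 16.7 33.3
```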
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 01:59
by kurtextrem
IMO your benchmarks show a (very tiny) difference between gsync + vsync off/on. If you can live with the tearing at the very bottom, why not take it? At the bottom of the screen there's nothing too important that tearing could throw off your sights.
This is also why I use "Virtual Reality pre-rendered frames: 1" (not sure if it even does anything when not using VR stuff, but whatever), "Maximum pre-rendered frames: 1" and "Preferred refresh rate: Highest available". Every frame counts. FPS stability is up to your system (close expensive programs while gaming, use Bitsum's Highest Performance power plan, and so on).
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 02:05
by Haste
RealNC wrote:
Haste wrote: For whenever the frametime goes above 1/refresh rate.
Can (does) happen even in games that have 300+ average fps.
Why would you want g-sync in that case? Since the majority of the time you exceed the refresh rate and run vsync off, you get tearing. Why would you care about tearing for the case where your FPS falls below refresh? The only thing that changes with FPS>Hz vs FPS<Hz is the direction the tearline is moving. For example, assuming 100Hz, 110FPS has the same amount of tearing as 90FPS. The only difference is that in the 110FPS case the tearline is moving upwards, while in the 90FPS case it's moving downwards. Other than that, everything looks pretty much the same, so I have a hard time understanding why you would want the 90FPS case being synced and the 110FPS case being unsynced...
There is an important factor: the lower the frame rate, the bigger the offset between frames, which is why tearing is more jarring at low frame rates and less jarring at high frame rates.
So you might not be bothered much by tearing at 300fps, but bothered at, let's say, ~100fps.
And then there's the fact that synchronization ever so slightly reduces perceived judder/stutter even compared to vsync off. (It's nowhere near the dramatic improvement of gsync over vsync-induced micro-stuttering, but it's there.)
RealNC wrote:
Also, letting gsync activate when FPS<Hz means a bit of input lag kicks in, which makes things less consistent. (G-Sync adds only a *little* input lag, but it's not zero!)
Ah. That surely is a very good point. And that could easily overshadow the benefits I described above.
I wasn't aware of that side effect of gsync.
RealNC wrote:
IMO, people who think G-Sync + vsync off is useful didn't think things through. Which I suspect is also why NVidia didn't initially even allow for that.
I think that feature was also added to counter AMD, since FreeSync had it out of the box.
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 02:15
by RealNC
Haste wrote: There is an important factor. The lower the frame rate, the bigger the offset between frames. Which is why tearing is more jarring with low frame rates and less jarring with high frame rates. So you might not be bothered too much by tearing at 300fps but bothered at let's say ~100fps.
That's not true, actually. The offset is limited to the height of the monitor, since there's only one tearline. The only thing that changes is the direction the tear is moving.
To reduce the offset, you need multiple tearlines. That means you need FPS >= 2*Hz. For 144Hz, the offset only decreases when you reach 288FPS (2*144).
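A back-of-the-envelope way to see that 2*Hz threshold (an illustrative sketch, assuming a steady frame rate and uniform scanout):

```python
def tearlines_per_refresh(fps, hz):
    """With VSYNC OFF, about fps/hz new frames land during each scanout,
    i.e. roughly that many tearlines per refresh (one while hz <= fps < 2*hz)."""
    return fps / hz

# 144Hz: still a single tearline at 150FPS...
print(round(tearlines_per_refresh(150, 144), 2))  # 1.04
# ...and only at 288FPS (2*144) do you get two, halving the per-tear offset.
print(tearlines_per_refresh(288, 144))  # 2.0
```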
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 03:26
by Haste
Pan a camera horizontally at 1000 pixels per second:
- At 100fps the offset between frame n and frame n+1 will be 10 pixels
- at 300fps the offset between frame n and frame n+1 will be 3.33... pixels
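The arithmetic above is just division; a minimal sketch for illustration:

```python
def frame_offset_px(pan_speed_px_per_s, fps):
    """Horizontal offset between consecutive frames during a steady pan."""
    return pan_speed_px_per_s / fps

print(frame_offset_px(1000, 100))            # 10.0 pixels at 100fps
print(round(frame_offset_px(1000, 300), 2))  # 3.33 pixels at 300fps
```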
Re: G-Sync 101 w/Chart (WIP)
Posted: 30 Mar 2017, 03:36
by RealNC
Haste wrote: Pan a camera horizontally at 1000 pixels per second:
- At 100fps the offset between frame n and frame n+1 will be 10 pixels
- at 300fps the offset between frame n and frame n+1 will be 3.33... pixels
Oops. I was thinking of the distance between tearlines as the "offset". You're right, of course: the actual difference between the frames themselves gets bigger as the frame rate gets lower.
I can see now why gsync on + vsync off makes sense to some people.