akirru wrote:So would you say for the most part unless you are some sort of pro gamer ... it might be best to go with G-sync + Vsync on and no cap?
RealNC wrote:If you are a pro gamer, you don't use g-sync.
Why? What potential detrimental effect would gsync have for competitive gamers, if vsync is off anyway?
It does not make sense to disable vsync when using g-sync. I don't even know why people wanted this in the first place.
IMO, it's because people are generally completely clueless and don't know what they actually want
[Thread Superseded] G-Sync 101 w/Chart (WIP)
Re: G-Sync 101 w/Chart (WIP)
Monitor: Gigabyte M27Q X
Re: G-Sync 101 w/Chart (WIP)
Haste wrote:Why? What potential detrimental effect would gsync have for competitive gamers, if vsync is off anyway?
It means it doesn't get activated. Your average pro player plays at about 300FPS. There are no 300Hz monitors, so G-Sync is always inactive.
It doesn't make sense to enable G-Sync in that case. Why are you enabling it if you don't want it? If you wanted it, you'd be capping your frame rate to about 3FPS lower than your Hz. And for that, you need vsync "on" in NVCP to avoid bottom tearing. Since you don't cap your FPS, it means you DON'T want G-Sync. So why are you enabling it in the first place then?
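The "cap about 3FPS below your Hz" rule above can be sketched in a few lines. This is a minimal illustration of the thread's rule of thumb; the function name and the default margin of 3 are my own choices, not an official NVIDIA figure:

```python
# Rule of thumb from this thread: to keep G-Sync active, cap the frame
# rate a few FPS below the monitor's refresh rate. The -3 FPS margin
# is the thread's suggestion, not an official number.

def gsync_frame_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Return a frame rate cap that stays inside the VRR range."""
    return refresh_hz - margin_fps

print(gsync_frame_cap(144))  # 141
print(gsync_frame_cap(165))  # 162
```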
Again, makes zero sense to me
Steam • GitHub • Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
Re: G-Sync 101 w/Chart (WIP)
For whenever the frametime goes above 1/refresh rate.
Can (does) happen even in games that have 300+ average fps.
I see it as a way to slightly reduce the negative impact of those occurrences. Kind of a safety feature.
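The frametime point can be made concrete: a frame only makes its refresh slot if its frametime stays under 1/refresh rate. A small sketch, using made-up frametime values purely for illustration:

```python
# Hypothetical frametime capture (values invented for illustration):
# mostly ~3 ms frames (over 300 FPS average) with two isolated spikes.
# A frame only makes its refresh slot if its frametime stays under
# 1000/refresh_hz milliseconds; the spikes miss it even though the
# average FPS is far above the refresh rate.

def late_frames(frametimes_ms, refresh_hz):
    budget_ms = 1000.0 / refresh_hz  # ~6.94 ms at 144 Hz
    return [t for t in frametimes_ms if t > budget_ms]

sample = [3.0] * 98 + [12.0, 9.5]
print(1000 * len(sample) / sum(sample))  # average FPS: ~317
print(late_frames(sample, 144))          # [12.0, 9.5]
```

So even a "300+ average fps" game can drop individual frames into G-Sync territory, which is the safety-net argument being made here.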
Monitor: Gigabyte M27Q X
Re: G-Sync 101 w/Chart (WIP)
Haste wrote:For whenever the frametime goes above 1/refresh rate.
Can (does) happen even in games that have 300+ average fps.
Why would you want g-sync in that case? Since the majority of the time you exceed the refresh rate and run vsync off, you get tearing. Why would you care about tearing for the case where your FPS falls below refresh? The only thing that changes between FPS>Hz and FPS<Hz is the direction the tearline is moving. For example, assuming 100Hz, 110FPS has the same amount of tearing as 90FPS. The only difference is that in the 110FPS case the tearline is moving upwards, while in the 90FPS case it's moving downwards. Other than that, everything looks pretty much the same, so I have a hard time understanding why you would want the 90FPS case synced and the 110FPS case unsynced...
Also, letting gsync activate when FPS<Hz results in a bit of input lag kicking in, which makes things less consistent. (G-Sync only adds *little* input lag, but not zero!)
IMO, people who think G-Sync + vsync off is useful didn't think things through. Which I suspect is also why NVidia didn't initially even allow for that.
I've been playing CS:GO for years now, since it came out (about 1700 hours total playtime), and I can say the most important thing is consistency. It's the only game I play with g-sync off and vsync off. If instead I use g-sync on with vsync off and the game drops below 165FPS (the monitor is 165Hz), it throws my aim off. It happens rarely, but when it happens, I either overshoot or undershoot my flicks. If, on the other hand, I use g-sync on, vsync on, and "fps_max 160", it's better, because it's consistent.
PS:
Or I might be missing something fundamental here... Can't be ruled out entirely
Steam • GitHub • Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
- Chief Blur Buster
- Site Admin
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
Re: G-Sync 101 w/Chart (WIP)
With framebuffering, there's the potential for a sudden-lag-increase effect whenever the GSYNC maximum rate is hit -- for low-refresh-rate GSYNC (e.g. 4K 60Hz) the sudden lag increase is one or two frames (usually one, but I've heard two) -- e.g. +17ms or +33ms. Using VSYNC OFF solves this, and prevents sudden input-lag modulations when transitioning between below-max GSYNC and maxed-out GSYNC without an in-game framerate cap. But it somewhat defeats the stutter-elimination purpose of GSYNC, as VSYNC OFF microstutter suddenly reappears once you hit the maximum GSYNC rate.
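The "+17ms or +33ms" figures are just one or two buffered frames expressed as refresh periods at 60Hz. A quick arithmetic check (my own illustration of the numbers quoted above):

```python
# One buffered frame at 60 Hz costs one refresh period of extra lag,
# two buffered frames cost two periods -- matching the +17/+33 ms
# figures mentioned for low-refresh-rate (e.g. 4K 60Hz) GSYNC.

def added_lag_ms(refresh_hz: float, buffered_frames: int) -> float:
    return buffered_frames * 1000.0 / refresh_hz

print(round(added_lag_ms(60, 1)))  # 17
print(round(added_lag_ms(60, 2)))  # 33
```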
User choice probably won out here, I would guess --
That said, if you're able to use an in-game framerate cap (usually the best kind!), then you ideally don't need to worry about VSYNC ON/OFF when hitting the max rate. An in-game framerate cap slightly below the maximum VRR rate on either FreeSync or GSYNC is usually the best way -- we were the first website in the world to lag-test GSYNC, and the first to discover that in-game frame-capping below GSYNC max rate was a way to avoid the sudden lag increase when hitting the GSYNC limit.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
- kurtextrem
- Joined: 05 Mar 2017, 03:35
- Location: Munich, Germany
Re: G-Sync 101 w/Chart (WIP)
imo your benchmarks show a (very tiny) difference between gsync + vsync off and gsync + vsync on. If you can live with the tearing at the very bottom, why not take it?
At the bottom of the screen there's nothing too important where tearing may influence your sights.
This is also why I use "prerendered virtual reality frames: 1" (not sure if it even does something when not using VR, but whatever), "max prerendered frames: 1", and "preferred refresh rate: highest available". Every frame counts. FPS stability is up to your system (close expensive programs while gaming, use Bitsum's Highest Performance power plan, and so on).
Last edited by kurtextrem on 30 Mar 2017, 02:08, edited 1 time in total.
Acer XF250Q, R6 competitive player
Re: G-Sync 101 w/Chart (WIP)
RealNC wrote:Why would you want g-sync in that case? Since the majority of the time you exceed the refresh rate and run vsync off, you get tearing. Why would you care about tearing for the case where your FPS falls below refresh? The only thing that changes with FPS>Hz vs FPS<Hz is the direction the tearline is moving. For example, assuming 100Hz, 110FPS has the same amount of tearing as 90FPS. The only difference is that in the 110FPS case the tearline is moving upwards, while in the 90FPS case it's moving downwards. Other than that, everything looks pretty much the same, so I have a hard time understanding why you would want the 90FPS case being synced and the 110FPS case being unsynced...
There is an important factor. The lower the frame rate, the bigger the offset between frames. Which is why tearing is more jarring with low frame rates and less jarring with high frame rates.
So you might not be bothered too much by tearing at 300fps but bothered at let's say ~100fps.
And then there is the matter of how synchronization ever so slightly reduces the judder/stutter perception even compared to vsync off. (It's nowhere near the dramatic improvement of g-sync over vsync-induced micro-stuttering, but it's there.)
RealNC wrote:Also, letting gsync activate when FPS<Hz results in a bit of input lag kicking in, which makes things less consistent. (G-Sync only adds *little* input lag, but not zero!)
Ah. That surely is a very good point. And that could easily overshadow the benefits I described above.
I wasn't aware of that side effect of gsync.
RealNC wrote:IMO, people who think G-Sync + vsync off is useful didn't think things through. Which I suspect is also why NVidia didn't initially even allow for that.
I think that feature was also to counter AMD, since FreeSync had it out of the box.
Monitor: Gigabyte M27Q X
Re: G-Sync 101 w/Chart (WIP)
Haste wrote:There is an important factor. The lower the frame rate, the bigger the offset between frames. Which is why tearing is more jarring with low frame rates and less jarring with high frame rates. So you might not be bothered too much by tearing at 300fps but bothered at let's say ~100fps.
That's not true, actually. The offset is limited to the height of the monitor, since there's only one tearline. The only thing that changes is the direction the tear is moving.
To reduce the offset, you need multiple tearlines. That means you need FPS >= 2*Hz. For 144Hz, the offset only decreases when you reach 288FPS (2*144).
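A back-of-the-envelope sketch of the tearline-count claim. The uniform-arrival averaging model here is my own simplification, not something stated in the thread:

```python
# With vsync off, each frame flip during one scanout produces one
# tearline. Assuming flips arrive uniformly in time (a simplification
# for illustration), the expected tearline count per refresh is just
# fps / hz -- so a second tearline per refresh (and a smaller offset
# per slice) only appears around FPS >= 2*Hz.

def avg_tearlines_per_refresh(fps: float, hz: float) -> float:
    return fps / hz

print(avg_tearlines_per_refresh(110, 100))  # 1.1 -> ~one tearline per refresh
print(avg_tearlines_per_refresh(90, 100))   # 0.9 -> also ~one tearline on most refreshes
print(avg_tearlines_per_refresh(288, 144))  # 2.0 -> two slices per refresh
```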
Steam • GitHub • Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
Re: G-Sync 101 w/Chart (WIP)
Pan a camera horizontally at 1000 pixels per second:
- At 100fps the offset between frame n and frame n+1 will be 10 pixels
- At 300fps the offset between frame n and frame n+1 will be 3.33... pixels
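The arithmetic behind those two bullets is simply pan speed divided by frame rate. A tiny sketch confirming the numbers:

```python
# Per-frame offset during a horizontal camera pan: at a fixed pan
# speed, each frame advances by speed / fps pixels.

def frame_offset_px(pan_speed_px_per_s: float, fps: float) -> float:
    return pan_speed_px_per_s / fps

print(frame_offset_px(1000, 100))            # 10.0
print(round(frame_offset_px(1000, 300), 2))  # 3.33
```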
Monitor: Gigabyte M27Q X
Re: G-Sync 101 w/Chart (WIP)
Haste wrote:Pan a camera horizontally at 1000 pixels per second:
- At 100fps the offset between frame n and frame n+1 will be 10 pixels
- At 300fps the offset between frame n and frame n+1 will be 3.33... pixels
Oops. I was having the distance between tearlines be the "offset" in my head. You're right, of course. The actual difference between the frames themselves gets larger as the frame rate gets lower.
I can see now why gsync on + vsync off makes sense to some people.
Steam • GitHub • Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.