Acer XB271HU
I'm not using capture, linked footage is a phone recording. Go to 0:09 and press ">" to view video frame by frame.
UPD: actually, tearing starts even at 0:05
The IPS version? That's my primary display, so I'm well familiar with its characteristics and G-SYNC behavior. Also, I don't think you ever shared your system specs, which can make a difference in how severe (or not) tearing is in this scenario.
Yeah, my mistake, I see it's a camera recording now. Still, that tearing is extremely difficult to spot in the footage. Is that the RTSS scanline sync test pattern flashing on the left?
It doesn't matter. The RTSS frametime graph basically only tracks game/simulation/present time, not actually what ultimately ends up on the display (you'd need the likes of FCAT to track for that), so it could read as perfectly flat, and there could still be large enough frametime spikes (no matter how minor or brief) that could trigger a full tear like that with G-SYNC on + V-SYNC off within the G-SYNC range.
Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn’t G-SYNC supposed to fix that?
The answer is frametime variances.
“Frametime” denotes how long a single frame takes to render. “Framerate” is the number of frames rendered per second, derived from the average of each frame’s render time within a one-second period.
At 144Hz, a single frame takes about 6.9ms to display (the exact number depends on the max refresh rate of the display; see here), so if the framerate is 144 per second, then the average frametime at 144 FPS is 6.9ms per frame.
In reality, however, frametime varies from frame to frame, so just because an average framerate of 144 per second works out to an average frametime of 6.9ms per frame, that doesn’t mean all 144 of those frames in each second render in exactly 6.9ms each; one frame could render in 10ms, the next in 6ms, but at the end of each second, enough will hit the 6.9ms render target to average 144 FPS.
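The averaging point above can be sketched in a few lines of Python (all numbers here are illustrative, not measurements from a real game):

```python
# Toy illustration (numbers assumed): an average of 144 FPS at 144 Hz
# does not mean every frame hit the ~6.9 ms render target.
target_ms = 1000 / 144  # ~6.94 ms per frame at 144 Hz

# 142 on-target frames, plus one slow and one fast frame that cancel out
frametimes_ms = [target_ms] * 142 + [10.0, 2 * target_ms - 10.0]

avg_ms = sum(frametimes_ms) / len(frametimes_ms)
print(f"average frametime: {avg_ms:.2f} ms ({1000 / avg_ms:.0f} FPS)")
print(f"slowest frame: {max(frametimes_ms):.2f} ms")  # the 10 ms spike
```

The average reads a clean 144 FPS even though one frame took 10ms, which is exactly the kind of spike an averaged framerate counter hides.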
So what happens when just one of those 144 frames renders in, say, 6.8ms (146 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).
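As a rough sketch of why an early frame tears, here is a toy model of that scanout timing (not how the hardware actually schedules scanout; the resolution and timings are assumed):

```python
SCANOUT_MS = 1000 / 144  # time to scan out one full frame at 144 Hz
LINES = 1440             # vertical resolution (assumed)

def tear_line(frametime_ms):
    """With G-SYNC + V-SYNC off: if the new frame completes before the
    previous scanout finishes, it starts displaying mid-scan, and a tear
    appears at the current scan position. Returns None if no tear."""
    if frametime_ms >= SCANOUT_MS:
        return None  # previous frame finished scanning first; no tear
    return int(LINES * frametime_ms / SCANOUT_MS)

print(tear_line(6.8))  # early frame: tear near the bottom of the screen
print(tear_line(7.0))  # on-time frame: no tear (None)
```

Note how a frame only 0.1-0.2ms early already produces a visible tearline; the closer the frametime is to the scanout time, the lower on screen the tear lands.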
G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range, whereas G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.
And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.
For further explanations on this subject, see part 1 “Control Panel,” part 4 “Range,” and part 6 “G-SYNC vs. V-SYNC OFF w/FPS Limit” of this article.
Yes.
- Intel Core i7-8700K
Yes, it is the frame color indicator in RTSS.
Unfortunately not.
I see. So in my case it means that spikes happen quite often and they deviate from the ideal frametime by 1-2ms. When I used scanline sync in the same game, it was able to keep tearline variance very minuscule (when I moved the tearline into the VBI, it never got out of there). Why can't a regular frame limiter get the same result in this case?
120 FPS @144Hz w/G-SYNC on + V-SYNC off is only tear-free when frametime spikes aren't occurring.
You're fine there, especially for CS:GO then.
That's because scanline sync adheres to the VBLANK (this is what you steer the tearline into) + uses an RTSS FPS limit, whereas other FPS limiters don't have a VBLANK component, and if they did, they'd just be scanline sync or another syncing method again.
Turns out I forgot to enable XMP after updating the BIOS. Now I've set the RAM frequency to 3000MHz. I can't tell for sure, but it appears there are far fewer tearlines at 120fps after enabling XMP (though they still persist).
So this 1-2ms fluctuating behaviour of the regular RTSS limiter is normal then? I normally use G-SYNC + V-SYNC ON, but I thought that if it fluctuates so much with V-SYNC OFF, then there would be uneven framepacing with V-SYNC ON too. But actually you are saying that G-SYNC + V-SYNC ON will give the same even frametimes as Scanline Sync, right?
G-SYNC + V-SYNC is effectively doing the same thing as scanline sync (adhering to the VBLANK to 100% prevent tearing), but it's doing it better and at any framerate within the refresh rate.
No software on the system side is perfectly stable, as the system itself isn't. RTSS can set a render target for the game, but it's up to the system to comply, and ultimately, there is overshoot here and there. Not the fault of RTSS, just a limitation of system stability (and, to a point, physics).
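A quick way to picture this: even a limiter that sets a perfect pacing target is at the mercy of the OS scheduler, which occasionally wakes the limiter thread late. A toy simulation, with the jitter magnitude assumed rather than measured:

```python
import random

random.seed(1)
TARGET_MS = 1000 / 120  # 120 FPS cap -> ~8.33 ms render target

# The limiter aims for TARGET_MS every frame, but scheduler wake-up is
# never early, only occasionally late; model lateness as one-sided noise.
frametimes = [TARGET_MS + abs(random.gauss(0, 0.7)) for _ in range(120)]

spikes = sum(1 for t in frametimes if t > TARGET_MS + 1.0)
print(f"{spikes} of {len(frametimes)} frames overshot the target by >1 ms")
```

Even with modest noise, a handful of frames per second land 1ms+ past the target, and with V-SYNC off, each of those overshoots is a chance for a tear.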
G-SYNC + V-SYNC, as I explained in that entry of my closing FAQ, compensates for those fluctuations; that's why I call it "frametime compensation" in my article. Within the G-SYNC range, the V-SYNC "option" (it's not actually V-SYNC in this instance) is part of G-SYNC; when you disable it, you're removing G-SYNC functionality.
Model?moep wrote: ↑03 Aug 2020, 08:09
How normal is it to have flickering outside of the G-Sync range on G-Sync compatible monitors? I just went from an old monitor with a G-Sync module to one that's just compatible. The former had 0 flickering over 5+ years; the latter flickers like crazy outside of the VRR range (VA).