It would be great to have a dynamic VSYNC OFF/ON mode -- frame-specific.
Basically, if the raster is near the bottom (e.g. checked via the RasterStatus.ScanLine API), then mimic VSYNC ON by waiting out the short remainder of the scanout (well under 1ms -- potentially as little as 0.05ms).
The closer the tearline is to the bottom, the less latency VSYNC ON (for that particular frame) would cost.
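For reference, here is a minimal sketch of reading the current raster position via Direct3D 9's GetRasterStatus (the call behind RasterStatus.ScanLine). The device is assumed to already exist, and GetCurrentScanline is just an illustrative helper name:

```cpp
// Minimal sketch: query the raster's current scanline via Direct3D 9.
// Assumes an already-created IDirect3DDevice9* targeting swap chain 0.
#include <windows.h>
#include <d3d9.h>

UINT GetCurrentScanline(IDirect3DDevice9* device)
{
    D3DRASTER_STATUS rs = {};
    if (SUCCEEDED(device->GetRasterStatus(0, &rs)) && !rs.InVBlank)
        return rs.ScanLine;  // 0 = top edge, increasing toward the bottom edge
    return 0;                // currently in the vertical blanking interval
}
```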
LCD displays refresh from top to bottom (high speed video), and display data is transmitted from the graphics card to the monitor one row of pixels at a time, in top-to-bottom order. The position where the tearline occurs determines how much latency is saved relative to double-buffered (minimum possible queue depth) VSYNC ON.
A refresh cycle for GSYNC with a 144Hz cap is approximately 6.9ms from top edge to bottom edge.
--> So if a tearline occurs at the top edge, VSYNC OFF is saving >6ms relative to VSYNC ON.
--> But if the tearline occurs near the bottom edge, VSYNC OFF is saving less than 1ms relative to VSYNC ON.
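As back-of-envelope arithmetic (assuming the ~6.9ms scanout covers the full refresh period, and a hypothetical 1440-line panel), the savings scale linearly with how far above the bottom edge the tearline lands:

```cpp
// Back-of-envelope: latency saved by tearing at a given scanline,
// relative to double-buffered VSYNC ON. Assumes the top-to-bottom
// scanout takes the full refresh period (~6.94ms at 144Hz).
#include <cstdio>

int main()
{
    const double refreshHz  = 144.0;
    const double scanoutMs  = 1000.0 / refreshHz;  // ~6.94ms per refresh cycle
    const int    totalLines = 1440;                // hypothetical vertical resolution

    // Latency saved is roughly proportional to the distance between
    // the tearline and the bottom edge of the screen.
    for (int line : {0, 720, 1339, 1439})
    {
        double savedMs = scanoutMs * (1.0 - static_cast<double>(line) / totalLines);
        std::printf("tearline at scanline %4d -> ~%.2f ms saved vs VSYNC ON\n",
                    line, savedMs);
    }
}
```

A tearline at scanline 0 saves the full ~6.9ms, while one at the very last scanline saves essentially nothing, which is exactly why forcing VSYNC ON only for those bottom-edge frames costs so little.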
So what could happen, in theory, is that tearing would only ever occur higher up the screen, never at the bottom edge.
I'd think of this as a latency-optimized variant of "Adaptive VSYNC": it turns VSYNC ON whenever the raster is near the bottom, but turns VSYNC OFF when the raster is far away from the bottom. This would make bottom-edge tearing completely disappear, while tearing would still (rarely) occur elsewhere on the screen at other times. A sketch of this per-frame policy follows below.
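One way this per-frame behavior could be approximated in application code today, without driver support, is to always present with VSYNC OFF but busy-wait to the vertical blank only when the raster is already near the bottom. A sketch under those assumptions -- names like AdaptivePresent, totalScanlines, and threshold are illustrative tuning parameters, not an existing API:

```cpp
// Sketch of the proposed per-frame policy, implemented app-side.
// VSYNC OFF (D3DPRESENT_INTERVAL_IMMEDIATE) is used at all times; but
// when the raster is within `threshold` lines of the bottom edge, spin
// until the vertical blank so this frame presents tear-free, at a cost
// of well under 1ms at 144Hz.
#include <windows.h>
#include <d3d9.h>

void AdaptivePresent(IDirect3DDevice9* device, UINT totalScanlines, UINT threshold)
{
    D3DRASTER_STATUS rs = {};
    if (FAILED(device->GetRasterStatus(0, &rs)))
    {
        device->Present(nullptr, nullptr, nullptr, nullptr);
        return;
    }

    // Raster near the bottom edge: mimic VSYNC ON for this frame by
    // waiting out the short remainder of the scanout.
    while (!rs.InVBlank && rs.ScanLine >= totalScanlines - threshold)
    {
        if (FAILED(device->GetRasterStatus(0, &rs)))
            break;  // bail out rather than spin forever if the query fails
    }

    // Raster far from the bottom (or now in vblank): present. Any
    // tearline this produces lands away from the bottom edge.
    device->Present(nullptr, nullptr, nullptr, nullptr);
}
```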
This could be a suggestion to NVIDIA, as an "Optimized VSYNC OFF" setting.