G-Sync's 1ms Polling Rate: My Findings & Questions

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag.

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby RealNC » 18 Mar 2017, 13:07

I'm getting bottom tearing with a single 980 Ti way before I hit the refresh cap. So I don't think it has to do with SLI.

GSync really prefers vsync on rather than off.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby Chief Blur Buster » 18 Mar 2017, 14:03

It would be great to have a dynamic VSYNC OFF/ON mode -- frame-specific.

Basically, if the raster is near the bottom (e.g. via the RasterStatus.ScanLine API), then mimic VSYNC ON by briefly waiting (less than 1ms, e.g. 0.05ms).

The closer the tearline is to the bottom, the less latency VSYNC ON (for that particular frame) would add.

LCD displays refresh from top to bottom (high speed video), and display data is transmitted from graphics card to monitor, one row of pixels at a time, in a top-to-bottom manner. Where in the refresh cycle the tearline occurs determines how much latency is saved relative to double-buffered (minimum possible queue depth) VSYNC ON.

A refresh cycle for GSYNC with a 144Hz cap is 6.9ms from top edge to bottom edge.
--> So if a tearline occurs at the top edge, VSYNC OFF is likely saving >6ms relative to VSYNC ON.
--> But if the tearline occurs near the bottom edge, VSYNC OFF is saving less than 1ms relative to VSYNC ON.
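That arithmetic can be sketched in a few lines. This is just an illustration of the numbers above; `vsync_on_penalty_ms` is a hypothetical helper (0.0 = top edge of the scanout, 1.0 = bottom edge), not any real API:

```python
# Latency arithmetic for a 144Hz G-SYNC cap: one refresh cycle
# scans out in 1000/144 ~= 6.9ms from top edge to bottom edge.
REFRESH_MS = 1000.0 / 144.0

def vsync_on_penalty_ms(tearline_fraction: float) -> float:
    """Extra latency VSYNC ON would add for a tearline at this position.

    tearline_fraction: 0.0 = top edge, 1.0 = bottom edge.
    VSYNC ON must wait out the remainder of the scanout.
    """
    return REFRESH_MS * (1.0 - tearline_fraction)

# Tearline at the top edge: VSYNC OFF saves nearly a full refresh (>6ms).
# Tearline at 90% of the way down: VSYNC OFF saves well under 1ms.
```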

So what could happen, in theory, is that tearing would only occur further above the bottom edge of the screen.

I'd think this could be a latency-optimized modified version of "Adaptive VSYNC". It turns VSYNC ON whenever the raster is near the bottom, but turns VSYNC OFF when the raster is far away from the bottom. This would cause the bottom-edge tearing to completely disappear, while tearing would still (rarely) occur elsewhere on the screen.
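A minimal sketch of that per-frame decision, assuming a raster query like D3D9's RasterStatus.ScanLine feeds it; the 90% "near bottom" threshold and the scanline count are arbitrary placeholders, not values from any real driver:

```python
# Sketch of the "Optimized VSYNC OFF" idea: mimic VSYNC ON only when a
# tear would land near the bottom edge, present immediately otherwise.

TOTAL_SCANLINES = 1440   # example vertical resolution (1440p panel)
NEAR_BOTTOM = 0.9        # raster past 90% of scanout counts as "near bottom"

def should_wait_for_vblank(scanline: int) -> bool:
    """True = briefly wait for blanking (VSYNC ON behaviour for this frame);
    False = present immediately (VSYNC OFF behaviour for this frame)."""
    return scanline >= TOTAL_SCANLINES * NEAR_BOTTOM

# Per frame: poll the current scanline (e.g. GetRasterStatus in D3D9),
# and only if it is near the bottom, busy-wait the fraction of a
# millisecond until vertical blanking before presenting.
```

With this policy the worst-case added latency is under 1ms (the tail end of the scanout), while frames presented mid-screen keep full VSYNC OFF latency.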

This could be a suggestion to NVIDIA, as an "Optimized VSYNC OFF" setting.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby mminedune » 19 Mar 2017, 11:45

RealNC wrote:I'm getting bottom tearing with a single 980 Ti way before I hit the refresh cap. So I don't think it has to do with SLI.

GSync really prefers vsync on rather than off.

Idk what to tell you, I've been dealing with this forever now. Some games are worse than others, BF1 being the worst -- like when zooming in and out of the map when spawning.

Now there's no tearing whatsoever, regardless of my frame rate, with vsync off.

Btw, this is my thread:

https://forums.geforce.com/default/topi ... king-100-/

If you look at my last vid, I got tearing in Doom with SLI disabled. IDK if disabling SLI is the same as having just one GPU installed. But I re-ran Doom with my 1080 Ti and got no tearing at all.

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby Sparky » 19 Mar 2017, 12:51

While I haven't heard any official confirmation, I believe the tearing while using gsync with vsync off and a framerate cap is caused by inconsistent frame times, where a single frame hitch can cause subsequent frame(s) to come in faster than the set framerate cap.

For example, you set a 100fps cap on a 144Hz display. Normally you get one frame per 10ms, but say one frame hitches and takes 15ms. This gets displayed just fine, but now your very next frame comes in only 5ms later. This causes a tear, because the refresh interval is ~7ms. If you leave vsync on, that 5ms frame will get delayed by about 2ms, and you avoid the tear, with the third frame displayed on time.
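The numbers in that example check out (values below are taken straight from it; the tear condition -- a frame arriving faster than one refresh interval with vsync off -- is the stated assumption):

```python
# Sparky's worked example: 100fps cap on a 144Hz display.
refresh_ms = 1000.0 / 144.0       # ~6.94ms scanout per refresh
cap_ms = 10.0                     # 100fps cap -> one frame per 10ms

hitch_ms = 15.0                   # one frame hitches and takes 15ms
# The capped pace is unchanged, so the next frame lands only 5ms later:
next_frame_ms = 2 * cap_ms - hitch_ms

# With vsync off, a frame arriving faster than one refresh tears:
tears = next_frame_ms < refresh_ms

# With vsync on, that frame is instead delayed to the next refresh:
vsync_delay_ms = refresh_ms - next_frame_ms   # ~2ms
```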

It makes sense that SLI would make that type of tearing more common, especially with AFR (worse overall frametime consistency, deeper pipeline, and a delay on one frame won't delay rendering of the next frame).

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby mminedune » 19 Mar 2017, 14:27

Battlefield 4, and even more so 1, are probably the worst games when it comes to frame times, and it's fine now for me.

Also forgot to mention: increasing DSR or any in-game downsampling causes issues with gsync in SLI. With vsync off you get tearing; with vsync on you can get stutter.

With a single GPU, vsync off, and a frame cap, I can downsample just fine.

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby RealNC » 19 Mar 2017, 16:28

It doesn't make sense to use gsync+vsync off to begin with, so it's a non-issue really.

If you want vsync off, just do that. There's no reason to use gsync. The only reason to use gsync is because you don't like vsync off.

Gsync + vsync on + frame cap. Just use that.

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Postby mminedune » 20 Mar 2017, 16:56

I never said I ran gsync+vsync off -- I run gsync on, vsync off, and a frame cap... There is no reason to run vsync with gsync; it adds slight lag, plus stutter/hitching when SLI frame times are off.

I've been dealing with this since I got gsync a couple years ago, when I had 980 SLI.

Gsync sucks with SLI, period.
