
Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 15:08
by jorimt
RealNC wrote:An in-game limiter can know when a good render-start-time would be without actually having to render anything.
Correct, thus the closer to the source the better, as in-game limiters are at the level where the render time of a frame is actually calculated in the first place.
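To illustrate the point, here is a minimal sketch (illustrative names only, not any particular engine's code) of why an in-game limiter sits closer to the source: it delays *before* sampling input and starting the render, so the frame begins as late as possible with the freshest input, whereas an external limiter can only delay a frame that has already been rendered.

```python
import time

# Sketch of an in-game frame limiter: wait BEFORE sampling input and
# rendering, so each frame starts at the ideal render-start time.
# TARGET_FRAMETIME and frame_loop_once are made-up illustrative names.
TARGET_FRAMETIME = 1.0 / 141.0  # e.g. a 141 fps cap on a 144Hz G-SYNC panel

def frame_loop_once(next_deadline, sample_input, render):
    """Run one frame: delay until the deadline, then sample input and render."""
    # Coarse sleep first, then spin for precision (a common limiter pattern).
    remaining = next_deadline - time.perf_counter()
    if remaining > 0.002:
        time.sleep(remaining - 0.002)
    while time.perf_counter() < next_deadline:
        pass
    state = sample_input()   # freshest possible input
    render(state)            # render + present immediately afterward
    return next_deadline + TARGET_FRAMETIME
```

An external limiter, by contrast, can only insert its delay after the frame has been presented, at which point the age of the input baked into that frame is already fixed.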
Chief Blur Buster wrote:Take it easy on Jorim. We need him to create additional content!
Thanks Chief, and don't worry, these were easy, single run spot tests to clear up lingering questions on the G-SYNC article concerning MPRF.

I'd like to delve deeper into this issue in the future, and maybe create a dedicated article, as RealNC brings up a good point; it doesn't get covered often, if at all.

This was merely the initial research that would ultimately facilitate the final testing for a possible eventual article, much like what I did with my original G-SYNC thread when I started this all late last year.

RealNC and Sparky clarified enough to give me the knowledge to test this further in a more official capacity, if/when necessary.

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 15:13
by RealNC
Chief Blur Buster wrote:Already said before, but in another 'related' perspective, capped GSYNC/FreeSync is very forgiving while being low lag -- early/late frames have the potential to be displayed immediately anyway. With enough headroom (e.g. 138fps at 144Hz) there's plenty of margin for frametimes to be early/late and there's practically never such a thing as a "too late to be on time for VSYNC". In other words, frame-capped VRR is almost the ideal "low-lag VSYNC ON" experience.
It's also effective for plain old vsync off without VRR. I've made this point to people on occasion. Playing OW, for example, uncapped and with vsync off is not the best way to get the least amount of latency. Of course, I didn't have the data to back it up, so it sounded like complete bollocks to people. "Why would lowering my FPS reduce my latency?" It's natural to think that the higher your FPS, the less latency you get, and thus that you should never, ever cap your frame rate.

So yeah, some solid myth-busting there by Jorim :D

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 15:20
by Chief Blur Buster
Oh, and for those running arcade emulators, MAME, Commodore 64, Apple, and other emulators...

Emulators might achieve less average input lag than the original 8-bit machines
Capped 60fps GSYNC/FreeSync (at 144Hz/240Hz to take advantage of fast scanout velocity) -- you get very low scanout lag and perfect frame sync, with lots of headroom for frametime jitter without the penalty of missing a VSYNC.

With 240Hz GSYNC capped to 60fps for emulator, you can reduce emulator input lag from "33ms/50ms" (top edge/bottom edge lag) all the way down to "0ms/4.2ms" (top edge/bottom edge lag) -- over two refresh cycles less lag thanks to eliminating VSYNC ON buffers *and* using a 1/240sec scanout of a 240Hz GSYNC monitor (even at 60Hz). Although benefit depends on the specific emulator, this is quite a large latency improvement even for 60fps gaming.

In fact, thanks to the low-lag 1/240sec scanout of a 240Hz VRR display, you potentially have less bottom-edge latency than the original arcade machine, since the original CRTs scanned out in 1/60sec. This assumes the emulator "fast-executes" in bursts (running unthrottled between refresh cycles) so each frame is prepared early enough to begin its quick 1/240sec scanout, with margin left for other lag overheads (e.g. LCD GtG). Done fast enough, average input lag can match or even beat the original machine's.
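The quoted lag figures can be sanity-checked with quick arithmetic. The sketch below (illustrative Python; `edge_lag_ms` is a made-up helper) assumes classic 60Hz VSYNC ON holds roughly two buffered refresh cycles before scanout begins:

```python
# Rough top/bottom-edge lag model for the figures quoted above:
# classic 60Hz VSYNC ON (two buffered frames) vs. a 60fps cap on a
# 240Hz G-SYNC panel (no VSYNC backpressure, fast scanout).
def edge_lag_ms(refresh_hz, buffered_frames, scanout_hz):
    refresh_ms = 1000.0 / refresh_hz
    scanout_ms = 1000.0 / scanout_hz
    top = buffered_frames * refresh_ms   # delay before the first line appears
    bottom = top + scanout_ms            # plus time to scan the full frame
    return round(top, 1), round(bottom, 1)

print(edge_lag_ms(60, 2, 60))    # 60Hz VSYNC ON       -> (33.3, 50.0)
print(edge_lag_ms(60, 0, 240))   # 60fps @ 240Hz GSYNC -> (0.0, 4.2)
```

The model ignores LCD GtG and emulator processing time, but it reproduces the "33ms/50ms" versus "0ms/4.2ms" top/bottom-edge numbers.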

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 15:23
by Sparky
Chief Blur Buster wrote:
RealNC wrote:An in-game limiter can know when a good render-start-time would be without actually having to render anything.
Yep. Good predictive just-in-time rendering can do that, though the programmer has to implement the prediction properly.
I think it's relatively easy to do lower-latency vsync (going from several frames down to <1 additional frame, using a synchronous cap), but until recently it's been unpopular because it leaves some resources idle, which could otherwise be used to make the game look better in trailers/demos.

However, there is some more interest in low latency rendering due to VR: https://developer.oculus.com/blog/optim ... -latching/

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 15:27
by jorimt
Chief Blur Buster wrote:With 240Hz GSYNC capped to 60fps for emulator, you can reduce emulator input lag from "33ms/50ms" (top edge/bottom edge lag) all the way down to "0ms/4.2ms" (top edge/bottom edge lag) -- over two refresh cycles less lag thanks to eliminating VSYNC ON buffers *and* using a 1/240sec scanout of a 240Hz GSYNC monitor (even at 60Hz). Although benefit depends on the specific emulator, this is quite a large latency improvement even for 60fps gaming.
Yup, another good point, and more material for more articles.

One thing I didn't have a chance to fully clarify in my article is the exact difference between 60 FPS @144Hz/240Hz on a G-SYNC display vs. 60 FPS @144Hz/240Hz on a fixed refresh rate display. Those not knowing the ins and outs could easily be confused as to why both display types wouldn't have the same benefits in that instance.

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 17:19
by akirru
I noticed something whilst playing Rising Storm: Vietnam... When I have gsync activated with nvcpl vsync my cpu usage is higher. In fact one of my cores is pegged at 100% a lot more of the time. This game is basically single threaded as it's based on an old engine. I have a 2500k overclocked to 4.6Ghz and a 970gtx. The game is much smoother without gsync. It stutters with gsync activated.

What would cause the higher cpu usage?

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 18:11
by Glide
Chief Blur Buster wrote:Emulators might achieve less average input lag than the original 8-bit machines
Now if only I could get most emulators to actually work properly with G-Sync.
Most that I have tried so far end up skipping quite badly, similar to the DirectDraw problems I mentioned here.
Perhaps we should have a separate topic for discussion of what does/doesn't work.

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 25 Jun 2017, 18:23
by jorimt
akirru wrote:I noticed something whilst playing Rising Storm: Vietnam... When I have gsync activated with nvcpl vsync my cpu usage is higher. In fact one of my cores is pegged at 100% a lot more of the time. This game is basically single threaded as it's based on an old engine. I have a 2500k overclocked to 4.6Ghz and a 970gtx. The game is much smoother without gsync. It stutters with gsync activated.

What would cause the higher cpu usage?
Not G-SYNC. In fact, G-SYNC by itself can't cause much of anything, let alone stutter. The issue must lie elsewhere. Do you have the FPS limited as recommended? Either way, there's not much information in your post to go on.
Glide wrote: Now if only I could get most emulators to actually work properly with G-Sync.
Most that I have tried so far end up skipping quite badly, similar to the DirectDraw problems I mentioned here.
Perhaps we should have a separate topic for discussion of what does/doesn't work.
I think the biggest issue with G-SYNC and emulators, aside from the DirectDraw issue, is conflicts with audio syncing. I've thought of experimenting with some of the more popular emulators and creating a "G-SYNC Emulator HOWTO" article at some point.

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 26 Jun 2017, 14:51
by monitor_butt
jorimt wrote:
monitor_butt wrote: So I wouldn't need a 400hz screen to see the benefit or a lower ms/f latency when it exceeds my refresh rate as previously mentioned?
As stated here before, if you couldn't care less about tearing or microstutter, then just stick with V-SYNC OFF and super high framerates. This will ensure the lowest possible lag, but it will never be delivering a single frame in sync with the display's scanout.

And, no, even with V-SYNC OFF and high framerates, your 144Hz display is still limited by its 6.9ms scanout time. At 400Hz, the scanout would actually be much faster, which means frame delivery would be as well. At that speed, V-SYNC OFF would be delivering no more than two updates per scanout at 400 FPS on a 400Hz display, which means both G-SYNC and V-SYNC OFF would have roughly the same delivery speed.
Wouldn't frames being delivered much faster make things look and feel more responsive? Maybe I'm crazy, but I feel input lag when I cap my frame rate at 144 fps. I also see much more blur when I do quick swipes around in-game. As soon as I uncap and my fps hits 300+, I instantly feel a more responsive input and a significant reduction in the blur when I swipe around. I also feel once you start breaking 350+ fps, you're getting so many frames per second, that it starts to hide a lot of the tearing you normally can see at lower frame rates.
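For reference, the scanout durations in the quoted reply fall directly out of the refresh rate, independent of the game's framerate; a quick arithmetic check (the 400Hz panel was hypothetical hardware at the time):

```python
# A panel's scanout duration (top edge to bottom edge) is roughly one
# refresh period, regardless of how many FPS the game renders.
for hz in (60, 144, 240, 400):
    print(f"{hz:>3} Hz -> {1000.0 / hz:.1f} ms scanout")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms, 400 Hz -> 2.5 ms
```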

Re: Blur Buster's G-SYNC 101 Series Discussion

Posted: 26 Jun 2017, 16:18
by jorimt
monitor_butt wrote:Wouldn't frames being delivered much faster make things look and feel more responsive? Maybe I'm crazy, but I feel input lag when I cap my frame rate at 144 fps.
Depends on how many frames you are sustaining above 144Hz at any given time. Again, as you might have already seen in the diagram I embedded earlier, V-SYNC OFF can scan in multiple partial frame updates in a single scanout. This can give slightly more responsiveness, as parts of frames are being delivered earlier.

Keep in mind, however, that unlike G-SYNC, with multiple updates delivered per scanout you're effectively seeing multiple points of game time on-screen at once, which means variable input latency.

At 300 or so FPS on a 144Hz display, we're only looking at an average 1-3ms reduction in input latency over FPS-limited G-SYNC, maybe a max of 2ms reduction from middle screen at any given point.
monitor_butt wrote:I also see much more blur when I do quick swipes around in-game.
This would be placebo. Sustained framerates above the refresh rate cannot increase the physical motion clarity of the panel; they only provide frame updates more quickly, as mentioned above.
monitor_butt wrote:I also feel once you start breaking 350+ fps, you're getting so many frames per second, that it starts to hide a lot of the tearing you normally can see at lower frame rates.
Yes, actual tearlines become much less noticeable, near-imperceptible even, at higher refresh rates combined with very high framerates above the refresh rate, but the tearing artifacts at that point can instead be perceived as microstutter/uneven frame pacing.