NVIDIA Fast Sync

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: NVIDIA Fast Sync

Post by RealNC » 27 May 2016, 16:46

Just for your information: Fast Sync is essentially the same thing as old-school Triple Buffering, except it caps the frame rate to multiples of the refresh rate (in an attempt to avoid micro-stutter).

So basically they're bringing back 3dfx's triple buffer feature. In contrast to that, however, you now need a framerate at least double your refresh rate to see any benefit, and four times for it to actually be good. If you get fluctuating frame rates, the experience is not so good, since input latency then starts to vary while you're playing.

Btw, Fast Sync works with Maxwell GPUs too (I tested with a 980 Ti), if you enable it in Nvidia Inspector.

Bottom line: if you play games where you can reach 300 FPS, it feels nice. A bit of micro-stutter here and there, but input latency is great and there's zero tearing. If you play games that struggle to hit the 100 FPS mark, it's not gonna be very good. So this is most useful for high framerate games like Quake and Counter-Strike. If you try it with Battlefield 4 or similar, then nah.

Also, even though this feature works just fine with G-Sync, and it avoids excessive input lag when the FPS goes above the monitor's refresh rate, it's still much better to just cap your game's framerate 5 or 6 FPS below the refresh rate instead of using G-Sync + Fast Sync.

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: NVIDIA Fast Sync

Post by Sparky » 27 May 2016, 17:56

RealNC wrote: Just for your information: Fast Sync is essentially the same thing as old-school Triple Buffering, except it caps the frame rate to multiples of the refresh rate (in an attempt to avoid micro-stutter).
The main difference between normal triple buffering and fast sync is that fast sync drops frames instead of limiting framerate.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: NVIDIA Fast Sync

Post by RealNC » 27 May 2016, 21:25

Sparky wrote: The main difference between normal triple buffering and fast sync is that fast sync drops frames instead of limiting framerate.
Triple buffering dropped frames, just like fast sync. It's a rebranded triple buffer technique.

Note that it doesn't have anything to do with the modern term "triple buffer". That's just vsync with a queue of three buffers instead of two. The "old-school" triple buffering was used in OpenGL in the past, and in Glide (3dfx Voodoo cards). Fast sync is pretty much that, which is good. Real triple buffering was a good feature to have, until Microsoft removed it from DirectX. It's nice that NVIDIA has now done their own implementation directly in the driver.

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: NVIDIA Fast Sync

Post by Sparky » 28 May 2016, 01:29

RealNC wrote: Note that it doesn't have anything to do with the modern term "triple buffer". That's just vsync with a queue of three buffers instead of two. The "old-school" triple buffering was used in OpenGL in the past, and in Glide (3dfx Voodoo cards.) Fast sync is pretty much that. Which is good. Real triple buffering was a good feature to have, until Microsoft removed it from DirectX. It's nice that NVidia now did their own implementation directly in the driver.
There is a lot of misunderstanding here. With double-buffered v-sync (and no sync), there's a front buffer and a back buffer. The front buffer is being scanned out to the monitor, and the back buffer is being rendered into. With v-sync on, when the GPU finishes a frame, it stops working until the next v_blank; with v-sync off, it flips buffers whenever it finishes a frame. The point of triple buffering is to add a second back buffer to give the GPU a place to start working on the next frame while the frame it just finished waits on the next v_blank (so your framerate doesn't drop to half your refresh rate whenever you dip below your refresh rate).

Data doesn't move between the various framebuffers; it's just the pointers to them that change. All 3 buffers spend time as the "front" buffer and the "back" buffer. The only time triple buffering results in a queue is when the display is the bottleneck. There is a separate queue between the game engine and the GPU that fills up when you're GPU or display limited; this is called max prerendered frames, or sometimes (misleadingly) the flip queue.
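
To picture the pointer swap, here's a toy sketch in C (not real driver code; the role names and the one-frame-per-refresh pacing are simplifying assumptions for illustration):

[code]
/* Toy model of the rotation described above: three buffers, identified
 * by index, swap roles at each v_blank. No pixel data is copied; only
 * the role "pointers" change. Assumes the GPU finishes exactly one
 * frame per refresh, purely to keep the trace simple. */
#include <stdio.h>

int main(void) {
    int front = 0;  /* being scanned out to the monitor        */
    int ready = 1;  /* finished frame waiting for next v_blank */
    int back  = 2;  /* buffer the GPU is rendering into        */

    for (int vblank = 0; vblank < 6; ++vblank) {
        int freed = front;  /* old front buffer becomes reusable  */
        front = ready;      /* finished frame flips to the front  */
        ready = back;       /* just-rendered frame waits its turn */
        back  = freed;      /* GPU starts the next frame here     */
        printf("v_blank %d: front=%d ready=%d back=%d\n",
               vblank, front, ready, back);
    }
    return 0;
}
[/code]

Run it and you'll see every index cycle through every role, which is the point: nothing is copied, and all three buffers take turns as the front and back buffers.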

Fast sync is identical to the above triple buffering, except that instead of stopping rendering when both back buffers are full, it overwrites the oldest back buffer and puts the newest frame next in line for the display. This is effectively the same thing that happens when you use windowed mode instead of fullscreen exclusive mode.
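
To make the policy difference concrete, here's an illustrative simulation (the three-frames-per-refresh pacing and the two-slot queue are assumptions for the sake of the example, not anyone's actual driver logic):

[code]
/* Both schemes keep two completed-frame slots (q[0] = oldest), but
 * fast sync overwrites the oldest instead of stalling the GPU.
 * Assumes the GPU can finish three frames per refresh interval;
 * frame IDs stand in for buffer contents. */
#include <stdio.h>
#include <stdbool.h>

static void simulate(bool fast_sync) {
    int q[2] = {-1, -1};  /* completed frames waiting; -1 = empty */
    int next_frame = 0;
    printf("%s:\n", fast_sync ? "fast sync" : "classic triple buffering");
    for (int vblank = 0; vblank < 3; ++vblank) {
        for (int i = 0; i < 3; ++i) {   /* GPU renders at 3x refresh */
            if (q[0] == -1)      q[0] = next_frame++;
            else if (q[1] == -1) q[1] = next_frame++;
            else if (fast_sync) {       /* overwrite the oldest frame */
                q[0] = q[1];
                q[1] = next_frame++;
            }                           /* classic: GPU stalls here   */
        }
        int shown = fast_sync ? q[1] : q[0];  /* newest vs. oldest */
        printf("  v_blank %d shows frame %d\n", vblank, shown);
        if (fast_sync) { q[0] = -1; q[1] = -1; } /* older frame is stale */
        else           { q[0] = q[1]; q[1] = -1; }
    }
}

int main(void) {
    simulate(false);  /* shows frames 0, 1, 2: every frame queued      */
    simulate(true);   /* shows frames 2, 5, 8: always the newest frame */
    return 0;
}
[/code]

The classic version displays frames 0, 1, 2: every rendered frame is eventually shown, so the GPU gets throttled and displayed frames age in the queue. The fast sync version displays frames 2, 5, 8: the GPU runs flat out and each v_blank gets the newest completed frame.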

I'm not aware of an implementation of triple buffering that drops frames while still being called triple buffering. Do you have documentation of either the 3dfx or old OpenGL one?

I really wish people would just say "triple buffering with frame dropping" instead of "real triple buffering". The latter only leads to confusion.

Trip
Posts: 157
Joined: 23 Apr 2014, 15:44

Re: NVIDIA Fast Sync

Post by Trip » 28 May 2016, 06:09

Sparky wrote: I really wish people would just say "triple buffering with frame dropping" instead of "real triple buffering". The latter only leads to confusion.
We could also just call them fast sync and triple buffering now. Anyway, I like that it is an option now; ULMB + vsync feels a lot better imo, at least when you can get high enough frame rates. I wish they could develop some algorithm like the one from ezQuake, where it picks which frame it will render instead of rendering everything. Guess that will come in the next graphics card generation, to generate some more publicity for their new line :P.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: NVIDIA Fast Sync

Post by Glide » 28 May 2016, 07:19

Sparky wrote: Fast sync is identical to the above triple buffering, except that instead of stopping rendering when both back buffers are full, it overwrites the oldest back buffer and puts the newest frame next in line for the display.
Which is a significant difference.
For years, people have been saying that triple buffering helps minimize latency when V-Sync is on, which is not correct.
Fast Sync operates the way a lot of people thought triple-buffering operated, based on that Anandtech article from seven years ago.

You're right that it behaves similarly to the way that borderless windowed mode does when V-Sync is disabled - though latency should be lower.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: NVIDIA Fast Sync

Post by RealNC » 28 May 2016, 07:30

Sparky wrote: I'm not aware of an implementation of triple buffering that drops frames while still being called triple buffering. Do you have documentation of either the 3dfx or old OpenGL one?
Here's an Anandtech article about it:

http://www.anandtech.com/show/2794

At the end, they say:

"There has been a lot of discussion in the comments of the differences between the page flipping method we are discussing in this article and implementations of a render ahead queue. In render ahead, frames cannot be dropped. This means that when the queue is full, what is displayed can have a lot more lag. Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default)."
Sparky wrote: I really wish people would just say "triple buffering with frame dropping" instead of "real triple buffering". The latter only leads to confusion.
Actually, "triple buffering" was synonymous with what's now called fast sync. I do not know when exactly it changed meaning to just "3 frames render-ahead queue." People got used to calling that TB, but it's not.

So we've made a few attempts in this discussion to explain what TB actually is, and my point is that NVIDIA really "just" re-invented TB and named it Fast Sync.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: NVIDIA Fast Sync

Post by Glide » 28 May 2016, 09:28

RealNC wrote: Here's an Anandtech article about it: http://www.anandtech.com/show/2794
Which is the one that everyone cites as a source, but I don't believe is accurate.

I've never seen an OpenGL game flip between the buffers and render more frames than the refresh rate when triple buffering is enabled.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: NVIDIA Fast Sync

Post by RealNC » 28 May 2016, 12:19

Glide wrote: Which is the one that everyone cites as a source, but I don't believe is accurate.

I've never seen an OpenGL game flip between the buffers and render more frames than the refresh rate when triple buffering is enabled.
With a Voodoo 3 card back in 1999, "triple buffer" without vsync did exactly what fast sync does. The setting was for Glide and OpenGL. Direct3D didn't have that.

Triple buffering with vsync still capped at the refresh rate though. The difference is that enabling triple buffering without vsync had no tearing. The game would render as fast as possible, and the driver would always pick the latest frame.

http://www.techspot.com/tweaks/voodoo3/5.gif

I don't know if it meant something different for OpenGL though. It might well be just Glide that did this.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: NVIDIA Fast Sync

Post by Glide » 28 May 2016, 16:12

RealNC wrote: With a Voodoo 3 card back in 1999, "triple buffer" without vsync did exactly what fast sync does. The setting was for Glide and OpenGL. Direct3D didn't have that.
OK, well, if that's true (I had a quick search and couldn't find anything mentioning this about its triple buffering behavior), do you agree that triple buffering has not worked this way for the past 17 years on non-Voodoo cards - whether it's Direct3D or OpenGL - and that Fast Sync is the return of "true" triple buffering, which behaves that way?

-----

I got some more testing done today, and in racing games, once the framerate is above 180 FPS it seems to make a huge improvement to latency/responsiveness. It may depend on the game, but in some of them it also feels like physics and/or feedback is being polled at a higher rate when the framerate is uncapped.

But you really have to be sure that you can keep the framerate above 180 FPS at all times, and I had to put a framerate cap in place to stop it hitting 240/300/360 FPS in less demanding scenes. When it jumps from one framerate tier to the next, it feels just as bad as when V-Sync drops from 60 FPS to 30.
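
Here's a back-of-the-envelope check of why those tiers exist, assuming a 60 Hz display and perfectly even frame times (illustrative numbers, not measurements):

[code]
/* At each v_blank, fast sync shows the newest completed frame. Its age
 * (and thus the input-to-photon offset) is constant only when the frame
 * rate is an integer multiple of the refresh rate. */
#include <stdio.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;
    const double fps[] = {180.0, 200.0, 240.0, 250.0};
    for (int f = 0; f < 4; ++f) {
        double frame_ms = 1000.0 / fps[f];
        printf("%3.0f FPS, newest-frame age at successive v_blanks:", fps[f]);
        for (int vb = 1; vb <= 4; ++vb) {
            double t = vb * refresh_ms;
            /* completion time of the last frame finished before this v_blank */
            double done = (int)(t / frame_ms + 1e-9) * frame_ms;
            printf(" %5.2fms", t - done);
        }
        printf("\n");
    }
    return 0;
}
[/code]

At 180 and 240 FPS the age is a constant 0.00 ms every refresh; at 200 or 250 FPS it drifts from one vblank to the next, which shows up as the micro-stutter you feel when the framerate sits between tiers.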

Additionally, you should stick to regular V-Sync if you cannot keep the framerate above 120 FPS. I tried putting a framerate cap in place so it wouldn't even hit 120 FPS (at the same settings that gave me a constant 180 FPS), and with Fast Sync I would encounter bouts of very bad stuttering despite the framerate appearing to be locked to 60 FPS at all times. I switched to regular V-Sync and it was 100% smooth.


One of the downsides to all of this is that it's made me want the latency reduction, and the improvement to gameplay beyond the latency reduction, that running at a higher framerate brings. The problem is that I have to turn down the graphical settings a lot to achieve this.

I've gone from supersampling games with SGSSAA or DSR, to running at native 1080p again and turning settings down, in order to increase the framerate.

I've been trying to keep my 2500K system going for a few more years by updating the video card, but this is making me want to get a G-Sync display and build a new high-end rig now.
