Overwatch -- Lightboost 120Hz + Fast Sync

KevOW
Posts: 16
Joined: 23 Apr 2017, 14:46

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by KevOW » 25 Apr 2017, 04:01

Always played in fullscreen, no worries there.

I just reverted back to whatever CRU had it set to ¯\_(ツ)_/¯

When I said smoother, I meant in clarity (not reduced input lag).

If I turn off vsync, I set my frames back to 240.
No input lag pls

straussmanover
Posts: 6
Joined: 25 Apr 2017, 05:02

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by straussmanover » 25 Apr 2017, 05:30

Finally, this is the godsend thread I've been looking for! I've found myself in this conundrum for about 2 months now. 120 ULMB with VSYNC On simply looks amazing, but my aim just feels so "floaty." If I switch to uncapped VSYNC Off (ULMB on), the tearing isn't terrible, but it's just enough for me to notice and have a distracting impact. So for the meantime, I've settled on fixed 144 refresh VSYNC Off, although I've really been wanting to use ULMB.
FWIW, I'm running a GTX 970 and using an Acer Predator XB241YU. Being 1440p, I can't get huge framerates, even with low settings (RS 75%, FXAA, eye candy low, etc.). I can get about 220 in the practice range, but in teamfights I'm usually averaging about 190, so I've been playing with my in-game cap at 180. I also recently read the mouse refresh thread; I've already been using 1000Hz, so I'm good in that regard.

I'm looking forward to trying this "ultra low VSYNC" method tonight; I'll definitely follow up with my experience.
Just for clarification:
CRU cap ~120.007 (I've used CRU before so I'm familiar with this.)
Cap FPS in-game at 120
VSYNC -- this is where I have a question: should I use the in-game VSYNC, or an external utility like the NVIDIA control panel?

I've been a long-time lurker on the site and forums, but I finally made an account to post in this thread. Big ups to the founders and chiefs, you've got some really great stuff going on here! Cheers for what you do :mrgreen:

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by RealNC » 25 Apr 2017, 06:23

straussmanover wrote:CRU cap ~120.007 (I've used CRU before so I'm familiar with this.)
That's not a cap. That's a refresh rate.

It might be better to go 120.010Hz if the Overwatch frame limiter isn't 100% accurate. Note that not all monitors are able to do ULMB with tweaked refresh rates though.
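To put rough numbers on why the offset matters, here's a minimal sketch of the arithmetic (my own illustration; the values are just the ones discussed in this thread):

```python
# Why the refresh rate goes slightly ABOVE the frame cap: each refresh
# cycle is a hair shorter than each frame interval, so a queued frame
# drains instead of accumulating (backpressure). Illustration only.

refresh_hz = 120.010   # tweaked CRU refresh rate
cap_fps    = 120.000   # in-game frame limiter

refresh_period_ms = 1000.0 / refresh_hz   # ~8.3326 ms
frame_period_ms   = 1000.0 / cap_fps      # ~8.3333 ms

slack_per_frame_ms = frame_period_ms - refresh_period_ms   # ~0.0007 ms
frames_to_drain    = refresh_period_ms / slack_per_frame_ms

print(f"slack per frame: {slack_per_frame_ms * 1000:.2f} us")
print(f"a full queued frame drains in ~{frames_to_drain:,.0f} frames "
      f"(~{frames_to_drain / cap_fps:.0f} s)")
```

A bigger offset (120.010 instead of 120.007) just drains the queue faster, which buys some headroom if the in-game limiter overshoots slightly.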
straussmanover wrote:VSYNC -- this is where I have a question: should I use the in-game VSYNC, or an external utility like the NVIDIA control panel?
Shouldn't matter. Try both and see if there's a difference.

It is VERY important to NOT use "triple buffering" in the in-game settings. This must be set to OFF. Contrary to popular belief (long story), this INCREASES input lag; it does not reduce it. Note: I'm talking about the in-game "triple buffering" setting in Overwatch, NOT the NVIDIA control panel one (that one has no effect in this game).

straussmanover
Posts: 6
Joined: 25 Apr 2017, 05:02

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by straussmanover » 25 Apr 2017, 06:59

RealNC wrote:That's not a cap. That's a refresh rate.
Right, I should've caught that; I've got 2 other monitors overclocked using CRU :?
RealNC wrote:It is VERY important to NOT use "triple buffering" in the in-game settings. This must be set to OFF. Contrary to popular belief (long story), this INCREASES input lag; it does not reduce it. Note: I'm talking about the in-game "triple buffering" setting in Overwatch, NOT the NVIDIA control panel one (that one has no effect in this game).
So, I'm a little familiar with triple buffering. For my own curiosity, could you tell me if I'm understanding it correctly?
It increases input lag because it is "rendering ahead" one additional frame compared to standard VSYNC? With VSYNC off, the display is always getting the latest frame, so if the GPU is in the middle of drawing, you get part of the old frame and part of the new, which causes a tear. VSYNC makes the display wait for a full frame, which would cause lag in the downtime waiting for the full draw. Triple buffering makes the "render ahead" 2 frames instead of one, which would increase input lag, but potentially increase IQ and smoothness?

Thanks for the reply.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by RealNC » 25 Apr 2017, 07:45

straussmanover wrote:So, I'm a little familiar with triple buffering. For my own curiosity, could you tell me if I'm understanding it correctly?
It increases input lag because it is "rendering ahead" one additional frame compared to standard VSYNC? With VSYNC off, the display is always getting the latest frame, so if the GPU is in the middle of drawing, you get part of the old frame and part of the new, which causes a tear. VSYNC makes the display wait for a full frame, which would cause lag in the downtime waiting for the full draw. Triple buffering makes the "render ahead" 2 frames instead of one, which would increase input lag, but potentially increase IQ and smoothness?
Vsync makes the GPU wait, not the display (vsync is a GPU function, not a display function; monitors do not know what "vsync" is, they just draw a frame every X milliseconds and that's it). The display never waits, since it has a fixed refresh rate. At 120Hz, the display will output 120 times per second and will NEVER wait for anything or for anyone. Unless you're using g-sync or freesync. In that case, it's the display that waits, not the GPU (the reason g-sync/freesync were invented is that having the monitor do the waiting instead of the GPU solves many issues that cannot be solved otherwise).
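If it helps, here's a tiny sketch of the distinction (my own illustration, not driver code):

```python
import math

REFRESH_S = 1.0 / 120  # fixed 120Hz refresh interval

def fixed_refresh_present(frame_ready_s: float) -> float:
    # Fixed-Hz display: scanout happens on a rigid schedule, so the frame
    # appears at the next refresh boundary; the GPU does the waiting.
    return math.ceil(frame_ready_s / REFRESH_S) * REFRESH_S

def vrr_present(frame_ready_s: float) -> float:
    # G-SYNC/FreeSync: the display does the waiting instead, starting a
    # refresh the moment the frame is ready.
    return frame_ready_s

print(fixed_refresh_present(0.010))  # frame ready at 10ms appears at ~16.7ms
print(vrr_present(0.010))            # appears at 10ms
```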

The short story of triple vs double buffering is that double buffering can halve your FPS when vsync is on and the game can't maintain the full frame rate. At 120Hz, if the game can't maintain 120FPS, you might get 60FPS instead, then 40FPS, then 30, etc.

Triple buffering gets rid of that problem at the cost of one additional frame of input lag. Triple buffering has no use when not using vsync. It's only there to solve the vsync frame-halving issue at the cost of more input lag.

If the game is able to maintain 120FPS, using double buffering provides less input lag when using vsync. So it's the best setting if your machine can maintain 120FPS.

Triple buffering is not "smoother" or anything like that. It just protects against vsync FPS halving at the cost of input lag. Nothing more, nothing less.
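The 120 -> 60 -> 40 -> 30 quantization falls out of the math: a double-buffered flip can only happen on a refresh boundary. A toy model (my own sketch; the real driver pipeline is more involved):

```python
import math

REFRESH_HZ = 120
REFRESH_MS = 1000.0 / REFRESH_HZ   # ~8.33 ms per refresh

def double_buffered_fps(render_ms: float) -> float:
    # With double-buffered vsync, the flip waits for the next vblank after
    # rendering finishes, so each frame consumes a whole number of refreshes.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (intervals * REFRESH_MS)

for render_ms in (7.0, 8.5, 12.0, 17.0):
    print(f"render {render_ms:4.1f} ms -> {double_buffered_fps(render_ms):5.1f} FPS")
# render  7.0 ms -> 120.0 FPS
# render  8.5 ms ->  60.0 FPS  (just missed one refresh: FPS halves)
# render 12.0 ms ->  60.0 FPS
# render 17.0 ms ->  40.0 FPS
```

Triple buffering's extra buffer lets the GPU start the next frame instead of stalling at the flip, which restores the intermediate frame rates but adds the extra queued frame of lag described above.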

The long story is:

https://www.youtube.com/watch?v=KhLYYYvFp9A&t=1293

This video is about g-sync, but it has a good explanation of vsync double vs triple buffering, with slides. If you're really interested in how the frame buffer chain works from the GPU to the monitor, it's a nice video to watch.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by RealNC » 25 Apr 2017, 07:59

Btw, in the video I posted, the "vsync backpressure" mentioned by Petersen is the reason the "low latency vsync trick" works. It gets rid of that backpressure. Once you have no backpressure, your input lag is reduced to the natural latency of your current frame rate plus the latency of the frame limiter. For Overwatch at 120Hz, low latency vsync is going to give an input lag of 8.3ms plus the input lag of the frame limiter. The Overwatch frame limiter is fairly low-latency, so you get something like 10ms latency in total (didn't measure anything, it's an educated guess.)
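As a back-of-envelope budget (the limiter cost below is an assumed figure chosen to illustrate the ~10ms guess, not a measurement):

```python
# Rough latency budget for low-lag vsync at 120Hz. The limiter cost is an
# ASSUMED value to illustrate the ~10ms estimate, not a measurement.

refresh_hz         = 120
frame_latency_ms   = 1000.0 / refresh_hz   # ~8.3 ms: natural latency at 120FPS
limiter_latency_ms = 1.7                   # assumed cost of the in-game limiter

print(f"~{frame_latency_ms + limiter_latency_ms:.0f} ms total")   # ~10 ms
```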

Compared to normal vsync without the trick, though, you're looking at input latencies of 20ms or 30ms. The vsync trick can really cut input latency in half or even to a third. And it works with all games, not just Overwatch.

straussmanover
Posts: 6
Joined: 25 Apr 2017, 05:02

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by straussmanover » 25 Apr 2017, 08:09

RealNC wrote:Vsync makes the GPU wait, not the display (vsync is a GPU function, not a display function; monitors do not know what "vsync" is, they just draw a frame every X milliseconds and that's it). The display never waits, since it has a fixed refresh rate. At 120Hz, the display will output 120 times per second and will NEVER wait for anything or for anyone. Unless you're using g-sync or freesync. In that case, it's the display that waits, not the GPU (the reason g-sync/freesync were invented is that having the monitor do the waiting instead of the GPU solves many issues that cannot be solved otherwise).
Ah, ok. That makes much more sense now. Guess I had it backwards.
RealNC wrote:Btw, in the video I posted, the "vsync backpressure" mentioned by Petersen is the reason the "low latency vsync trick" works.
You read my mind, that's exactly what I was beginning to wonder.
Even with that question answered in text, I still plan on watching the video when I get a chance.

Thanks for the amazing info! :D

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by Chief Blur Buster » 25 Apr 2017, 12:46

RealNC wrote:(vsync is a GPU function, not a display function; monitors do not know what "vsync" is, they just draw a frame every X milliseconds and that's it).
Slightly offtopic....but one terminology clarification about the origins of "VSYNC", since this is Blur Busters, y'know. ;)

The precise meaning is somewhat different (but related) on the two ends of the connection (computer versus display), but a form of VSYNC exists (originally analog) on both sides...
Chief Blur Buster wrote:All raster displays (even an antique 1939 television set) need to 'know' about VSYNC.

It's called Vertical Synchronization, which is part of the vertical blanking interval between refresh cycles -- it's a signal that monitors use to synchronize to the next refresh cycle. It has long been part of analog signals of yesteryear and TV broadcasts (it forces the CRT to move the electron gun back to the top for the next refresh cycle). An equivalent is still used in the digital era as a signal to begin monitoring for the next refresh cycle coming into the display over the cable.

Terminology-wise, from the GPU side, VSYNC OFF simply means the GPU doesn't wait for the monitor refresh cycle to begin delivering the frame (it essentially splices the new frame into the scanout of the current frame).

The monitor still needs the blanking interval, but the GPU no longer waits for it to begin delivering the frame. VSYNC is a constant presence in the video signal. From the computer/GPU side, the "VSYNC ON" versus "VSYNC OFF" setting really means "wait on VSYNC". So "VSYNC OFF" simply translates to "don't wait for VSYNC" (a vertical synchronization signal is still part of what's transmitted on display cables -- DVI, HDMI, DP, etc).
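In pseudocode form, the setting is just a wait (a conceptual sketch of mine, not any real driver API):

```python
import time

REFRESH_HZ = 120
REFRESH_S  = 1.0 / REFRESH_HZ

def scan_out(frame):
    # Placeholder: the display draws the signal top-to-bottom regardless
    # of what the GPU decided to do.
    pass

def next_vblank(now: float) -> float:
    # The vertical sync signal arrives on a fixed schedule no matter what.
    return (int(now / REFRESH_S) + 1) * REFRESH_S

def present(frame, wait_on_vsync: bool):
    now = time.monotonic()
    if wait_on_vsync:
        # "VSYNC ON": hold the buffer flip until the vblank.
        time.sleep(next_vblank(now) - now)
    # "VSYNC OFF": flip immediately, splicing mid-scanout (a tear line).
    scan_out(frame)
```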

You may have seen phrases such as "Vertical Sync", "Vertical Front Porch", "Horizontal Sync" in Custom Resolution Utilities (NVIDIA, AMD, ToastyX, etc) -- see Glossary about Custom Resolution Utilities.

Want to know what Vertical Sync, Front Porches, etc. look like in a 1940s analog TV signal? Google Images of "analog VSYNC".

Surprisingly, the overall encapsulation structure of a video signal has changed little since the 1930s. Originally, the porches were guard-pauses for the momentum of accelerating/decelerating an electron gun horizontally and vertically -- otherwise you got skewed edges of images. Today, the porches are still used to give chips enough time to do things such as get ready to capture or scan out the next row of pixels, or as a signal to begin processing/buffering on a new refresh cycle. Although the structure is not nearly as necessary today, it's still used! "Instant Mode" scanout still depends on these signals in order to display pixels at the lowest possible latency, painting them with minimum processing right off the DP/DVI/HDMI/VGA/etc wire... And resolution changes on many signals are still handled by displays simply monitoring the sync signals and then counting the number of scan lines (digital rows of pixels).

It's also possible (e.g. with various upgraded signalling standards) to just tell the display the resolution directly ("I'm sending you a 1920x1080 signal"), but displays even in 2017 can still accept a dumb bitstream (basic DVI/HDMI bitstreams -- and even their DisplayPort equivalent encapsulations -- the one-way digital equivalent of an analog signal: no 2-way communication, no resolution signalling, DDC line severed) and figure out the resolution simply by monitoring the sync/porches and then counting the number of visible (active) pixels. On a VGA signal this pixel counting is "tracking/phasing" (analog adjustments); that's not needed for a digital signal, but the porches/sync are still used by monitors as start/stop markers for counting (in both the vertical and horizontal directions) to figure out the resolution of the signal. If the monitor has its own frame buffer, it monitors the sync/porches to begin capturing at the correct pixel for the start of the frame (vertically) or the start of a row of pixels (horizontally).

Software has co-opted the sync naming from the hardware signal ("VSYNC ON" / "VSYNC OFF"), but its etymology is still the video signal itself: display technology requires synchronization signals to tell a display to begin a refresh cycle (vertical blanking) or to begin a new scan line (horizontal blanking). Originally from the analog era, these synchronization signals remain useful in the digital era (seen in Custom Resolution Utilities) -- and even a modern 240Hz LCD still scans out in realtime, top to bottom (visible under high-speed video), much like a distant digital descendant of a 1939 television set. Even a 1940s TV broadcast had a Front Porch and Back Porch in its signal (both horizontal and vertical); vestigial remnants remain today, made visible in Custom Resolution Utilities. The porches were guard bands between the visible image and the synchronization pulses; without them you sometimes got artifacts such as distorted image edges or stray electron-gun spray (e.g. a gun still in the process of turning on or off while being moved to the beginning of a refresh or scan line). You still see fields like "Horizontal Front Porch" today.

If you've heard the terminology "Reduced Blanking Intervals" / "Large Vertical Totals" -- that's this too. The size of the vertical blanking interval is made smaller or bigger in these cases (and often the horizontal blanking interval as well -- normally there to tell a CRT to move the electron gun back to the left edge for the next scan line, though digital displays need far less time to get ready to begin painting the next row of pixels...).
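For the curious, the arithmetic behind those CRU fields (the example timing numbers below are illustrative, not any particular monitor's EDID):

```python
# refresh rate = pixel clock / (horizontal total * vertical total), where
# each "total" = active pixels + front porch + sync width + back porch.
# Example numbers are illustrative, not a real monitor's EDID values.

h_active, h_front, h_sync, h_back = 1920, 48, 32, 80
v_active, v_front, v_sync, v_back = 1080, 3, 5, 23

h_total = h_active + h_front + h_sync + h_back   # 2080 pixels per scan line
v_total = v_active + v_front + v_sync + v_back   # 1111 lines per refresh

pixel_clock_hz = 277_305_600   # chosen here to land exactly on 120 Hz

print(f"{pixel_clock_hz / (h_total * v_total):.3f} Hz")   # 120.000 Hz
# Nudging the pixel clock (or v_total) slightly is how a 120.010Hz mode is
# built; a "Large Vertical Total" grows v_total, while "Reduced Blanking"
# shrinks the porches/sync so the same refresh needs a lower pixel clock.
```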
Slightly offtopic I know. Just covering all the bases.

Other than that, good advice!
The backpressure knowledge is important in achieving ultra-low-latency VSYNC ON.

straussmanover
Posts: 6
Joined: 25 Apr 2017, 05:02

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by straussmanover » 26 Apr 2017, 03:23

RealNC wrote:Btw, in the video I posted, the "vsync backpressure" mentioned by Petersen is the reason the "low latency vsync trick" works. It gets rid of that backpressure. Once you have no backpressure, your input lag is reduced to the natural latency of your current frame rate plus the latency of the frame limiter. For Overwatch at 120Hz, low latency vsync is going to give an input lag of 8.3ms plus the input lag of the frame limiter. The Overwatch frame limiter is fairly low-latency, so you get something like 10ms latency in total (didn't measure anything, it's an educated guess.)
I watched the video, really enjoyed it. It made VSYNC way easier to understand than anything I've read online. I don't remember him mentioning the term "vsync backpressure," but I could have just missed it. I think I understand, though... (hopefully?) In a very extreme case, say you're playing a game your GPU can run at 300fps, but using VSYNC ON with a 30Hz monitor. The GPU renders the frame in ~3ms, but since VSYNC is on, it can't render the next frame until the scan, leaving it with ~30ms of "idle" time (just generalizing there, I don't know enough about GPUs to know what is actually happening). So the trick works and looks good because the standard VSYNC process is in place (GPU renders -> waits for scan -> renders next frame), except the waiting period is so small it's effectively zero. Additionally, you should never have stuttering so long as your GPU can maintain that constant framerate.
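Putting quick numbers on that extreme case (illustrative only):

```python
# The 300fps-GPU-on-a-30Hz-monitor example above, in numbers (illustration).
render_ms  = 1000.0 / 300   # ~3.3 ms to render one frame
refresh_ms = 1000.0 / 30    # ~33.3 ms between 30Hz refreshes

print(f"GPU idles ~{refresh_ms - render_ms:.1f} ms of every "
      f"{refresh_ms:.1f} ms refresh")   # ~30 ms idle
# The low-lag trick shrinks that wait to near zero by capping frames just
# under the refresh rate, so the queue never backs up but every refresh
# still gets a fresh frame.
```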

straussmanover
Posts: 6
Joined: 25 Apr 2017, 05:02

Re: Overwatch -- Lightboost 120Hz + Fast Sync

Post by straussmanover » 26 Apr 2017, 04:08

Tried the trick out last night; it seemed to work pretty well. Opened up CRU and looked in the extension block: all of the refresh rates were slightly below "advertised" -- 119.990, 143.990, etc. (forgot to screenshot). Bumped it up to 120.010, and everything seemed to work well. Messed around in the training range for about an hour swapping between VSYNC ON and OFF to try to detect a difference. I even tried "blind" tests in an attempt to rule out placebo (closing my eyes and clicking the VSYNC toggle a bunch, so I was unaware of what the current setting was). I wasn't able to tell either way.

Sidenote with CRU: for the sake of messing around, I tried out a few more "aggressive" changes to the refresh setting: 115, 118, 122, 125. After these changes and a driver restart (even a machine restart), I pulled up some UFO tests, and it was always reading 120Hz no matter what I had changed the refresh to, and ULMB was undoubtedly still active. So this has me a bit concerned that the change from 119.990 to 120.010 was having no impact. (Perhaps the monitor or driver has some sort of "deviation limit," not allowing the rate to be changed too far from its intended speed?)

The problem is that I don't have tools to actually measure the input lag. I did some "perceived/noticeable" input lag tests in Overwatch to gauge my tolerance. The difference between VSYNC ON and OFF at 60Hz was certainly noticeable; at 120Hz and above, not so much... (I tested this before I applied the tweak.)
So at the end of the day, I'm stuck between two schools of thought. If I can't perceive the input lag difference between VSYNC ON and OFF, does it REALLY matter which one I choose? On the other side, though, I'd always like to have the lowest input lag, regardless of my perception.
