1. V-Sync On, FPS cap set to just below refresh rate (e.g., 120.012Hz refresh rate, 120fps cap)
2. V-Sync On, FPS cap set to 1 or 2 frames per second below refresh rate (e.g., 120Hz refresh rate, 118 or 119fps cap)
3. V-Sync On, G-Sync On, FPS cap set to 118 (and then 119) with 120Hz refresh rate
4. Fast-Sync On, FPS uncapped at 300, automatically limited by Fast-Sync to 240-260 fps (with no FPS drops) at a 120Hz refresh rate.
5. V-Sync On, FPS cap set to max 300 (so you get the full V-Sync lag effect)
6. Lightboost ON.
My answers are consistent with Jorimt's, with the caveat of "it's more complex than that". Remember, my answers to #1/2/3/4/5/6 are different for screen top edge versus screen centre versus screen bottom, thanks to scanout latency mechanics and lag gradients. But I'm going to aim at "average" latency for simplicity's sake.
1. In an ideal situation, 1/2 frame delay (average screen-centre latency)
2. In an ideal situation, 1/2 frame delay (average screen-centre latency)
3. In proper game engines (input read right before render & delivery), virtually no average lag difference.
However, min/avg/max lag will jitter a lot more for VSYNC OFF (= more stuttering), with min lag for screen centre / screen bottom being lower than GSYNC for locations right below tearlines. Tearslices are lag gradients, and the lag at the top edge of a tearslice is extremely low.
4. In an ideal situation, 1/2 frame delay (average screen-centre latency)
5. Varies hugely. I've seen VSYNC ON implementations with as little as 1-2 frames of delay, but usually it's a god-awful 3 frames of delay.
6. Proper strobing adds an average of half a refresh cycle of latency, but this can vary by strobe phase adjustment: 1-2ms of delay for the top edge, half a refresh cycle of delay for the centre, and a full refresh cycle of delay for the screen bottom.
(But why old-fashioned LightBoost? Why not super-bright, super-beautiful strobing such as 144Hz ULMB on a 240Hz monitor -- blindingly bright and colorful in comparison to dim LightBoost?)
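The "average screen-centre latency" figures above fall out of the scanout lag gradient: a frame is scanned top-to-bottom over one refresh cycle, so a pixel's display latency grows linearly with its vertical position. A minimal sketch of that arithmetic (my own illustration; the function name is made up):

```python
# Sketch (my own illustration, not a Blur Busters tool): scanout lag gradient.
# A frame is scanned onto the panel top-to-bottom over one refresh cycle, so
# latency rises linearly from the top edge to the bottom edge of the screen.

def scanout_lag_ms(refresh_hz: float, screen_fraction: float) -> float:
    """Latency added by scanout for a pixel at `screen_fraction`
    (0.0 = top edge, 0.5 = centre, 1.0 = bottom edge)."""
    refresh_cycle_ms = 1000.0 / refresh_hz
    return screen_fraction * refresh_cycle_ms

for label, frac in [("top", 0.0), ("centre", 0.5), ("bottom", 1.0)]:
    print(f"{label:>6}: {scanout_lag_ms(120, frac):.2f} ms at 120Hz")
```

At 120Hz the screen centre lands at half a refresh cycle (~4.17ms), which is exactly the "1/2 frame delay (average screen-centre latency)" in answers #1/#2/#4.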
Adding other angles to the FastSync vs G-SYNC situation:
Input lag is a very complex topic with lots of subtopics that go far, far beyond the scope of this discussion.
I should add another angle that might be affecting things:
(1) Absolute lag (overall average lag feel)
(2) versus lag jitter (the min/max/avg spread)
(3) versus lag jitter regularity/erraticness (how the values fluctuate between min/max/avg)
All three are different things, and all of them affect aiming.
It's possible to make it easier to do a headshot with more input lag but less randomized lag jitter.
--> Who wants to play at 25ms lag with ±20ms of lag jitter (5ms-45ms erratic random lag)?
--> It's easier to do a headshot at 50ms lag with ±5ms of lag jitter (45-55ms erratic random lag).
Also, all of these below have different aiming difficulties.
--> 25ms evenly-random lag with ±10ms of lag jitter
--> 25ms patterned-random lag with ±10ms of lag jitter
--> 25ms regular cyclic lag with ±10ms of lag jitter
--> 25ms of lag that is exact 50% of the time and random the other 50%, with ±10ms of lag jitter
All the above can have identical min/max/avg of 15/25/35ms!
But very different aiming difficulties!
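To make the claim concrete, here's a toy simulation (my own, not a Blur Busters measurement; all names made up) generating lag traces that share (near-)identical min/avg/max of roughly 15/25/35ms, yet fluctuate in completely different ways between those extremes:

```python
# Toy simulation: four lag traces with (near-)identical min/avg/max of
# roughly 15/25/35 ms, but very different structure between the extremes.
import random

random.seed(1)
N = 1000  # simulate per-frame lag for 1000 frames

evenly_random = [random.uniform(15, 35) for _ in range(N)]        # uniform noise
cyclic        = [25 + 10 * (-1) ** i for i in range(N)]           # regular 35,15,35,15... sawtooth
patterned     = [random.choice([15, 25, 35]) for _ in range(N)]   # random, but quantized to a pattern
mixed         = [25 if i % 2 == 0 else random.uniform(15, 35)     # exact half the time, random half
                 for i in range(N)]

for name, trace in [("evenly-random", evenly_random), ("cyclic", cyclic),
                    ("patterned", patterned), ("mixed", mixed)]:
    lo, avg, hi = min(trace), sum(trace) / len(trace), max(trace)
    print(f"{name:>13}: min={lo:5.1f}  avg={avg:5.1f}  max={hi:5.1f}")
```

All four summary rows look almost the same, which is the point: a min/avg/max table cannot distinguish regular cyclic lag from erratic random lag, but your aim can.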
Visible lag-jitter patterns often only show up when doing thousands of measurements (rather than 10 or 40), basically measuring the exact lag of every single frame, not just the lag of each event.
Remember, we're talking about fast mouse movement (e.g. 8000 pixels/second), where 1ms of random lag variance = 8 pixels of mouse undershoot or overshoot -- so lag randomness (how often, or how regularly patterned, the lag variability is) greatly affects aiming reliability. Flicking fast and reliably stopping the mouse exactly on target, rather than undershooting/overshooting, means less time spent re-adjusting the mouse onto a spot-on headshot.
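The back-of-envelope arithmetic there is simply speed × time (the function name is my own):

```python
# Back-of-envelope from the numbers above: aim error caused by
# unpredictable lag variance during a fast mouse flick.
def aim_error_px(mouse_speed_px_per_s: float, lag_jitter_ms: float) -> float:
    """Pixels of undershoot/overshoot caused by `lag_jitter_ms` of
    unpredictable lag at a given flick speed."""
    return mouse_speed_px_per_s * (lag_jitter_ms / 1000.0)

print(aim_error_px(8000, 1))   # 8.0  -> 8 pixels off per 1ms of jitter
print(aim_error_px(8000, 10))  # 80.0 -> 80 pixels off at 10ms of jitter
```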
Today, lag-erraticness error is still often on tens-of-milliseconds timescales even in eSports, but this is a poorly tested and controlled topic. Many eSports players attuned to the "ultra-low lag jitter" of CS:GO or Quake Live can easily get frustrated by bad aiming in slower FPS games such as Battlefield 4, which has a lot more lag jitter than CS:GO.
Lag jitter might be regular (e.g. a sawtooth, a smooth ramp, etc.) or erratic (e.g. uniformly random, erratic switching between regular & random, etc.). High-speed video tests and single-event lag measurements (button-press lag measurements) don't catch such subtle things. But tools like RTSS and/or photodiode oscilloscopes can, in certain cases -- advanced lag-measuring / lag-extrapolating methods that measure or calculate the lag (or lag differentials) of every single frame, rather than just the lag of single events (button presses). These advanced lag measurement methods capture very detailed lag analyses that are revelatory to how easy/hard it is to aim -- erratic versus regular lag makes a huge difference in aiming accuracy. Converting random lag into cyclic lag (or vice versa) can be beneficial from an aiming perspective -- certain modes like Fast Sync might do exactly that.
We do not have that problem with CS:GO, but it was common back in the olden days. We're used to precise lag behavior, but get frustrated by erratic lag -- imagine a very slow shotgun that randomly fires between 1 and 3 seconds after you press the trigger (example: trying to run Quake on an old 386). You don't know when the shotgun is going to go "boom" if the lag is THAT random. On very, very slow systems (Quake on a 386), the shotgun sometimes feels like an erratic BFG9000 does on faster systems -- muzzle flash coming random seconds after the fire button. Or if you started a big battle in Diablo 3 on a really slow older system at its minimum requirements, it suddenly churns at only 2 frames per second as it hits the hard disk, and your fire button erratically reacts 2-3 seconds later. Today, lag erraticness is still a problem -- some game engines are so bad with random lag effects that you'll get random lag much worse than CS:GO's, even if it's down to hundreds-of-milliseconds timescales rather than seconds-of-randomness timescales.
So.....I think lag jitter is a big effect beyond the scope of this thread and beyond the scope of GSYNC101. It often requires very advanced measurement techniques that most sites and gamers don't use (not even us). But you can bank on it: we'll be researching lag jitter (lag variability) more in the future, as it is an area of science very deserving of study.
Even beat-frequency effects between mouse poll rate, frame rate, refresh rate, engine tick rate, etc., all fight each other, amplifying or reducing lag jitter/variability -- and what most people never consider is how the variability manifests itself. Predictable variability versus random variability makes a difference too.
This is beyond the scope of this thread, but what's important is that this discussion may not consider the fact that Fast Sync may be benefiting from certain things that are beyond high-speed-video testing techniques.
It's possible to have less average lag but a wider min/max/avg spread.
And higher average lag but narrower min/max/avg spread.
Sometimes the two are mutually exclusive. Slower systems (especially with lots of power management / thermal management / etc.) will often have more lag jitter than desktop systems. Lag jitter (regularity & randomness) can also be impacted by better or poorer drivers, GPU brand, heavy power-management overhead (forcing AC mode + Performance + lots of cooling can reduce lag jitter), a slower CPU with a faster GPU, a faster CPU with a slower GPU, etc.
For certain kinds of games, less lag jitter is more important than absolute lag. In racing and flight simulators, we often tolerate lag in throttle/gas/steering response; we're used to it. A laggier system simply feels like a laggier car/plane, which actually happens in real life, and we often adapt/adjust. And if we're not competing against online players, absolute lag isn't as important as low lag jitter (an erratic-lag steering wheel is "bad"; a predictable-lag steering wheel is "acceptable"). It depends on the game -- and whether or not we are competing against other humans online (or simply playing solo).
For non-GSYNC monitors, Fast Sync can improve the experience at frame rates far higher than refresh rate, especially if you're trying to have a different lag-jitter behavior than VSYNC OFF. Also, whenever using ULMB, I do notice that Fast Sync feels less microstuttery than VSYNC OFF when framerates far exceed refresh rate (e.g. 300fps @ 120Hz). So Fast Sync can be a lag-reducing compromise for improving strobed motion.
Jorimt is correct that Fast Sync is a brute-force approach. It is simply the GPU rendering as rapidly as possible, and delivering only the freshest frame to the monitor. It certainly produces different lag jitter mechanics (which may or may not be better than VSYNC OFF for some people). Lag jitter and microstutter are dependent on framerate, so Fast Sync sometimes shines at frame rates far higher than refresh rate -- in certain cases.
The great thing is that Fast Sync is 100% compatible with ULMB, even though Fast Sync is indeed an inefficient brute-force technique. And if you're stuck with a choice of 300fps@240Hz VSYNC OFF versus 300fps@240Hz Fast Sync, there are some situations where Fast Sync looks preferable. Fast Sync is also useful as a lower-lag LightBoost/ULMB mode: once it is given an overkill framerate (>300fps@120Hz), the microstutters greatly diminish (albeit not gone), and it often looks better than VSYNC OFF during strobed modes, since the microstuttering is almost the same (at sufficiently overkill margins) yet the tearing is gone. ULMB/LightBoost amplifies the visibility of artifacts -- tearlines are easier to see without motion blur -- so Fast Sync becomes useful here too.
However, when not using ULMB (which you can't easily do with GSYNC on most monitors) -- when having a choice of GSYNC and Fast Sync, I've always found GSYNC to universally have the best aiming and "lower-lag-than-FastSync" experience (at frame rates less than refresh rate). Things do diverge when you can hit framerates far above refresh rate -- this is where VSYNC OFF and Fast Sync can help.
That said, even without ULMB or GSYNC:
Lag-jitter harmonics and microstutter harmonics behave slightly differently for VSYNC OFF versus Fast Sync, and aiming may even be easier with Fast Sync depending on the user's vision.
-- 300fps@240Hz FastSync is a single full-frame drop 1 out of every 5 frames
240 frames visible, and every refresh cycle is always a complete frame
-- 300fps@240Hz VSYNC OFF gives you 300 "4/5th-screen-height" tearslices per second
You always have a portion of all 300 frames visible, as randomly placed tearslices, many of them overlapping two refresh cycles
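A quick sanity check of the 300fps @ 240Hz arithmetic (my own back-of-envelope, not from any measurement tool):

```python
# Back-of-envelope frame accounting for 300fps rendered on a 240Hz panel.
from fractions import Fraction

fps, hz = 300, 240

# Fast Sync: GPU renders `fps` frames/s, monitor displays `hz` complete
# frames/s, so the surplus frames are silently dropped.
dropped_per_second = fps - hz                    # 60 frames dropped per second
drop_ratio = Fraction(dropped_per_second, fps)   # reduces to 1/5: one drop per 5 renders

# VSYNC OFF: each frame lasts 1/fps s while a full scanout takes 1/hz s,
# so each tearslice covers hz/fps of the screen height.
tearslice_height = Fraction(hz, fps)             # reduces to 4/5 of screen height

print(drop_ratio, tearslice_height)              # 1/5 4/5
```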
For framerates lower than the monitor's max Hz, GSYNC tended to have fairly consistently lower lag than Fast Sync (when done properly!).
However, things change when you go to framerates far above refresh rate (like in our VSYNC OFF tests). 1000fps Fast Sync will still have less lag than, say, 142fps@144Hz GSYNC. Microstutter does gradually diminish with Fast Sync the further the frame rate climbs above refresh rate.
The big question is -- for some people, is capping to fps_max 238 at 240Hz GSYNC better -- or having access to Fast Sync 300fps at 240Hz?
Mathematically, GSYNC has the better numbers, but in real life you might randomly get a fresher "button-to-pixels" frame during 300fps@240Hz. However, not all engines are created equal, and their input-read behaviors don't always play well with frame-capping behaviors -- you might run into a game engine with an occasionally tighter framebuffer workflow (even if only 1 out of 10 times) during 300fps@240Hz Fast Sync than the consistent framebuffer workflow of 238fps@240Hz GSYNC. I haven't tested that scenario, but I understand programming well enough to see that edge cases like these might happen (....just covering bases here...). Sometimes the numbers line up to the point where consistent lag harmonics can mean easier aiming despite bigger stutter. This is almost impossible to reliably test for, and the best that can be done is staring at graph data to analyze for patterns. (RTSS is one tool that gives a faint glimpse of the lurking complexity, but even it is woefully incomplete for capturing the entire scope of this post.)
At the end of the day, root causes of gaming-performance improvements can be far beyond the scope of the original post of this thread (and far beyond the scope of GSYNC 101). Especially when it comes to frame-by-frame lag and how the frame-by-frame lag varies (regularly or randomly), and how it all affects (improves/hurts) aiming. Confirmed edge cases happen that can definitely create better aiming at higher lag, or worse aiming at lower lag, etc.
There is a lot of science to study in input lag, and people have created full time careers over just the analysis of response/lag.
And hey, this is a tiny glimpse of the complexity of the topic that input lag is! In this post, I have not even gone into topics such as the "input lag gradients" of tear slices and the like. We have now come up with a simplified filmreel-metaphor diagram format that explains this much more easily to end users, and we will write more articles explaining the relationships between scanout latency, tearslice latency, lag gradients of tearslices, and refresh cycle latency -- even accounting for the blanking interval between refresh cycles (our trademark all-in-one filmreel metaphor). The time ruler at the right edge also doubles as a lag-gradient ruler, for understanding the difference in input lag between the top edge and bottom edge of a tearslice.
The Lag Gradient Effect of Frame Slices during VSYNC OFF
The Lag Gradient Effect of Display Scanout
And there's One More Thing. Lag Symmetry Complexity!
The symmetry between cable scanout and display scanout: the way pixels are transmitted over the cable (VSYNC OFF takes advantage of that), versus the way pixels are refreshed onto the screen. Cable-vs-panel scan is not always symmetric, especially if there's buffering going on, different scan velocities, scan conversion (e.g. DLP, plasma subfields), or other differences in scan. But many eSports displays are capable of symmetric cable-vs-panel scan (in "Instant Mode" or its vendor equivalents).
Using VSYNC ON or FastSync fixes lag asymmetry for strobing, while VSYNC OFF fixes lag asymmetry for CRT/non-strobed/etc.
-- VSYNC ON/FastSync is global (like strobing flash is always global)
-- VSYNC OFF is sequential (tearing on the fly in scanout) which can be in sync with CRT scanout and LCD scanout.
This keeps lag differentials consistent for top/center/bottom of screen, so there's no difference between top edge of screen or bottom edge of screen.
Scanout latency can be symmetric with cable scanout latency. At ultra-high frame rates, VSYNC OFF lag jitter error is always about one frametime (1/framerate), so it progressively decreases the higher the framerate goes, until at 1000fps the VSYNC OFF lag jitter error margin shrinks to just 1ms. Because each frame is 1/1000sec long at 1000fps, the lag gradient between the top/bottom edges of frame slices (between two tearlines) is only 1ms, and as a result, with randomly placed frame slices, your lag jitter is only 1ms at 1000fps. However, if you are playing VSYNC OFF at 100fps, your lag jitter is 10ms -- big enough to throw off aiming. This is another reason why headshots are easier at higher frame rates -- even at frame rates far higher than refresh rate. Even 300fps versus 1000fps is quite noticeable in aiming precision at 144Hz, since you're turning 1/300sec of lag jitter into 1/1000sec of lag jitter.
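That rule of thumb is easy to express as code (my own sketch of the arithmetic in this paragraph):

```python
# Rule of thumb from above: VSYNC OFF lag-jitter error margin is roughly
# one frametime (1/framerate), independent of the refresh rate.
def vsync_off_jitter_ms(fps: float) -> float:
    """Approximate VSYNC OFF lag jitter margin at a given framerate."""
    return 1000.0 / fps

for fps in (100, 300, 1000):
    print(f"{fps:>5} fps -> ~{vsync_off_jitter_ms(fps):.2f} ms lag jitter")
```

At 100fps the margin is ~10ms (enough to throw off a flick), at 1000fps it collapses to ~1ms, which is why overkill framerates keep paying off even far above refresh rate.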
TL;DR: "It's more complex than the questions you're asking." There are complex topics like lag differences/symmetry for top/center/bottom of screen, lag gradients of frame slices (between tearlines), and lag symmetry (between cable scanout and panel scanout). All of them affect lag jitter, and can cause different lag jitter for different parts of the screen. Lag is a VERY complex topic. Many of the "subtle" questions being asked are far beyond the scope of the GSYNC101 series. Our coverage of various input lag angles will increase as time goes on. Stay tuned.