
Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 17:46
by drmcninja
Also your tests indicate that Fast-Sync with 118 fps cap and 300 fps cap at 120Hz have identical input latency. They feel different to play is what I'm saying.

Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 18:00
by jorimt
drmcninja wrote:The end effect of having 240 mouse inputs a second being reflected in your GPU's processed frames seems to be an increased granularity/fineness to the crosshair movement.
Nope. An "increased granularity/fineness to the crosshair movement" would require more frames to actually be displayed on screen for you to see and react to. With Fast Sync at 120Hz/240 FPS, no more than 120 frames per second are being displayed, and the positional differences between those frames wouldn't make a difference either.

G-SYNC + V-SYNC + a 117-118 in-game FPS limit (with sustained 120+ FPS) at 120Hz will always deliver the same frames faster and more consistently than Fast Sync + uncapped FPS (in Overwatch with its 300 FPS hard limit; under a 5x ratio) at 120Hz.

And even with G-SYNC, the in-game limiter still allows individual frametimes to dip below the target/framerates to momentarily exceed the FPS limit, which is how it reduces input latency over external limiters in the first place (effectively the same thing uncapped Fast Sync does).

You are testing with the in-game fps limiter, right?
drmcninja wrote:Also your tests indicate that Fast-Sync with 118 fps cap and 300 fps cap at 120Hz have identical input latency. They feel different to play is what I'm saying.
That's because Fast Sync at 120Hz/118 FPS is going to be repeating occasional frames (like traditional V-SYNC with an FPS limit below the refresh rate when it misses a fixed delivery window), whereas Fast Sync at 120Hz/300 FPS is going to be dropping excess frames instead.

G-SYNC does neither.
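To make the repeat/drop distinction concrete, here's a rough toy simulation of the frame-selection logic (not NVIDIA's actual implementation; it assumes perfectly even frametimes): at every refresh tick, display the newest completed frame.

```python
# Toy model of Fast Sync frame selection at a fixed refresh rate.
# Assumes perfectly even frametimes and ignores scanout; real frametimes
# jitter, but the repeat-vs-drop behavior is the same in principle.

def fast_sync(render_fps, refresh_hz, duration_s=1.0):
    render_done = [(i + 1) / render_fps for i in range(int(render_fps * duration_s))]
    shown = []
    for tick in range(1, int(refresh_hz * duration_s) + 1):
        t = tick / refresh_hz
        ready = [i for i, done in enumerate(render_done) if done <= t]
        shown.append(ready[-1] if ready else None)   # newest completed frame wins
    repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b and a is not None)
    dropped = len(render_done) - len({f for f in shown if f is not None})
    return repeats, dropped

print(fast_sync(118, 120))  # (repeats, dropped): frames get repeated occasionally, none dropped
print(fast_sync(300, 120))  # (repeats, dropped): no repeats, ~180 rendered frames/sec discarded
```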

Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 18:35
by drmcninja
You're saying for the increased granularity, there needs to be quantitative improvement in the frames (more of them). I'm saying there are qualitatively better frames.

Take 120 frames rendered in one second by the GPU.

Take 240 frames rendered in one second by the GPU.

If you use every other frame from that second batch, you will see a qualitatively different set of mouse positions between the two. It might be due to how the game engine processes mouse input.

Like this (V-Sync/G-Sync 120 would be 0, 8.33, 16.67):

At t=0, mouse position is 1.
At t=4.17ms, mouse position is 1.5 (Only possibly seen in Fast Sync)
At t=8.33ms, mouse position is 2. (For some reason, in Fast-Sync, this mouse position is actually different a lot of the time)
At t=12.50ms, mouse position is 2.5 (Only possibly seen in Fast Sync)
At t=16.67ms, mouse position is 3. (For some reason, in Fast-Sync, this mouse position is actually different a lot of the time)

In other words:

At 120Hz with V/G Sync:

At t=0, mouse position is 1.
At t=8.33ms, mouse position is 2.
At t=16.67ms, mouse position is 3.

At 120Hz with Fast Sync, using the exact same mouse input, something like this is happening:

At t=0, mouse position is 1.
At t=4.17ms, mouse position is 1.4
At t=8.33ms, mouse position is 1.9
At t=12.50ms, mouse position is 2.4.
At t=16.67ms, mouse position is 3.6.

(just using random values)

Somehow, that intermediate measurement changes the way it gets the following one. Mouse smoothing perhaps? My mouse is the Nixeus Revel, which has the PMW3360 sensor; I have the updated firmware and it's running at 900 DPI, so there should be as little smoothing as possible.
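To spell out the toy timeline above (a hypothetical sketch which assumes the crosshair position is purely a function of wall-clock time, sampled at each render): under that idealized assumption, taking every other 240fps sample reproduces the 120fps samples exactly, so any real difference would have to come from how the engine processes input between samples, as speculated above.

```python
# Toy model: crosshair position as a pure function of wall-clock time,
# sampled at render times (hypothetical; real engines accumulate raw input
# between frames, which is where differences could creep in).

def position(t_ms):
    return 1.0 + t_ms / (1000 / 120)   # moves 1.0 unit per 120Hz refresh interval

t_120 = [i * 1000 / 120 for i in range(3)]   # 0, 8.33, 16.67 ms
t_240 = [i * 1000 / 240 for i in range(5)]   # 0, 4.17, ..., 16.67 ms

print([round(position(t), 2) for t in t_120])        # [1.0, 2.0, 3.0]
print([round(position(t), 2) for t in t_240[::2]])   # every other 240fps sample: also [1.0, 2.0, 3.0]
```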

I just tried 300fps, 240Hz, Fast-Sync On. Initially I didn't use this because it was stuttery (since 300fps is not much above 240Hz). I took a minute or two to get used to that, then played, and it feels almost as good as 240Hz, 300fps no sync. I can aim the same way, hit headshots the same way. It feels way better in terms of aiming, despite the stuttering, than V-Sync On at either 240.004fps @ 240.011Hz (RTSS) or 238fps @ 240Hz (in-game limiter). There is definitely a lag difference between Fast-Sync On and Off at these settings, but with Fast-Sync On, I can aim almost the same, but there's less stuttering (because with 300fps/240Hz, no sync, it's very stuttery and tearing is noticeable). So I'm probably going to stick between Fast-Sync 300fps and No Sync 300fps for now.

According to your measurements, the input lag difference here is like 1ms if that. This isn't input lag, it's something else going on with the mouse.

Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 18:44
by drmcninja
This might be an effect similar to what the Reddit user described as "subframe input polling": even at 60fps/60Hz, a game like Reflex can poll input data at 1000Hz, so qualitatively the input reflected in those 60 frames rendered by the GPU and sent to the display is different than when playing Overwatch at 60fps/60Hz.

As fps increases, the difference decreases dramatically, but is still noticeable.

His test showed mouse input is being buffered into the next frame. I have a feeling this is what's messing with Fast-Sync and making it feel different.

So Fast-Sync 300fps (assuming your GPU/system are rendering at 300fps without a problem) gives you better mouse "feel", plus synced frames/refresh, at the expense of some input lag when compared to anything at a lower FPS.
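Rough arithmetic for both effects above (illustrative only, not a measurement): if input is sampled only once per frame, the worst-case sampling error is one frametime, and if the input then slips into the next frame, that adds roughly one more frametime -- and both shrink as fps climbs.

```python
# Illustrative arithmetic only: per-frame input sampling error and the extra
# delay when input is buffered into the next frame are each about one frametime.

def frametime_ms(fps):
    return 1000.0 / fps

for fps in (60, 120, 240, 300):
    ft = frametime_ms(fps)
    print(f"{fps:>3} fps: frametime {ft:5.2f} ms | "
          f"sampling error up to {ft:5.2f} ms | next-frame buffering adds ~{ft:5.2f} ms")
```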

Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 18:51
by drmcninja
https://www.reddit.com/r/Competitiveove ... _the_next/

See the edits in the original post.
CSGO and Quake Live is also tested to suffer from this issue, but uncapped framerate alleviates the issue at extremely high framerates. This is what was observed by u/3kliksphilip in his video, but he mistakenly attributed responsiveness to output latency. Output latency does contribute partially, but it is predominantly the timing granularity of your inputs that is the underlying mechanism behind the perceived, and actual, responsiveness at extremely high framerates. Output latency primarily affects perceived smoothness, while input latency directly influences responsiveness.
This guy in that video is referring to exactly the feeling I'm talking about.

Re: How many frames of lag are there with the following

Posted: 04 Sep 2017, 20:51
by jorimt
Until we can absolutely rule out the possibility that G-SYNC isn't working correctly on your laptop, and confirm that Fast Sync is working on your laptop (it has had issues with Overwatch in the past where, when enabled, it was actually just V-SYNC OFF), all I'm doing is speculating about your personal experience.

What I'm trying to explain to you is that Fast Sync is trying to achieve the same thing as G-SYNC, only it's doing it more poorly and by brute forcing its way there. If Overwatch's "mouse input is being buffered into the next frame" (which isn't uncommon in games), it would affect both methods equally (Fast Sync would, in fact, be more intermittent due to dropped frames, so maybe your mouse appears smoother to you with more uneven performance? Mouse specs/settings, by the way?).

If you fully understood the inner workings of the two syncing methods, you'd understand why what you're positing makes no sense, at least if you are attributing the issue to the difference between those two specific syncing methods within those specific scenarios.

If the issue exists, it is being caused by something else.

Also, it's too bad your 240Hz monitor doesn't have G-SYNC for you to test against Fast Sync, because, again, I still can't rule out laptop/desktop G-SYNC implementation differences, a glitch, a driver issue, an improper setup, etc.

Re: How many frames of lag are there with the following

Posted: 05 Sep 2017, 15:49
by Chief Blur Buster
1. V-Sync On, FPS cap set to just below refresh rate (i.e, 120.012Hz refresh rate, 120fps cap)
2. V-Sync On, FPS cap set to 1 or 2 frames per second below refresh rate (i.e, 120Hz refresh rate, 118 or 119fps cap)
3. V-Sync On, G-Sync On, FPS cap set to 118 (and then 119) with 120Hz refresh rate
4. Fast-Sync On, FPS uncapped at 300, Automatically limited by Fast-Sync to 240-260 fps (with no FPS drops) at a 120Hz refresh rate.
5. V-Sync On, FPS cap set to max 300 (so you get the full V-Sync lag effect)
6. Lightboost ON.
My answers are consistent with Jorimt's, with the caveat that "it's more complex than that". Remember, my answers to #1/2/3/4/5/6 are different for the screen top edge versus screen centre versus screen bottom, thanks to scanout latency mechanics and lag gradients. But I'm going to aim for "average" latency for simplicity's sake.

1. In an ideal situation, 1/2 frame delay (average screen-centre latency)
2. In an ideal situation, 1/2 frame delay (average screen-centre latency)
3. In proper game engines (input read right before render & delivery), virtually no average lag difference.
However, min/avg/max lag will jitter a lot more for VSYNC OFF (= more stuttering), and min lag for screen centre / screen bottom will be lower than G-SYNC in places right below tearlines. Tearslices are lag gradients, and the lag at the top edge of a tearslice will be extremely low.
4. In an ideal situation, 1/2 frame delay (average screen-centre latency)
5. Varies hugely. I've seen VSYNC ON implementations with as little as 1-2 frames of delay, but usually it's a god-awful 3 frames of delay.
6. Proper strobing adds, on average, half a refresh cycle of latency, but this can vary with strobe phase adjustment.
Roughly 1-2ms delay for the top edge, half a refresh cycle for the centre, and a full refresh cycle for the screen bottom.

(But why old-fashioned LightBoost? Why not super-bright, super-beautiful strobing such as 144Hz ULMB on a 240Hz monitor? Blindingly bright and colorful in comparison to dim LightBoost.)
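To put rough numbers on the "average screen-centre latency" wording above, here's a simplified scanout-delay sketch (it assumes the frame is handed off right at the start of a refresh and that scanout sweeps top-to-bottom over one full refresh cycle, with the blanking interval ignored):

```python
# Simplified scanout-delay arithmetic (blanking interval ignored).

def scanout_delay_ms(refresh_hz, screen_fraction):
    """Extra delay before a pixel at the given height (0 = top, 1 = bottom) gets drawn."""
    return (1000.0 / refresh_hz) * screen_fraction

hz = 120
for label, frac in (("top edge", 0.0), ("centre", 0.5), ("bottom edge", 1.0)):
    print(f"{label:>11}: +{scanout_delay_ms(hz, frac):5.2f} ms at {hz}Hz")
```

At 120Hz that works out to +0ms / +4.17ms / +8.33ms for top/centre/bottom, which is where the "1/2 frame delay (average screen-centre latency)" figures in #1/#2/#4 come from.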

Adding other angles to the FastSync vs G-SYNC situation:

Input lag is a very complex topic with lots of subtopics that go far, far beyond scope of discussion.

I should add another angle, that might be affecting things.

(1) Absolute lag (overall average lag feel)
(2) versus lag jitter (the min/max/avg spread)
(3) versus lag jitter regularity/erraticness (how the values fluctuate between min/max/avg)

All three are different things, and all of them affect aiming.

It's possible to make it easier to do a headshot with more input lag but less randomized lag jitter.
--> Who wants to play at 25ms lag with ±20ms of lag jitter (5ms-45ms erratic random lag)?
--> It's easier to do a headshot at 50ms lag with ±5ms of lag jitter (45-55ms erratic random lag).

Also, all of these below have different aiming difficulties.
--> 25ms evenly-random lag with ±10ms of lag jitter
--> 25ms patterned-random lag with ±10ms of lag jitter
--> 25ms regular cyclic lag with ±10ms of lag jitter
--> 25ms of lag that is exact 10% of the time, random 50% of the time, with ±10ms of lag jitter

All the above can have identical min/max/avg of 15/25/35ms!
But very different aiming difficulties!
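Here's a quick illustration (purely hypothetical numbers) of how different lag sequences can share roughly the same min/avg/max yet behave completely differently frame to frame:

```python
# Hypothetical numbers: four lag sequences with (nearly) the same min/avg/max
# around 15/25/35 ms, but very different frame-to-frame behavior.
import math
import random
import statistics

random.seed(1)
n = 240  # e.g. one second's worth of frames at 240fps

even_random = [random.uniform(15, 35) for _ in range(n)]                          # evenly random
cyclic      = [25 + 10 * math.sin(2 * math.pi * i / 12) for i in range(n)]        # regular cyclic
patterned   = [15, 25, 35] * (n // 3)                                             # fixed repeating pattern
mixed       = [25 if i % 10 == 0 else random.uniform(15, 35) for i in range(n)]   # exact sometimes, random otherwise

for name, seq in (("even random", even_random), ("cyclic", cyclic),
                  ("patterned", patterned), ("mixed", mixed)):
    swing = statistics.mean(abs(b - a) for a, b in zip(seq, seq[1:]))
    print(f"{name:>12}: min {min(seq):4.1f}  avg {statistics.mean(seq):4.1f}  "
          f"max {max(seq):4.1f} ms | avg frame-to-frame swing {swing:4.1f} ms")
```

The summary columns look nearly identical; the frame-to-frame swing (and how predictable it is) is what separates them.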

Visible lag-jitter patterns often only show up when doing thousands of measurements (rather than 10 or 40 measurements), basically measuring the exact lag of every single frame, not just lag of each event.

Remember, we're talking about fast mouse movement (e.g. 8000 pixels/second), where 1ms of random lag variance = 8 pixels of mouse undershoot or overshoot -- so lag randomness (how often or how regularly patterned the lag variability is) greatly affects aiming reliability. Flicking fast and reliably stopping the mouse exactly on target, rather than undershooting/overshooting, = less time spent moving the mouse again to fine-adjust to a spot-on headshot.

Today, lag-erraticness error is still often on tens-of-milliseconds timescales even in eSports, but this is a poorly tested and poorly controlled topic. Many eSports players attuned to the "ultra-low lag jitter" of CS:GO or Quake Live can easily get frustrated at bad aiming in slower FPS games such as Battlefield 4, which has a lot more lag jitter than CS:GO.

Lag jitter might be regular (e.g. a sawtooth, a smooth ramp, etc.) or erratic (e.g. regular random, erratic switching between regular & random, etc.). High-speed video tests, single-reading measurements, and single-event lag measurements (button lag measurements) don't catch such subtle things. But things like RTSS and/or photodiode oscilloscopes can in certain cases -- advanced lag-measuring / lag-extrapolating methods that measure or calculate the lag (or lag differentials) of every single frame, rather than just the lag of single events (button presses). Those advanced lag measurement methods produce very detailed lag analyses that are revelatory as to how easy/hard it is to aim -- erratic versus regular lag makes a huge difference in aiming accuracy. Converting random lag into cyclic lag or vice-versa (certain modes like Fast Sync might do that) can be beneficial from an aiming perspective.

We do not have that problem with CS:GO, but it was common back in the olden days. We're used to precise lag behavior, but get frustrated by erratic lag -- imagine a very slow shotgun that randomly fires between 1 and 3 seconds after pressing the trigger (example: trying to run Quake on an old 386). You don't know when the shotgun is going to go "boom" if the lag is THAT random. On very, very, very slow systems (Quake on a 386), the shotgun sometimes feels like an erratic BFG9000 does on faster systems -- the muzzle flash coming random seconds after the fire button. Or if you started a big battle in Diablo 3 on a really slow older system at its minimum requirements, it suddenly churns at only 2 frames per second as it hits the hard disk, and your fire button erratically reacts 2-3 seconds later. Today, lag erraticness is still a problem -- some game engines are so bad with random lag effects that they are much worse than CS:GO, even if it's down to hundreds-of-milliseconds timescales rather than seconds of randomness.

So.....I think lag jitter is a big effect beyond the scope of this thread and beyond the scope of GSYNC101. It often requires very advanced measurement techniques that most sites and gamers don't do (not even us). But you can bank on it: We'll be researching lag jitter (lag variability) more in the future, as it is an area of science very deserving of study.

Even beat-frequency effects between mouse poll rate, frame rate, refresh rate, engine tick rate, etc., all fight each other in creating amplified/reduced lag jitter/variability -- and what most people never consider is how the variability manifests itself. Predictable variability versus random variability makes a difference too.

This is beyond the scope of this thread, but, what's important is that this discussion may not consider the fact that Fast Sync may be benefitting from certain things that are beyond high-speed-video testing techniques.

It's possible to have less average lag but a wider min/max/avg spread.
And higher average lag but narrower min/max/avg spread.

Sometimes they're mutually exclusive. Slower systems (especially with lots of power management / thermal management / etc) will often have more lag jittering than desktop systems. Lag jitter (regularity & randomness) can also be impacted by better or poorer drivers, GPU brand, heavy power management overhead (forcing AC mode + Performance + lots of cooling can reduce lag jitter), slower CPU-versus-faster GPU, faster CPU-versus-slower GPU, etc.

For certain kinds of games, less lag jitter is more important than absolute lag. In racing and flight simulators, we often tolerate lag in throttle/gas/steering response; we're used to it. A laggier system simply feels like a laggier car/plane, which actually happens in real life, and we often adapt/adjust. And if we're not competing against online players, absolute lag isn't as important as low lag jitter (an erratic-lag steering wheel is "bad", a predictable-lag steering wheel is "acceptable"). It depends on the game -- and whether or not we are competing against other humans online (or simply playing solo).

For non-GSYNC monitors, Fast Sync can improve experiences at frame rates far higher than refresh rates especially if you're trying to have a different lag-jitter behavior than VSYNC OFF. Also, whenever using ULMB -- I do notice that Fast Sync feels less microstuttery than VSYNC OFF when framerates far exceed refresh rate (e.g. 300fps @ 120Hz). So Fast Sync can be a lag-reducing compromise for improving strobed motion.

Jorimt is correct that Fast Sync is a brute-force approach. It is simply the GPU rendering as rapidly as possible, and delivering only the freshest frame to the monitor. It certainly produces different lag jitter mechanics (which may or may not be better than VSYNC OFF for some people). Lag jitter and microstutter are dependent on framerate, so Fast Sync sometimes shines at frame rates far higher than the refresh rate -- in certain cases.

The great thing is that Fast Sync is 100% compatible with ULMB, even though Fast Sync is indeed an inefficient brute-force technique. And if you're stuck with a choice of 300fps@240Hz VSYNC OFF versus 300fps@240Hz Fast Sync, there are some situations where Fast Sync looks preferable. Fast Sync is also useful as a lower-lag LightBoost/ULMB mode, since once it is given an overkill framerate (>300fps@120Hz), the microstutters greatly diminish, albeit not completely, and it often looks better than VSYNC OFF during strobed modes since the microstuttering is almost the same (at sufficiently overkill margins) yet the tearing is gone. ULMB/LightBoost does amplify the visibility of artifacts -- tearlines are easier to see without motion blur -- so Fast Sync becomes useful here too.

However, when not using ULMB (which you can't easily do with GSYNC on most monitors) -- when having a choice of GSYNC and Fast Sync, I've always found GSYNC to universally have the best aiming and "lower-lag-than-FastSync" experience (at frame rates less than refresh rate). Things do diverge when you can hit framerates far above refresh rate -- this is where VSYNC OFF and Fast Sync can help.

That said, even without ULMB or GSYNC:

Lag-jitter harmonics and microstutter harmonics behave slightly differently for VSYNC OFF versus Fast Sync, and aiming may even be easier with Fast Sync depending on the user's vision.
-- 300fps@240Hz Fast Sync drops one full frame out of every 5 rendered
240 frames visible per second; every refresh cycle is always a complete frame
-- 300fps@240Hz VSYNC OFF gives you 300 "4/5th-screen-height" tearslices per second
You always have a portion of all 300 frames visible, as randomly placed tearslices, many of them overlapping two refresh cycles
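The arithmetic behind those two bullets (a rough sketch that ignores the blanking interval):

```python
# Quick arithmetic for 300fps @ 240Hz, ignoring the blanking interval.
render_fps, refresh_hz = 300, 240

dropped_per_sec = render_fps - refresh_hz        # Fast Sync discards the excess frames
print(f"Fast Sync: {dropped_per_sec} of {render_fps} rendered frames dropped per second "
      f"(1 in {render_fps // dropped_per_sec})")

slice_height = refresh_hz / render_fps           # fraction of screen scanned out per new frame
print(f"VSYNC OFF: ~{render_fps} tearslices per second, each ~{slice_height:.0%} of screen height")
```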

For framerates lower than the monitor's max Hz, G-SYNC fairly consistently had lower lag than Fast Sync (when done properly!).
However, things change when you go to framerates far above the refresh rate... (as in our VSYNC OFF tests). 1000fps Fast Sync will still have less lag than, say, 142fps@144Hz G-SYNC. Microstutter also gradually diminishes with Fast Sync the farther the frame rate climbs above the refresh rate.

The big question is -- for some people, is capping to fps_max 238 at 240Hz GSYNC better -- or having access to Fast Sync 300fps at 240Hz?

Mathematically, GSYNC has the better numbers, but in real life, you might randomly get a fresher "button to pixels" frame during 300fps@240Hz. However, not all engines are created equal, and their input-read behaviors don't always play well with frame-capping behaviors -- you might run into a game engine with an occasionally tighter framebuffer workflow (even if only 1 out of 10 times) during 300fps@240Hz Fast Sync than the consistent framebuffer workflow that occurs with 238fps@240Hz GSYNC. I haven't tested for that scenario, but I understand programming well enough to see that edge cases like these might happen (....just covering bases here...). Sometimes the numbers line up to the point where consistent lag harmonics can mean easier aiming despite bigger stutter. This is almost impossible to reliably test for, and the best that can be done is staring at graph data to analyze for patterns. (RTSS is one tool that gives a faint glimpse of the lurking complexity, but even that is woefully incomplete for capturing the entire scope of this post.)

At the end of the day, root causes of gaming-performance improvements can be far beyond the scope of the original post of this thread (and far beyond the scope of GSYNC 101). Especially when it comes to frame-by-frame lag and how the frame-by-frame lag varies (regularly or randomly), and how it all affects (improves/hurts) aiming. Confirmed edge cases happen that can definitely create better aiming at higher lag, or worse aiming at lower lag, etc.

There is a lot of science to study in input lag, and people have created full time careers over just the analysis of response/lag.

And hey, this is a tiny glimpse of the complexity of the topic that input lag is! In this post, I have not even gone into topics such as the "input lag gradients" of tear slices and stuff like that. We have now come up with a simplified filmreel-metaphor diagram format that explains this much more easily to end users, and we will write more articles that explain the relationship of scanout latency, tearslice latency, lag gradients of tearslices, and refresh cycle latency, and even account for the blanking interval between refresh cycles (our trademark all-in-one filmreel metaphor). The time ruler at the right edge is also a lag-gradient ruler, for understanding the difference in input lag between the top edge of a tearslice and the bottom edge of a tearslice.

The Lag Gradient Effect of Frame Slices during VSYNC OFF

Image

The Lag Gradient Effect of Display Scanout

Image

And there's One More Thing. Lag Symmetry Complexity!

The symmetry between cable scanout and display scanout: the way the pixels are transmitted over the cable (which VSYNC OFF takes advantage of), versus the way the pixels are refreshed onto the screen. Cable-vs-panel scan is not always symmetric, especially if there's buffering going on, or different scan velocities, or scan conversion (e.g. DLP, plasma subfields), or other differences in scan. But many eSports displays are capable of symmetric cable-vs-panel scanout (in "Instant Mode" or their vendor equivalents).

Using VSYNC ON or Fast Sync fixes lag asymmetry for strobing, while VSYNC OFF fixes lag asymmetry for CRT/non-strobed/etc.
-- VSYNC ON/Fast Sync is global (like a strobe flash, which is always global)
-- VSYNC OFF is sequential (tearing on the fly during scanout), which can be in sync with CRT scanout and LCD scanout.
This keeps lag differentials consistent for the top/center/bottom of the screen, so there's no difference between the top edge of the screen and the bottom edge.

Scanout latency can be symmetric with cable scanout latency, especially at ultra-high frame rates: for VSYNC OFF, the lag jitter error is always one frametime (1/framerate), so VSYNC OFF lag jitter progressively decreases the higher the framerate goes, until 1000fps shrinks the VSYNC OFF lag jitter error margin to just 1ms. Because each frame is 1/1000sec long at 1000fps, the lag gradient between the top/bottom edges of frame slices (between two tear lines) is only 1ms, and as a result, with randomly-placed frame slices, your lag jitter is only 1ms at 1000fps. However, if you are playing VSYNC OFF at 100fps, your lag jitter is 10ms -- big enough to throw off aiming. This is another reason why headshots are easier at higher frame rates -- even at frame rates far higher than the refresh rate. Even 300fps versus 1000fps is quite noticeable in aiming precision at 144Hz, since you're turning 1/300sec of lag jitter into 1/1000sec of lag jitter.
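Putting that frametime arithmetic together with the earlier 8000 pixels/second flick example (illustrative numbers only):

```python
# VSYNC OFF lag-jitter error is roughly one frametime; at a fast flick of
# 8000 px/s, every millisecond of random variance is ~8 px of aim error.
FLICK_SPEED_PX_PER_MS = 8000 / 1000

for fps in (100, 144, 300, 500, 1000):
    jitter_ms = 1000.0 / fps
    print(f"{fps:>4} fps: ~{jitter_ms:5.2f} ms jitter -> up to ~{jitter_ms * FLICK_SPEED_PX_PER_MS:4.0f} px off target")
```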

TL;DR: "It's more complex than the questions you're asking". There are complex topics like lag differences/symmetry for top/center/bottom of screen and lag gradients of frame slices (between tear lines) and lag symmetry (between cable scanout versus panel scanout). All of them affect lag jitter, and can cause different lag jitter for different parts of screen. Lag is a VERY complex topic. Many of the "subtle" questions being asked are far beyond the scope of GSYNC101 series. Our coverage of various input lag angles will increase as time goes on. Keep tuned.

Re: How many frames of lag are there with the following

Posted: 05 Sep 2017, 22:01
by jorimt
This is beyond the scope of this thread, but, what's important is that this discussion may not consider the fact that Fast Sync may be benefitting from certain things that are beyond high-speed-video testing techniques.
Agreed.

And to clarify @drmcninja, I'm not saying there is nothing to what you're feeling, I'm saying that while differences between these syncing methods may reveal this behavior, the syncing methods themselves aren't necessarily the direct cause; more like a symptom that stems from multiple factors.

E.g. Breaking down the reasons for this issue is simply beyond the scope of an isolated "G-SYNC vs. Fast Sync" subject, and cannot be fully captured or quantified by my existing tests; other methods would have to be used.

Also, setting all that aside, while I'm not aware of the specs/settings of your mouse, I would mention that the increased consistency and smoothness of frame delivery with G-SYNC may expose microstuttery or laggy behavior in some mice with lower polling rates or sensor inaccuracies, whereas Fast Sync would mask some of those same issues with its own behaviors, especially at higher Hz, thereby potentially creating the illusion of more consistency or responsiveness with the latter:

Image

Read more here: http://www.blurbusters.com/faq/mouse-guide/

Not saying that's your issue, but it can be a factor as well.

Re: How many frames of lag are there with the following

Posted: 11 Oct 2017, 12:33
by [Unlisted]
I've read the "HOWTO: Low-Lag VSYNC" article and a couple of threads on the same topic, but I'm still left with some questions regarding fractional frame rate capping with RTSS.

1) The guide mentions using a difference of 0.01 as a starting point, while other topics mention values between 0.007 and 0.005. I understand that this depends on the engine and computer specs, but would a smaller difference like 0.001 reduce the frequency of stutters more than a greater difference like 0.01? Are there any pros and cons to using a higher or lower difference?

2) Are there any downsides to using this technique with games that are locked to 60 FPS (like Dark Souls 3 or Tekken 7), just to improve frame pacing? Would there be any conflict or negative performance impact, besides the slightly higher input delay? I know that in-game limiters reduce input delay more than RTSS, but most of the time this option isn't even revealed to the user and there is no way of knowing if the in-game VSync option is also limiting the frame rate.

Re: How many frames of lag are there with the following

Posted: 12 Oct 2017, 14:13
by RealNC
Eregoth wrote:The guide mentions using a difference of 0.01 as a starting point, while other topics mention values between 0.007 and 0.005. I understand that this depends on the engine and computer specs, but would a smaller difference like 0.001 reduce the frequency of stutters more than a greater difference like 0.01? Are there any pros and cons to using a higher or lower difference?
0.01 or 0.007 will give you the same results. It's not super exact; these are just examples. If you're using something between 0.006 and 0.015, you're fine. The frequency of stutters is so low, it doesn't make a difference whether you use 0.006 or 0.015.

0.001 (or anything below 0.005) is not recommended simply because RTSS might not be able to maintain that accuracy at all times, and when that happens, you get an additional buffered frame once in a while.

Using 0.01 will give you one stutter every 100 seconds (on average.) That's over 1 minute! 0.007 gives one stutter every 143 seconds on average. These frequencies are so low, they stop mattering. The duplicate frame you get every 100 seconds disappears in the noise of other skipped frames. The reason micro-stutter is annoying is because it has a frequency that you can detect. A skipped frame every second, or every two seconds, or even 5 seconds, sticks out. Once you get up to a minute, you can't even tell that there's a pattern anymore.

Even with games that appear to be running 100% microstutter-free 100% of the time, there *are* skipped frames now and then. But you simply can't tell because the frequency is so low that you can't detect the frame skips as a pattern anymore. Even if you have extremely well trained eyes for microstutter (I do), it's virtually impossible to consciously notice such rare frame skips while playing. You only notice them if they appear in a pattern/frequency that's not too far apart.

And even if you try to actively focus on an animation to try to spot every frame skip that occurs, it's very difficult to maintain that focus for 100 seconds at a time. So long story short: a microstutter frequency of 100 seconds (or 143 seconds) is for all intents and purposes not microstutter anymore and you don't notice that there's anything wrong.
Are there any downsides to using this technique with games that are locked to 60 FPS (like Dark Souls 3 or Tekken 7), just to improve frame pacing? Would there be any conflict or negative performance impact, besides the slightly higher input delay? I know that in-game limiters reduce input delay more than RTSS, but most of the time this option isn't even revealed to the user and there is no way of knowing if the in-game VSync option is also limiting the frame rate.
In-game limiters aren't accurate enough for this. They could be if they wanted, but I don't think any of them try to actually be that accurate. RTSS is super-accurate though. For the "low latency vsync" method, RTSS is the only choice. For a 60FPS locked game, the recommended approach is to make sure your monitor runs at slightly above 60Hz (60.008Hz, for example), and then use RTSS to cap to exactly 60. When the in-game limiter overshoots the cap, RTSS takes over and prevents the vsync buffer build-up.
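For reference, the stutter-interval arithmetic behind the numbers in this thread is simply one over the cap offset -- a back-of-envelope sketch with illustrative values:

```python
# Roughly one repeated frame every 1 / (refresh_hz - cap_fps) seconds.

def stutter_interval_s(refresh_hz, cap_fps):
    return 1.0 / (refresh_hz - cap_fps)

print(f"{stutter_interval_s(120.000, 119.990):.0f} s")  # 0.01 offset  -> ~100 s between repeats
print(f"{stutter_interval_s(120.000, 119.993):.0f} s")  # 0.007 offset -> ~143 s
print(f"{stutter_interval_s(60.008, 60.000):.0f} s")    # 60 FPS lock on a 60.008Hz monitor -> ~125 s
```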