Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

dendu
Posts: 26
Joined: 08 Aug 2020, 19:06

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by dendu » 14 Sep 2020, 22:19

TIGEREXPERT wrote:
14 Sep 2020, 11:42
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Multimedia\SystemProfile]
"NetworkThrottlingIndex"=dword:ffffffff
"SystemResponsiveness"=dword:00000000
From my testing, these two settings are more of a tunable, not just something to disable.

Timecard's findings:
https://github.com/djdallmann/GamingPCS ... /README.md

Setting NetworkThrottlingIndex to decimal 15, feels good for me.

SystemResponsiveness, set to decimal 13, I think this is also like Win32PrioritySeparation where changes are realtime, so you can alt-tab from game to regedit and try incrementing up and down to find optimal value for your setup.

Win32PrioritySeparation, decimal 22, makes mouse control feel better.
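For anyone who prefers to script these instead of editing them by hand in regedit, here is a minimal sketch using Python's winreg module. Assumptions: it is run from an elevated prompt on Windows, the values are simply dendu's suggested decimals above, and Win32PrioritySeparation lives under the separate PriorityControl key.

# Minimal sketch (assumption: elevated Python prompt on Windows).
# Applies the decimal values suggested above; re-test and adjust per system.
import winreg

TWEAKS = [
    (r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Multimedia\SystemProfile",
     [("NetworkThrottlingIndex", 15), ("SystemResponsiveness", 13)]),
    (r"SYSTEM\CurrentControlSet\Control\PriorityControl",
     [("Win32PrioritySeparation", 22)]),
]

for subkey, values in TWEAKS:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey, 0, winreg.KEY_SET_VALUE) as key:
        for name, value in values:
            # REG_DWORD, written as a decimal value
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)
            print(f"{subkey}\\{name} = {value}")

Since SystemResponsiveness and Win32PrioritySeparation take effect without a reboot, you can re-run the script with different values between alt-tabs while testing.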

TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 15 Sep 2020, 15:38

Chief Blur Buster wrote:
14 Sep 2020, 18:12

Battle(non)sense is the specialist in this stuff -- YouTube channel.
Thanks a lot for the reference. Just checked one of his videos "Get The Most Out Of Your G-Sync Compatible Monitor": https://www.youtube.com/watch?v=YR0vNs0ZdWI

At 4:50, Battle(non)sense suggests setting Vsync on in-game (vs. in NVCP), "as that might also enable additional game optimizations inside the game engine", as compared to the Blur Busters suggestion to put Vsync on in NVCP (and off in-game), since "some in-game V-SYNC solutions may introduce their own frame buffer or frame pacing behaviors, enable triple buffer V-SYNC automatically (not optimal for the native double buffer of G-SYNC), or simply not function at all, and, thus, NVCP V-SYNC is the safest bet".

Question:


Does this mean that, as gamers, we need to set Vsync on in NVCP or in-game on a case-by-case (per-game) basis, since some games' in-game Vsync will behave differently (triple buffering, specific optimizations)? Meaning we would have to run an input latency analysis for each option (Vsync in-game vs. in NVCP) for each game?

Personally, I found that VSYNC in NVCP felt less laggy than the in-game option in Fortnite. Maybe this is just placebo, but I was really reassured by the Blur Busters conclusion. What I am 100% sure about is that Vsync off feels more responsive in every situation (but the tearing is really hard on the eyes after long sessions of fast-motion gaming, so I turn it on + an in-game FPS cap).

Will the introduction of Nvidia Reflex technology in Fortnite (apparently in two days or next week) help reduce the incremental input latency of Vsync? Or are we going to have to wait until we can purchase those 360Hz monitors to be able to measure end-to-end latency with more precision (including mouse + keyboard plugged into the monitor's USB)?

I really think it would be beneficial to the gaming community (350 million players for Fortnite, probably between 20 and 40 million active PC players) to have access to a comprehensive overall performance analysis + tweaking tool (or does this already exist?) that would allow you to:

- Run an AS-IS performance analysis: analyze a 30-minute or 1-hour gaming session in real conditions, measuring end-to-end input latency through multiple lenses & metrics (including DPC interrupts, driver CPU affinity, voltage variation of the GPU or CPU threads, RAM, frame time, virtual memory, all the relevant metrics, mouse/keyboard polling rate & variability, etc.) -- see the rough frametime sketch after this list.

- Set target objectives: offering profiles (like pro gamer, 4K gamer, per game, with real benchmarks, etc.), setting objectives in terms of overall end-to-end latency and stability.

- Perform tweaks automatically or suggest manual tweaks at the different levels (it could suggest specific hardware or software changes based on benchmarks, but something precise enough to let anyone confidently say: "okay, I did everything possible locally to reach the target system latency objective; the system now requires this hardware or software upgrade, from a real/measured performance & stability perspective, not a marketing perspective").
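To illustrate the first bullet, here is a rough, hypothetical sketch of the kind of frametime metrics such a tool would report, assuming you already have a per-frame log in milliseconds from a capture tool; the CSV column name "MsBetweenPresents" is only an assumption, so adapt it to whatever your capture tool actually exports.

# Hypothetical sketch: basic FPS-stability metrics from a frametime log.
# Assumption: a CSV with one frametime (in milliseconds) per row in a column
# named "MsBetweenPresents" -- adjust to your capture tool's real export.
import csv, statistics, sys

def fps_stats(path, column="MsBetweenPresents"):
    with open(path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    frametimes.sort()                        # ascending: worst frames at the end
    n = len(frametimes)
    avg_fps = 1000.0 / statistics.mean(frametimes)
    worst_1pct = frametimes[int(n * 0.99):] or frametimes[-1:]
    low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)   # "1% low" style metric
    median = statistics.median(frametimes)
    stutters = sum(1 for ft in frametimes if ft > 2 * median)
    return avg_fps, low_1pct_fps, stutters

if __name__ == "__main__":
    avg, low1, stutters = fps_stats(sys.argv[1])
    print(f"avg {avg:.1f} fps | 1% low {low1:.1f} fps | {stutters} frames > 2x median")

Average FPS alone hides the drops; the 1% low and the count of frames above twice the median frametime are much closer to what actually feels like a stutter in-game.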

This would not only be beneficial for gamers; it could also help game developers (with all the system data collected, one could confidently tell each developer about the issues faced by the large majority of systems, or by specific systems, to help them fine-tune their engine or game versions), and it would help device manufacturers (for example by informing them about issues present in their driver versions: many Fortnite players believed for a long time that Nvidia driver v441.41 was the best for FPS stability and never updated it; personally, I tested v441.41 and it made my FPS stability worse). Such a tool would confirm whether that is true on a per-system basis, with benchmarks: is this stability issue to be addressed by Nvidia, the game developer, or the gamer? Who is responsible for what? Maybe it is the network provider, or the server provider? But at least we could reach a definitive conclusion about the gamer's own system in the equation. It would also benefit the monitor/mouse/keyboard manufacturers, etc.

I believe that if 40 million PC gamers (if not more) could know exactly why their game feels so laggy (even though they just got better hardware or newer software) and could tweak it in a few clicks via robust/reliable tools, the overall graphics & gaming ecosystem would be more efficient: gamers would know what to buy and what not to buy, and why, and they could test it themselves; manufacturers would gather more precise and comprehensive technical feedback about their product versions and defects, and improve their quality assurance processes. Just thinking out loud -- I assumed such a comprehensive, accessible tool already existed in 2020; still looking for it, though.

TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understading FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 15 Sep 2020, 15:53

Chief Blur Buster wrote:
14 Sep 2020, 18:12

It varies with the algorithms used. Most algorithms nowadays work predictively, so you can move and turn without being throttled by the network. But if you're old enough to have played synchronous networked games like NetDOOM (in 1994), framerates were more locked to network quality back in those days.

Then later games had certain movements that had no problems (turning left/right with no lag) but lagged with other movements (e.g. moving forward had an erratic framerate/lag).

Nowadays, network-to-framerate coupling effects are much rarer, but they can still exist with certain movements in certain games (e.g. a laggy connection may only affect specific weapon behaviors or a specific special movement type) -- it's a very game-dependent behavior -- sometimes there's absolutely no coupling.

Your real-world local position can actually go out-of-sync with what the other players see, but the game engine tries its best with predictive algorithms to keep things in sync without inducing local latencies and/or framerate affecting behaviors.

This is a common hitreg problem -- the visible target may be in different positions on all the different players' screens -- and the correct registration of a shot can go wonky when everybody's latencies are different, because the algorithm prioritized your ability to move laglessly locally, despite other players having lagged movements. This didn't happen when movements were more synchronous in very old networked games (e.g. NetDOOM), which forced a major framerate-to-network coupling effect. That became impractical for the latencies and jitter of TCP/IP, so now it's all asynchronous, with predictive algorithms trying to keep things in sync (semi-successfully).
TIGEREXPERT wrote:
14 Sep 2020, 17:24
This means that if the network is horrible, say Ping is at 1000, and bandwidth = 0kb/sec, and packet loss = 100%, the FPS can still remain stable at 237 FPS during the network lag. Am I missing something in this thought process ?
That's how most modern games do it nowadays -- fully decoupled/asynchronous behavior.

It does mean things go wonky, player positions being different on the different computer screens, with all the attendant hit-registration problems (you shot but it didn't score -- or you thought you missed but it actually hit/scored by chance)

Some games do a better job of compensating, and others do a worse job. Or sometimes it's a side effect of a playing-field-levelling algorithm (e.g. trying to keep things fair between all refresh rates, frame rates, etc). In some games, players with 10ms latency are automatically handicapped slightly compared to players with 30ms latency, if it's a server full of higher-lag players. These playing-field-levelling algorithms are frustratingly unpredictable, and vary hugely from game-to-game.

This happens less often if everybody has FTTH or everyone is on a LAN. For certain games, you can also always try to cherrypick servers containing players with latencies similar to yours, but that will generally not happen automatically.

Playing game server roulette (if that is an option) can help to a certain extent, depending on your geographical location -- basically choosing to connect to preferred game servers that create more balanced latency behaviors. For certain games, you generally prefer to play 30ms-vs-30ms-vs-30ms rather than a huge mix of latencies such as 10ms-vs-30ms-vs-100ms -- and on fast servers (so server-side performance jitter doesn't interfere with its own algorithms). The network play can feel much better and more predictable.
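As a toy illustration of the decoupled, predictive behavior described above (this is not any specific engine's netcode, just a sketch under that assumption): the local simulation keeps advancing every frame from local input, and server snapshots, whenever they do arrive, only nudge the predicted state back toward the authoritative one.

# Toy sketch of framerate-decoupled, predictive movement (not a real engine's netcode).
# The render/sim loop never waits on the network; snapshots only correct drift.
import time

class PredictedPlayer:
    def __init__(self):
        self.pos = 0.0
        self.vel = 5.0                      # units/second from local input

    def predict(self, dt):
        self.pos += self.vel * dt           # keeps moving even at 100% packet loss

    def reconcile(self, server_pos, blend=0.2):
        # Smoothly pull the predicted position toward the authoritative one.
        self.pos += (server_pos - self.pos) * blend

def poll_server():
    """Stand-in for the network layer: returns a snapshot or None if nothing arrived."""
    return None                             # pretend the connection is dead

player = PredictedPlayer()
last = time.perf_counter()
for _ in range(5):                          # a few frames of the loop
    now = time.perf_counter()
    player.predict(now - last)
    last = now
    snapshot = poll_server()
    if snapshot is not None:
        player.reconcile(snapshot)
    # render(player.pos) would go here, every frame, regardless of the network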
TIGEREXPERT wrote:
14 Sep 2020, 16:46
Image

Does this seem normal to you?
I've seen better. It is unrelated to your framerate slowdowns, but you might as well also improve that a little bit.
-- Testing different USB ports can also have major effects; try comparing the USB2 ports versus USB3 ports (even if the mouse is USB1). I sometimes notice 1ms latency differences between different mouse ports, so test them with a mouse utility to make sure you are plugged into the best possible port that is not shared with other USB traffic. Don't plug a high-bandwidth device into the port immediately above/below the mouse port, as they often share the same internal USB hub.
-- Testing a different brand mouse can help. You want a lot of headroom in your mouse sensor.

P.S. Unrelated, but I am personally a fan of 1600dpi+ (at half the in-game mouse sensitivity versus 800dpi) in the 240Hz refresh rate race, since mouse slowturns can feel smoother / less steppy -- the higher the refresh rate, the more important tweaking mouse DPI is. However, this is likely not your weak link, and older sensors don't do 1600dpi very well. I now even recommend 3200dpi in certain situations (if that rate is available non-interpolated -- as in the newest sensors). Your mouse sensor probably works best at 800dpi since it is a 2017 sensor, so it might be time to test out a different mouse sensor to see if it handles 1000Hz polling better.
1. Okay, this historical evolution and algorithm adaptation topic is very interesting. This is exactly what I experienced playing Fortnite: if the network goes down, the player continues moving and the FPS does not change; then when the network comes back, the overall situation changes and gets updated fast.

2. Okay, so I will get a new mouse and test it. I imagine that good polling variability at 1000Hz should be within about a 5Hz range, correct?

3. Oh, I never really understood why high-sensitivity gamers preferred to have 1600 DPI with 10% X&Y, vs. 800 DPI with 20% X&Y.
Since the metric to be considered is eDPI (DPI x in-game sensitivity), and both scenarios are equal (eDPI = 160), how does having a higher or lower DPI change anything, assuming the same eDPI (since the X&Y sensitivities will be different: 10% for 1600 DPI vs. 20% for 800 DPI)? I also thought that having a lower DPI means more precision when moving the mouse slightly.
If I switch to 1600 DPI x 10% (maintaining the same eDPI of 160), will that change anything in aiming performance?

I got the Logitech G203 because of its weight; it is very light and the clicks are solid. The mouse can go up to 3200 DPI. By the way, I tested 3200 DPI with MouseTester and got the same frequency-variability graph as for 800 DPI (around 200Hz of variation). So I will change the mouse and evaluate the difference. Thanks for the tip.
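Regarding point 2, here is a small hypothetical helper for putting a number on polling variability, assuming you can export raw event timestamps (one per line, in milliseconds) from a utility such as MouseTester; the export format is an assumption, so adapt the parsing to whatever your tool actually writes out.

# Hypothetical helper: estimate polling-rate consistency from exported event
# timestamps (one timestamp in milliseconds per line). The export format is an
# assumption -- adapt the parsing to your mouse utility's real output.
import statistics, sys

def poll_stats(path):
    with open(path) as f:
        t = [float(line) for line in f if line.strip()]
    intervals = [b - a for a, b in zip(t, t[1:]) if b > a]   # ms between reports
    rates = [1000.0 / i for i in intervals]                  # instantaneous Hz
    print(f"mean {statistics.mean(rates):.0f} Hz, "
          f"stdev {statistics.pstdev(rates):.1f} Hz, "
          f"worst interval {max(intervals):.2f} ms")

if __name__ == "__main__":
    poll_stats(sys.argv[1])

A tight spread of instantaneous rates around 1000Hz (and no long worst-case intervals) is what you are looking for when comparing ports or mice.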

TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 15 Sep 2020, 16:02

Chief Blur Buster wrote:
14 Sep 2020, 12:49
You may need to break down your post into multiple smaller posts to get some quicker answers --

I see you tried to push full power and 100% CPU, so you've already tried that.

Yes, mouse poll Hz can have some really weird effects sometimes -- but mainly with the framerate feel and lagfeel (in more of a "Milliseconds Matter" manner, where delayed polls are coalesced with the next poll, reducing mouse accuracy) -- but not necessarily much effect on absolute framerate. It can, however, have a motion-fluidity-degrading effect and a lagfeel-affecting effect much like poor framepacing (poor poll pacing -> poor frame-position pacing, since gametime positions are often sync'd to mouse polls).

Also, do you have metallic thermal paste on your CPU? If you are seeing major changes in CPU clockspeed in realtime, you ideally want to minimize the wild CPU clockspeed gyrations as much as possible, while keeping clocks consistently high, and better paste with a watercooler block can help with that. This might not be your problem, but a utility to monitor your CPU clockspeed gyrations may be useful to diagnose the problem.

A CPU can rapidly change clockspeeds of its own volition, whether for power management (low utilization), for protection (brief thermal throttling), or to boost a single core to higher speeds when other cores are idle (high-speed single thread) -- some of it not affected by the granular power management settings accessible in Windows. The problem is that the transitions between these states, when they happen sufficiently many times per second, can create lagfeel changes in rare conditions -- especially when they cause stalls/problems (faults).
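If you want to see those gyrations, a quick logging sketch is below. It uses the third-party psutil package (pip install psutil), and note that on some Windows systems psutil only reports a single package-wide frequency rather than true per-core values.

# Quick clockspeed logger (requires the third-party "psutil" package).
# Run it in the background during a match to see how much the clocks gyrate.
import time
import psutil

try:
    while True:
        freqs = psutil.cpu_freq(percpu=True) or [psutil.cpu_freq()]
        line = " ".join(f"{f.current:5.0f}MHz" for f in freqs if f)
        print(time.strftime("%H:%M:%S"), line)
        time.sleep(1.0)
except KeyboardInterrupt:
    pass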

Some (very) left field suggestions, that might or might not work:

** Test a very slight underclock! CPU upclocking/downclocking can affect latency, so try a test-underclock of all your equipment to force stability -- clockspeed changes can actually happen many times per second, creating diffuse latency effects, happening too fast to be noticed on a realtime CPU clockspeed counter. Lock the CPU clock, preventing it from going too low or too high, and see what happens to latency. A stable 3GHz all-cores clock might sometimes be more latency-stable than a wildly-gyrating 1GHz-5GHz swing. Overclocking can amplify these gyrations, as CPUs can throttle portions of their silicon to keep within thermal envelopes, so while you see better performance in some apps, you might see worse performance in other apps when near the limits. The Intel Extreme Tuning Utility can help you lock an all-cores underclock. Disable all the features that cause clockspeed changes, including Turbo Boost. Turbo Boost is useful, but it doesn't stay on consistently during high-heat situations, and your goal is consistency. Think about it this way -- even if your best game moments become 1% worse, your worst game moments will be more than 10% better (choose any numbers, but you get the idea) -- a worthwhile tradeoff for a slightly lower forced all-cores clock.

** Possibly, try temporarily running a CPU-hog application in the background (but keep only one or two cores busy), see how it affects variability. Some CPU stress test apps may allow you to configure them to rev just one core. This may force stable clocks on the rest of the CPU for the game. This may be done simultaneously with an intentional underclock.

There are many other unrelated tests that may work better than this, but sometimes this is a step-by-step way to rule out possible causes. There are totally different troubleshooting paths.
I just tested underclocking the CPU and deactivating all variability features via QuickCPU, since the Intel Extreme Tuning Utility did not allow me to turn off Turbo Boost (I tried everything and could not figure out why; I have the latest XTU version).

So I disabled:
- Turbo Boost
- Enhanced SpeedStep
- Hardware duty cycling
- C1E
- Bi-Directional PROCHOT
- Set FIVR Control: Voltage mode: static

I also set the speed shift (on all cores) minimum & maximum at 3.6Ghz (which is the core clock).

Tested two games in Fortnite (uncapped vs. capped) and monitored performance via MSI RivaTuner. Although the per-core CPU usage went up (30% instead of 20%), the max FPS was the same, and the average FPS + FPS drops were exactly the same (stability-wise) as at 4.7GHz with all features enabled.

Maybe I did something wrong, but my feeling (not 100% sure) at this stage is that the issue is not related to the CPU. I will disable only C1E, as apparently it impacts DPC latency a lot.

TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 15 Sep 2020, 16:06

jorimt wrote:
14 Sep 2020, 21:17
TIGEREXPERT wrote:
14 Sep 2020, 17:24
However, the curious thing is that this FPS drops issue is very recurrent and nearly systematic, and it always happens the same way, during the same game moments, at nearly the same magnitude. And these FPS drops disappear completely most of the time, in most game moments.
Not saying it's ultimately the case, but the fact that this is repeatable in the same places, at the same times, regardless of settings (or the system in question) points even further still to disk access for asset retrieval in this particular game, possibly both on the server (their system) and local (your system) side.

Could also be something else entirely. It's hard to say when focused at such a granular-level as this.
Yes, I see exactly what you mean. This is why I am analysing whether a tool, a set of tools, or pieces of advice can help determine if it is necessary to upgrade or tweak hardware or software locally to enable stable performance in Fortnite, or whether to just accept the fact (an external, non-controllable problem) and wait or hope for future enhancements (such as waiting for Fortnite to improve their game engine, as some content creators suggest, or for a new Nvidia driver... with no guarantee at all).

TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 15 Sep 2020, 16:34

dendu wrote:
14 Sep 2020, 22:19
TIGEREXPERT wrote:
14 Sep 2020, 11:42
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Multimedia\SystemProfile]
"NetworkThrottlingIndex"=dword:ffffffff
"SystemResponsiveness"=dword:00000000
From my testing, these two settings are more of a tunable, not just something to disable.

Timecard's findings:
https://github.com/djdallmann/GamingPCS ... /README.md

Setting NetworkThrottlingIndex to decimal 15, feels good for me.

SystemResponsiveness, set to decimal 13, I think this is also like Win32PrioritySeparation where changes are realtime, so you can alt-tab from game to regedit and try incrementing up and down to find optimal value for your setup.

Win32PrioritySeparation, decimal 22, makes mouse control feel better.
Thank you for your advice. I will try these tweaks and see how they impact latency.

I noticed that NDIS.SYS had the highest execution time of 0.242277ms and 6397 DPC counts (5-minute analysis time in LatencyMon), with TCPIP.SYS at 0.23ms. When changing the network throttling & system responsiveness values, it did not vary that much. I read somewhere that the DPC latency to aim for is below 0.10ms -- is that correct? Shall I tweak these registry lines with the objective of reaching a max DPC of <0.10ms? What is a good latency to aim for with NDIS.SYS, TCPIP.SYS, and even dxgkrnl.sys, ntoskrnl.exe, nvlddmkm.sys? Besides performing the right registry tweaks and CPU affinity interrupt assignments, are there other system areas / tools that can influence the latency of these drivers?

Thanks

dendu
Posts: 26
Joined: 08 Aug 2020, 19:06

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by dendu » 15 Sep 2020, 20:38

TIGEREXPERT wrote:
14 Sep 2020, 11:42
LatencyMon (for Driver & Processes DPC latency monitoring), all is green and seems to work perfectly.

Besides performing the right registry tweaks and CPU affinity interrupt assignments, are there other system areas / tools that can influence the latency of these drivers?
I wouldn't spend that much time trying to lower DPC latencies; as long as no misbehaving driver is causing major spikes, it's fine.
TIGEREXPERT wrote:
14 Sep 2020, 11:42
- What is the best tool or set of tools to perform a precise root-cause analysis for FPS drops? I want to be able to view the precise moment when the FPS dropped, and pinpoint the exact factor that caused the performance decrease: power voltage? CPU? Windows or a device driver? A specific process? This problem is recurring, meaning the FPS always drops the same way in the same situations (for example crowded areas).
I've never tried this app, but maybe it can offer something beyond the tools you mentioned.
https://github.com/CXWorld/CapFrameX

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by Chief Blur Buster » 16 Sep 2020, 12:20

TIGEREXPERT wrote:
15 Sep 2020, 15:53
3. Oh, I never really understood why high-sensitivity gamers preferred to have 1600 DPI with 10% X&Y, vs. 800 DPI with 20% X&Y.
Since the metric to be considered is eDPI (DPI x in-game sensitivity), and both scenarios are equal (eDPI = 160), how does having a higher or lower DPI change anything, assuming the same eDPI (since the X&Y sensitivities will be different: 10% for 1600 DPI vs. 20% for 800 DPI)? I also thought that having a lower DPI means more precision when moving the mouse slightly.
If I switch to 1600 DPI x 10% (maintaining the same eDPI of 160), will that change anything in aiming performance?
Wrong metric.

This is not the important metric when it comes to the reasons for high DPI. While what you say is important, there's an additional factor: jitteriness during mouse slowturns (not mouse fastturns).

When you have really low mouse DPI and really high in-game sensitivity (temporarily, test 400dpi + really high in-game sensitivity), a small mouse movement (1 count) can translate into a big pixel jump, like a 10 pixel on-screen movement.

So your mouse turns are steppy-steppy. If you move your mouse only 1/8th of an inch over the period of 1 second (common use case: a tactical slowscan through a gun scope, etc.), there are only 50 mouse positions (1/8th of 400dpi), and your mouse slowturns will only run at 50 positions per second because there are no intermediate positions. So you sabotage the frame rate of your refresh rate race... Scan slower (1/16th inch), 25 positions per second; scan even slower, steppy-steppy like the sharp corners of 8-bit pixels, ouchie.
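To make that arithmetic concrete, a back-of-envelope sketch; the 1/8-inch-per-second slowscan and the eDPI of 160 are just the example numbers from this thread.

# Back-of-envelope math for the slowturn example above: same effective
# sensitivity (scaled down as DPI goes up), different update granularity.
SLOWSCAN_INCHES_PER_SEC = 1.0 / 8.0      # the tactical slow-scope scan example
TARGET_EDPI = 160                        # e.g. 1600 DPI x 10% in-game sensitivity

for dpi in (400, 800, 1600, 3200):
    positions_per_sec = dpi * SLOWSCAN_INCHES_PER_SEC   # discrete sensor counts/sec
    sensitivity = TARGET_EDPI / dpi                     # keeps flick speed identical
    print(f"{dpi:>4} dpi: {positions_per_sec:5.0f} position updates/sec "
          f"at in-game sensitivity {sensitivity:.3f}")

# 400 dpi -> 50 updates/sec (steppy on a 240Hz display);
# 1600 dpi -> 200 updates/sec; 3200 dpi -> 400 updates/sec, for the same aim speed.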

Different games will convert 1dpi to different degrees of on-screen movement, depending on sensitivity settings, and not necessarily less than 1 pixel. But you do really want to oversample it, because 3D games render things in subpixels. Even a 0.5 pixel movement is noticeable, if you're using a sufficiently low rez display (1080p or less).

Result: Your mouse slowturns become steppy-steppy.
(Test it now: Configure to 400dpi, then slide in-game sensitivity really high, then mouseturn really slow at millimeters per second. Notice the steppy-steppy-steppy effect?)

Increasing refresh rates + increasing screen resolutions mean DPI needs to go up to keep up with the refresh rate race, especially because:
- Increasing refresh rate means the positions-per-second can more often fall below the frames-per-second/refreshes-per-second.
- Increasing screen resolutions means the same physical inch can have more onscreen pixels, making dpi limitations more visible.
As the Vicious Cycle Effect continues, this needs to push up mouse poll Hz AND mouse DPI simultaneously, to keep up.

If you want your mouse slowturns to be silky ultrasmooth on a retina display at high refresh rates, in this refresh rate race -- then, mathematically, it is logical that you definitely need to shovel on the DPI -- a good mouse will feel the same at 1600dpi + quarter sensitivity as at 400dpi. Good mice will be able to do fast flickturns that feel the same quality and same speed, as long as you adjust sensitivity-versus-DPI. But your slowturns will skyrocket in quality.

This is less important if you use low in-game sensitivity with low DPI and prefer to swipe your mouse many inches to move onscreen more slowly, but the quality degradation of mouse slowturns is getting above human visibility noisefloors as the refresh rate race continues. The benefit of 144Hz versus 360Hz becomes less visible when the computer mouse is the weak link... 400dpi was fine when we were playing 640x480 60Hz GLQuake, but it feels much more jittery at 2560x1440 240Hz, with resolution and refresh being much higher...

Depending on the game (not all of them), when you view scenery through a gun scope it's sometimes as if you jacked the sensitivity slider higher -- revealing the DPI weak links when you're scanning scenery slowly through the scope. That assumes mouse sensitivity and scope sensitivity are adjusted as such -- most games use a lower scope sensitivity, though, and sometimes they are independently adjustable (a much lower scope sensitivity has been advantageous). However, regardless of DPI, various situations can pop up in games that suddenly need more DPI for a moment: activities that force slow mouse movements for a specific objective, where higher DPI makes them more precise.

The best DPI is a DPI that translates to a tiny subpixel movement (to overcome all Nyquist/aliasing factors where possible) even for your slowest mouse slowturn, with enough positions per second to keep things jitter-free (even 0.5 pixel jitter can be visible, thanks to the way 3D graphics are rendered). With no effect/degradations of your other (faster) mouse movements (due to dpi-versus-sensitivity balance). Ideally, you want multiple mouse dpi per onscreen pixel even for slowturns, though there are diminishing returns when pixels become tinier.

The problem is that higher DPI is often interpolated on many sensors. So you want a really good recent sensor that can do 1600dpi (or sometimes 3200dpi) accurately, non-interpolated, since the DPI-interpolation behavior can add a tiny bit of latency and mousefeel issues that some esports players can feel. It's one old-fashioned reason why many said to use 400dpi and 800dpi, but unbeknownst to many, a lot of mice now do 1600dpi, and sometimes 3200dpi, just as accurately. Just don't crank it to max (8000dpi or 12000dpi or whatever), since that's often interpolated territory. What you want is a DPI that doesn't degrade during fast flick turns. You don't want to feel the difference during normal fast flick turns.

It does mean your mouse pointer is really superfast when you exit the game, but you can use DPI buttons or mouse-profiles (steam.exe or game exe detection) to switch DPI when you launch/exit games.

This is off-topic from the original problem about FPS stability and has nothing to do with your FPS stability, but wanted to explain another reason why high DPI is useful.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


TIGEREXPERT
Posts: 10
Joined: 29 Aug 2020, 22:11

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by TIGEREXPERT » 16 Sep 2020, 19:54

Chief Blur Buster wrote:
16 Sep 2020, 12:20
Wrong metric.

This is not the important metric when it comes to the reasons for high DPI. While what you say is important, there's an additional factor: jitteriness during mouse slowturns (not mouse fastturns). [...]
WOW! Thank you very much. You just helped me solve an issue I thought was inherent to the game itself: when moving the mouse slowly, the image was always vibrating. I actually thought it was coming either from the game or from the monitor.

I just tested 400 DPI with 80% sensitivity in Fortnite, and the image vibration/jitter effect increased drastically. It's exactly what I experienced with 800 DPI x 20%, but at a much higher magnitude. I also tested 3200 DPI with 5% (and 1600 DPI with 10%), and it really feels very smooth -- exactly the way I was expecting the game to be in the first place. I never thought it was related to the DPI (no more vibrating).

It is very clear, thanks a lot! I am going to switch to 1600 DPI. Then I will get a newer mouse (with a better sensor, non-interpolated DPI, and a stable polling rate).

1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: Question Please help with understanding FPS stability issue in Competitive Gaming - Fortnite on High End PC

Post by 1000WATT » 17 Sep 2020, 04:18

I often do not clearly state my thoughts. google translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.
