2000Hz vs 4000Hz [Mouse]


Re: 2000Hz vs 4000Hz [Mouse]

Post by Chief Blur Buster » 23 Dec 2023, 04:18

The DPI Consideration: Sabotaging Slow Track Frame Rates With Low DPI & Competing With Tomorrow's Competitive Players In Newer Games

About poll rate: 2000 Hz works with more games than 4000 Hz. The 1000-vs-2000 difference is more noticeable than 2000-vs-8000 (though I can still see improvements in mouse cursor behavior at 8000).

One big reason to stay at 400 and 800 is legacy game engines and legacy mouse mathematics, like the older CS:GO engine, which misbehave at higher DPI with many sensitivity settings.

Modern mouse sensors do a great job at 1600dpi in newer engines (e.g. Valorant, Overwatch). Running 1600dpi at one-quarter of your 400dpi in-game sensitivity, while adjusting Windows mouse sensitivity (bad for old games, harmless for new games) to keep pointer speed the same as before, lets you slowtrack as a sniper much more accurately while keeping your fast flicks unaffected. If you pay attention to the adjustments at places like https://mouse-sensitivity.com/ and carefully configure settings, it's possible to get great 1600dpi behavior even in the older Source 1 engines (and older), but I wouldn't chance it; 800 seemed the comfortable maximum with CS:GO when I was playing it, even while Fortnite felt perfectly fine at 1600dpi+. Newer engines are far more universally accurate at higher DPI, at whatever odd sensitivity settings you use.

If you slow track (sniper, slow pans, target tracking, aiming at airplanes, etc.) in NEWER engines, you should know that you can do better nowadays. At 400dpi, a 0.25 inch/second slowtrack generates only 100 counts per second -- only 100 position updates per second. You're sabotaging slowtrack frame rate with 400dpi, but whatever floats your boat -- for proper championship money in older game engines. As you hit 360Hz, the frame-rate sabotage of 400dpi starts to become noticeable. The exception is the super-low-game-sensitivity user (e.g. moving your whole arm over a whole foot to flick turn), who will rarely be moving as slowly as 0.25 inch per second while camping as a sniper slowtracking a target; then 400 can be fine. Regardless, 400 increasingly sabotages mouseturn/mousepan frame rate below refresh rate, and the main purposes of 400 are simply keeping latency low, muscle-memory familiarity, and compatibility with (less and less often used) older game engines with less precise mouse mathematics.
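The arithmetic, as a minimal sketch (the hand speed and DPI values here are just illustrative assumptions):

```python
# Hedged sketch: how many position updates per second a slowtrack can generate.
def slowtrack_update_rate(hand_speed_ips: float, dpi: int, poll_hz: int) -> float:
    """Counts generated per second, capped by the mouse polling rate."""
    return min(hand_speed_ips * dpi, poll_hz)

for dpi in (400, 800, 1600, 3200):
    rate = slowtrack_update_rate(0.25, dpi, 2000)   # a 0.25 inch/sec slowtrack
    print(f"{dpi:>5} dpi -> {rate:.0f} position updates/sec")

# 400 dpi yields only 100 updates/sec -- far below a 240-360 Hz refresh rate --
# while 1600 dpi yields 400/sec and 3200 dpi yields 800/sec for the same motion.
```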

If you watch the professional esports teams, they never went above 800 in CS:GO. But look at other games, and some teams were using 2000+ in games like Fortnite!

Mousepad quality is also a biggie. Some people are trying out those new nano-textured glass mousepads to improve high-DPI performance (3200dpi). Not everyone likes them, but it's quite an interesting option for high-DPI users and wrist-flickers. But make sure you're Windexing it now and then! And definitely keep your mouse feet clean.

When I talked to a Razer employee after I got the 8KHz sample, he confirmed they had also observed the low-precision mouse mathematics in some older engines.

Also, for me, 2KHz is usually the sweet spot. 4000 still starts to bog the CPU, and some engines hate even 1000. When it first got released, the super buggy Cyberpunk 2077 originally ran best at 500 Hz for me (mouse microstutters at 1000 Hz!!), but the engine was fixed and it now performs okay at default settings, at least on my machine. It also doesn't benefit as much as, say, a competitive FPS shooter. Use profile switching to blacklist high poll rates in engines that go wonky with them. Stick to 1000 for older engines, but go nuts with 2000-4000 for newer engines that have an efficient mouse loop. Just know that the computer is the limiting factor. When I got my first 1000Hz mouse in 2006 (more than 15 years ago), many games couldn't even keep up, and the same problem is happening now with 8000 Hz. It's beautiful when it works (very noticeable when spinning the mouse pointer in circles on the Windows desktop), but in a game, the frequency can bog down the mouse loop badly.

This was the old advice (which still partially applies today)

[Image: the old mouse settings advice]

While you need to turn OFF Enhance Pointer Precision, it's now safe to deviate from the middle pointer-speed setting *IF* you have stopped playing legacy games. This can help slow down your cursor in menus during 1600dpi+ operation, WHILE keeping your flick turns feeling exactly the same at 400 vs 1600dpi (at compensated in-game sensitivity settings). At least in newer engines.

It used to be the case that you HAD to avoid adjusting the Windows mouse sensitivity setting away from the dead-center notch. Say you play World of Warcraft: it's got problems when you deviate from the middle!! But modern games ignore it completely EXCEPT for pointering around (inventories in RTS games, game-join menus in FPS, flipping through menu screens in any game, etc.), which makes it perfect in a bunch of newer games. If you've stopped playing legacy games, it's now safe to adjust that setting to slow down your mouse pointer during 1600-3200dpi operation. The main problem is that you have to familiarize yourself with the exact notch that divides your mouse pointer speed by 2 or 4 when you double or quadruple your DPI. Otherwise, your mouse pointer will stay wonky.

Just remember to turn off "Enhance Pointer Precision"! It's very, very BAD -- the pointer moves at different speeds at 1000Hz vs 2000Hz vs 4000Hz, even at the same DPI!
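Why that checkbox interacts with poll rate at all (a minimal sketch of the commonly described mechanism; the acceleration curve below is invented purely for illustration and is not the real Windows curve):

```python
# Hedged sketch: a per-packet acceleration curve applied to raw counts.
# Higher poll rates split the same hand motion into smaller per-packet deltas,
# which land on a different part of the curve, so the total cursor distance
# changes with poll rate -- even at identical DPI.

def accel_gain(counts_in_packet: float) -> float:
    # Made-up curve purely for illustration; NOT the actual Windows EPP curve.
    return 1.0 + 0.05 * counts_in_packet

def cursor_distance(total_counts: int, poll_hz: int, move_time_s: float) -> float:
    packets = int(poll_hz * move_time_s)
    per_packet = total_counts / packets
    return sum(per_packet * accel_gain(per_packet) for _ in range(packets))

# The same physical swipe (4000 counts over 0.5 s) at different poll rates:
for hz in (1000, 2000, 4000):
    print(hz, "Hz ->", round(cursor_distance(4000, hz, 0.5)), "cursor units")

# 1000 Hz -> 5600, 2000 Hz -> 4800, 4000 Hz -> 4400 with this toy curve --
# exactly the poll-rate-dependent pointer speed described above.
```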

Once you've properly set up 1600-3200 with compensated Pointer Speed and compensated In-Game Sensitivity...
...you've got a "normal" slow mouse pointer (just like the 400dpi days)
...and your fast flick turns feel unchanged (just like the 400dpi days), as long as you have a good sensor, a clean high-resolution mousepad, and clean mouse feet.

But you gain a hidden new superpower from high-DPI operation: your slowtracks become much smoother and higher frame rate (no longer sabotaged by low DPI). The grainy slowtracks stop happening. However, YMMV depending on the game. Some love the graininess because it's like "notched" movement, so if you need the notch-feel effect of a low-DPI slowtrack, then keep using low DPI. However, not all gamers are trained on that, and the improved frame rate of slowtracks wins out instead.

And with some of the esports leagues moving to 1440p recently (and possibly 4K longer term), more pixels means the DPI limitations stick out like a sore thumb during slowtracks.

Syncing muscle memory between games is the HARD part -- when you've got a mix of old engines and new engines -- so sites like mouse-sensitivity.com help with that sort of thing. Sometimes it's much easier to give up the obvious new-engine benefit of 1600+ and just stick to 400/800. But a lot of the new crowd now only plays Fortnite or Valorant or a bunch of other newer FPS games -- and so it's easier to start again with a 1600dpi+ 2000Hz clean sheet and reliably sync muscle memories better.

Currently, I'm a 3200dpi 2000Hz casual gaming user (wannabe 4000, if game engines actually kept up) simply because it's a fairly universal setting of convenience that I can easily sync with most newer games (I don't play CS:GO anymore, and haven't yet tried CS2), and I'm a wrist-flicker, not an arm-sweeper. But I'm encountering more and more people who've settled on >800dpi sweet spots even for arm-sweeping.

Unfortunately SOME older games still feel better with 800. It just makes me stop playing them, ha. I really enjoy my >240 frames per second slowtracks (<1 inch/sec). No dpi-derived framerate sabotage for me, buddy. I want to _enjoy_ my Hertz _visually_ and 800 is garbage on 480 Hz OLEDs!

The settings-configuring skillz of figuring out how to properly use 1600-3200 DPI are really hurt by legacy engines, so I understand if you need to focus your esports championship earnings on what works best for you. I only play casually for fun. But as games evolve, fewer players play legacy engines (including the forced CS:GO -> CS2 jump, which is MUCH more 1600dpi friendly), and 1000Hz displays arrive -- be prepared to compete against gamers who have gained new superpowers like this one. We 400dpi dinosaurs can fail against the new kids who have learned to properly use 1600dpi in new engines that handle 1600dpi flawlessly. Just let that future sink ...er, sync?... in.


Re: 2000Hz vs 4000Hz [Mouse]

Post by Chief Blur Buster » 23 Dec 2023, 04:55

chandler wrote:
18 Dec 2023, 15:53
even though I saw many times that Chief and other people recommend 1600 - I feel like its unusable in windows
Semi outdated: Don't forget you can still have a super-slow mouse pointer at 1600dpi, with caveats*

*As long as you don't use any old game engine that goes wonky with the Windows pointer sensitivity setting, and as long as you TURN OFF Enhance Pointer Precision. That infernal muscle-memory-wrecker checkbox can burn in the same dumpster fire that the year 2020 burnt in.

I used to recommend the dead-center Windows sensitivity setting, but that's advice only applicable to game engines I no longer play.

Goldilocks 1600dpi or 3200dpi
The new mantra for ultra-accurate mouse in the NEWEST games is generally (worked example after this list):
0. Important Assumption: You've ceased your last old-engine game that has erratic tracking at 1600dpi+ or wonky behavior when you touch Windows mouse sensitivity. You may want to hold off on starting your high-DPI utopia until you've migrated only to newer engines. That one game you depend on wrecks the whole plan.
1. Windows: Turn off the "Enhance Pointer Precision" dumpster-fire setting.
2. Windows: Compensate "Windows Pointer Speed" downwards, symmetrically with the DPI increase in the next step.
3. Mouse: Juice the mouse DPI upwards exactly proportionally to your decrease in "Windows Pointer Speed" above.
4. Mouse: Juice the mouse Hz only as much as the CPU/game permits without compromises; varies by game (1000, 2000, 4000).
5. Mouse: Upgraded mouse sensor that does 1600dpi as accurately as 400 or 800.
6. Mouse: Upgraded mousepad that handles the high DPI well.
7. Game: Lower in-game sensitivity perfectly symmetrically to your DPI increase, to maintain your eDPI behavior.
*IMPORTANT: Sometimes you have to manually edit the config file for extra digits of precision (use mouse-sensitivity.com), since going 400 -> 3200 may require in-game sensitivity numbers such as "0.125" in a text .cfg file when the menu only allows "0.1" or "0.2".
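A worked example of the compensation arithmetic (a minimal sketch; the 400dpi / 1.0-sensitivity baseline is an illustrative assumption, and the exact Windows notch-to-multiplier mapping still has to be confirmed for your own setup):

```python
# Hedged sketch: keep eDPI (DPI x in-game sensitivity) constant while raising DPI,
# and scale Windows pointer speed down by the same factor.
baseline_dpi, baseline_sens = 400, 1.0            # illustrative baseline

for new_dpi in (800, 1600, 3200):
    scale = new_dpi / baseline_dpi
    new_sens = baseline_sens / scale              # compensated in-game sensitivity
    pointer_factor = 1.0 / scale                  # desired Windows pointer-speed scaling
    print(f"{new_dpi} dpi: sens {new_sens:.3f} (eDPI stays {new_dpi * new_sens:.0f}), "
          f"pointer speed x{pointer_factor:.3f}")

# 3200 dpi needs in-game sensitivity 0.125 -- exactly the kind of extra-digit value
# that often only fits in a .cfg file rather than the in-game slider.
```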

Goal of Ideal 1600dpi or 3200dpi:
- Slow Windows mouse pointer -- just like 400 or 800 dpi
- Slow Game Menu mouse pointer -- just like 400 or 800 dpi
- Fast Flick Turns feel the same as 400 or 800 dpi
- But you gain new magic superpower of ultra smooth slowtracks that don't sabotage frame rate (e.g. firing a machine gun at an airplane flying past, or tracking moving object with sniper scope, or slowly panning an RTS landscape, or other slowtracks).

It's VERY important that the game you play honors the "Pointer Speed" setting only for the mouse cursor, WITHOUT any mousefeel changes during flick turns / mouselooks. Older games didn't always behave this way. It's easy to red-herring the blame onto the mouse (bad at 1600dpi), the operating system (bad at 1600dpi), or the game (bad at 1600dpi).

They were ALL bad. That gave lots of yesteryear forum debates a good field day and lots of alibis to end flame wars -- because it was plausible deniability -- it takes only ONE weak link. But now you can make those weak links disappear, as long as you're playing only the newer games. Finally, high-DPI utopia.

Sure, maybe you aren't a camping sniper, and your FPS games don't have airplanes overhead. Or heck, maybe you muscle-memoried on the notch-feel of grainy, low-framerate slowtracks (whatever floats your boat, I respect ya, dinosaur) -- the graininess feels like the notches of a slide rule, letting you slowtrack a specific distance. But hey, watch what your competitor has trained on, m'kay? Slowtracks are extremely important in some of the games I play, and so the best-kept-secret superpower is important to me.

You're fighting against tomorrow's kids trained with the superpower. This is increasingly important when BOTH (high Hz + high rez) are occurring, e.g. 1440p 480Hz+. Remember that the mouseturn/mousepan frame-rate sabotage factor of low DPI is real. Higher resolution means more pixels over an inch of display for the low DPI to sabotage the frame rate across that span... As we move 1080p (common) -> 1440p (new standard) -> 4K (future) on similar-size esports displays, we are doubling the PPI without doubling DPI.
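For a feel of the numbers (a minimal sketch; the panel sizes are typical examples, not a survey):

```python
# Hedged sketch: pixels per inch (PPI) of common esports panel sizes.
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1080, 24.5)))   # ~90 PPI  (typical 1080p esports panel)
print(round(ppi(2560, 1440, 27.0)))   # ~109 PPI (1440p)
print(round(ppi(3840, 2160, 27.0)))   # ~163 PPI (4K at a similar size)

# PPI nearly doubles going 1080p -> 4K at similar panel sizes, while most players'
# mouse DPI stays put -- so each mouse count steps across more on-screen pixels.
```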

And as if that wasn't enough... "One More Thing"... refresh rates are skyrocketing on top of all this, with GtG times dropping like a rock (E-TN, Fast IPS, and now OLED). So the low-framerate feel of low-DPI slowtracks starts to appear more and more often. 400dpi now makes me barf to the point that I'm just not going to play the games that sabotage my slowtrack frame rate. Sorry, game company.

*IMPORTANT: This high-DPI utopia is not always possible if you're playing ANY old game engine at specific muscle-memory settings whose math doesn't work out well. You're lucky if you figured out how to make older engines like CS:GO behave well at 1600dpi, with nice round numbers in sensitivity settings all over the place. But if you're using odd sensitivity settings with a few digits, the low-precision mouse mathematics in old engines will still make it go wonky. There have been a lot of debates, but it boils down to "some sensitivity settings work okay at 1600dpi, some sensitivity settings make 1600dpi go weird" for older engines. For newer engines it has become closer to "all sensitivity settings, even the odd sensitivity divisors, work perfectly at 1600dpi, yay." Meaning 1600dpi becomes much more reliable. Even 3200dpi feels noticeably better in some games now!

As a software developer, I have noticed it's often an integer-vs-float-vs-double issue. The newest engines often use "double" (64-bit IEEE 754 floating point, standard in C/C++/C#/Rust on x86/x64) for mouse mathematics now, even when decimating to floats/integers at the final step. That's still much better for the interim math steps.
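A quick illustration of why the intermediate precision matters (a minimal sketch; the numbers are made up, not any engine's actual mouse math):

```python
# Hedged illustration: float32 silently absorbs a tiny high-DPI increment once an
# accumulated value is large, while float64 ("double") still resolves it.
import numpy as np

accumulated = 100_000.0   # a large running total (e.g. accumulated rotation units)
micro_step = 0.0005       # one high-DPI micro-movement's worth of change

print(np.float32(accumulated) + np.float32(micro_step) - np.float32(accumulated))  # 0.0
print(np.float64(accumulated) + np.float64(micro_step) - np.float64(accumulated))  # ~0.0005
```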


I long for software that automates it all (autoconfigures Windows, autoconfigures the mouse, and autoconfigures the game). You're trying to sync all three to make 1600dpi+ work better, not worse, than 800dpi, so it's still a goddamn house of cards.

The whole house of cards falls down easily when you have that ONE game that annoyingly uses only a buggy rawinput for mouse cursor movements, or a different game that uses Windows Pointer Sensitivity for flick turns (like older WoW), and you have to postpone building your high-DPI utopia to prevent wrecking your muscle memory. Fortunately, as time passes, the games that hold back progress either get fixed, gain third-party workarounds (e.g. custom settings / community patches), or just stop being played, and we can have more and more high-DPI utopia.

I wish the industry would adopt the High Definition Mouse Extensions API proposal.


Re: 2000Hz vs 4000Hz [Mouse]

Post by r0ach » 24 Dec 2023, 15:59

Chief Blur Buster wrote:
23 Dec 2023, 04:18
unless you're a super-low-game-sensitivity user (e.g. moving your whole arm over a whole foot to flick turn)
This is what's always bothered me about competitive FPS gaming. Essentially training yourself to play the least fun way possible in order to try and maximize aim advantage. In pro baseball they have everyone using the same wooden bats. I'd love to see 'move arm across the entire room to reach recycle bin players' forced to play with mouse settings that are actually optimized for normal desktop use. That and the further you drift away from standardized equipment the less of a 'sport' something is in the first place.


Re: 2000Hz vs 4000Hz [Mouse]

Post by chandler » 06 Feb 2024, 16:53

Chief Blur Buster wrote:
23 Dec 2023, 04:55
[...]

I am sorry for the VERY late response, as I have not been active here for the last month or so.

I've read both your replies and just want to confirm:

- I am using a Dav3 (wired)
- new clean QCK+
- have had EPP ticked OFF for as long as I can remember, back to my first PC
- I tried 1600 dpi for quite some time and I'm starting to get used to it even in Windows (frankly I have no desire to lower the Windows sensitivity notches from 6/11)

My question is this: I saw that 800dpi isn't polling at more than ~2700 Hz, which makes 4000 not very usable with anything less than 1600.
I just need to understand how much benefit I gain from 4000Hz versus 2000Hz in-game, in order to finally decide whether I should stay on 1600 or lower it to my beloved 800dpi and just play at 2kHz -- PLAYING ON XL2566K @ 360Hz.

***CS2 ONLY*** -- I don't play legacy games


Re: 2000Hz vs 4000Hz [Mouse]

Post by chandler » 19 Feb 2024, 18:36

Especially with the game being very poorly optimized currently -- after the latest "big" patch, FPS dropped by 100-120 on average across all PCs and systems (with occasional stutters and even lower dips). I guess I need to "preserve" as many frames as I can; that's why I really need someone to explain the benefit of 4000 over 2000 to me.


Re: 2000Hz vs 4000Hz

Post by masneb » 27 Feb 2024, 04:05

joseph_from_pilsen wrote:
18 Dec 2023, 21:29
The reason why hardcore gamers don't use the highest DPI is simple: ultra-high DPI settings will cause your aim to be extremely shaky. The sensor reads too many parasite moves at high DPI, so the aim is shaky and doesn't hold the track. I played for quite a long time with 3200dpi and 0.xx sensitivity and it was great for sniping (sniper rifles like the Scout or AWP were extremely responsive) but terrible for rifling; the crosshair simply wasn't holding altitude and was hard to control, because parasite moves made it extremely sensitive to any micro-movements. When I switched back to 800 dpi I felt like I was getting an aimbot. The only thing worse was less responsive snipers.
400 to 800 dpi is a good trade-off: the crosshair stability is still good, and the mouse reacts noticeably faster. 800 to any higher value like 1600 is not such a good trade, and 3200 and above is already a Parkinson's simulator. Meanwhile, mouse response lag decreases with diminishing returns, while the amount of unwanted parasite movement added to your mouse movement increases exponentially with every DPI increment.
There is no such thing as "parasite moves". The higher the DPI, the more accurately your mouse captures your movement (assuming no interpolation or other shenanigans in the sensor or mouse firmware). That movement is your aim. Reducing DPI just means less input data; it doesn't make your movement any more accurate. If your input is off, the error is much greater at a lower DPI than at a higher DPI. If it's too sensitive, sensitivity can be turned down to compensate.

However, the big caveats are that most games don't properly support the extra decimal places needed for very high DPI/sens offsets, and a lot of mice (and setups) don't handle high DPI input well. It's very difficult to get a mouse that has relatively pure, unadulterated input at higher DPI; a lot of them will start interpolating or producing errors, and there is no way of knowing except by feel. RTINGS only tests error up to 1600 DPI, and how they even figure that out is a mystery.

Same reason I don't use 2k/4k/8k even though my mouse supports it: it seems a lot of games, and Windows itself, can't handle it very well. My guess is that USB itself has issues handling the stream of data coming out of your mouse if you play at high DPI with an 8k polling rate.

Your pad also very much influences aim at high DPI; there is no way I can use a cloth pad or even ceramic/glass due to the errors they produce. Cloth also gets dirty extremely fast and is hard to maintain.

800 DPI at 8k would be an extreme number of duplicated data points. You probably won't notice any difference at all, even at 2k. The mouse simply isn't generating enough data points for it to matter.
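Rough arithmetic behind that claim (a minimal sketch; the hand speeds are illustrative guesses, not measurements):

```python
# Hedged sketch: how many polls per second actually carry new movement data.
def polls_with_data(hand_speed_ips: float, dpi: int, poll_hz: int) -> float:
    counts_per_second = hand_speed_ips * dpi
    return min(counts_per_second, poll_hz)   # reports can't outpace incoming counts

for speed in (0.5, 2.0, 10.0):               # slow track, normal aim, fast flick (in/s)
    for hz in (2000, 8000):
        busy = polls_with_data(speed, 800, hz)
        print(f"{speed:>4} in/s @ 800 dpi, {hz} Hz: ~{busy:.0f} of {hz} polls carry movement")

# Only a fast flick at 800 dpi comes close to saturating 8000 Hz; slow tracking
# leaves most polls empty, so extra poll rate adds little at that DPI.
```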


Re: 2000Hz vs 4000Hz

Post by chandler » 28 Feb 2024, 06:54

masneb wrote:
27 Feb 2024, 04:05
800 DPI at 8k would be an extreme number of duplicated data points. You probably won't notice any difference at all, even at 2k.
What do you mean by "2000Hz isn't noticeable at 800dpi"?? I notice it on my desktop alone (4K at 1600dpi too).
