power saving settings in windows 10

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
TYT
Posts: 11
Joined: 26 May 2022, 08:17

Re: power saving settings in windows 10

Post by TYT » 03 Aug 2022, 05:40

I understand your concern for your PC. If you change DWM to prefer maximum performance, it will not mess with the OS or decrease the overall stability of your PC, but don't do that on a laptop. I don't recommend disabling idle states or disabling SpeedStep in the BIOS. I made this post to help people who don't have expertise in doing these things; I'm not trying to break their computers. Game Mode in Windows will not force your CPU to boost into its limits without compromising Intel's limitations: https://support.xbox.com/en-US/help/gam ... ming-on-pc I am sorry if anybody is scared to try what I'm posting [they are all safe] :D
240Hz Omen X 25f nvidia 1660 super razer huntsman TE Logitech G PRO X SUPERLIGHT razer viper 8k

kokkatc
Posts: 108
Joined: 23 Mar 2017, 13:49

Re: power saving settings in windows 10

Post by kokkatc » 04 Aug 2022, 01:21

Thatweirdinputlag wrote:
02 Aug 2022, 18:01
This is becoming a bit extreme; things like disabling idle states, throttling, or even forcing your GPU to run on high performance when the PC is doing nothing is just too much. Even things like disabling SpeedStep in the BIOS are becoming more of a religious act than an actual justified need. Turning on "Game Mode" on your PC will literally force your CPU to boost into its limits without compromising Intel's limitations (unless MCE is turned on). Please do your research on every feature and ask yourself whether you really need it turned off or not.

Don't get me wrong, your system can mostly handle that when it's properly configured. Most of the people here are just lost, inexperienced souls looking for answers on why their favorite game is performing differently than what they see elsewhere. Motherboard manufacturers most of the time tend to over-volt everything to try their best at giving the end user a plug-and-play experience free of troubleshooting. That said, they also include all the safety features like SpeedStep, thermal monitor, etc. Hardware failure becomes highly probable when the end user just disables those features without knowing that his motherboard was running on Intel's fail-safe values, shoving close to 1.5V into their CPU.

If you are someone who has somewhat of an old rig and you are trying to squeeze every last drop of performance you can out of it, then sure, go for it! Not without doing your research though; a slow-performing rig is still better than a broken one.
Okay, it's become clear that you may misunderstand how power-saving features were designed and intended to operate. Let's take SpeedStep for example, since you mentioned it.

SpeedStep was designed to throttle your CPU on the fly by changing frequency and voltage through control of the system's P-states. Every time the frequency and/or voltage is dynamically changed, you incur a latency penalty. This is a significant latency hazard, and you can easily identify the increase in latency by using LatencyMon by Resplendence, the premier DPC latency tool mind you. Let's also take into account that LatencyMon RECOMMENDS you disable SpeedStep for accurate DPC latency results, because SpeedStep absolutely cripples latency. SpeedStep should never be enabled if you care about performance and latency. Typically, competitive gamers care about these things. Casual gamers probably wouldn't care, nor would they likely identify the difference whether it's enabled or not. To me it's night and day, and any competitive gamer would easily identify the benefits.
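If you want a rough feel for this without LatencyMon, you can watch how late the OS wakes a sleeping thread under each power plan. A minimal Python sketch, not a DPC latency tool -- just a crude wake-up jitter probe (the 1ms sleep interval and sample count are arbitrary choices):

Code: Select all

# Crude sleep/wake jitter probe (sketch, not a LatencyMon replacement).
# Repeatedly sleep ~1ms and measure how far past the deadline the OS
# actually wakes us. Compare results with power saving on vs. off.
import time

def measure_wake_jitter(samples=2000, sleep_s=0.001):
    overshoots_us = []
    for _ in range(samples):
        start = time.perf_counter_ns()
        time.sleep(sleep_s)
        elapsed_us = (time.perf_counter_ns() - start) / 1000
        overshoots_us.append(elapsed_us - sleep_s * 1_000_000)
    overshoots_us.sort()
    return {
        "median_us": overshoots_us[len(overshoots_us) // 2],
        "p99_us": overshoots_us[int(len(overshoots_us) * 0.99)],
        "max_us": overshoots_us[-1],
    }

if __name__ == "__main__":
    print(measure_wake_jitter())

Run it once on Balanced and once on High Performance; the tail percentiles are where the difference tends to show.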

Now let's talk about the CPU throttling and CPU idle OS settings. The benefit of disabling these is eliminating any dynamic, on-the-fly throttling that results in latency penalties. There's a reason why latency goes down significantly when these are disabled. Is this for everyone? No. Is it for a serious or competitive gamer in, say, a competitive FPS? Absolutely. The trade-off is that your computer will require more power to run, but it will be more responsive with lower latency.
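For reference, the OS-side switch for this lives in powercfg. A hedged sketch of scripting the hidden "processor idle disable" setting -- IDLEDISABLE is, to my knowledge, the powercfg alias for it, but verify with `powercfg /aliases` on your own system before trusting this:

Code: Select all

# Sketch: toggle Windows processor idle states via powercfg (run as admin).
# IDLEDISABLE should be the alias for the hidden "Processor idle disable"
# setting under SUB_PROCESSOR -- confirm with `powercfg /aliases` first.
import subprocess

def set_idle_disable(disabled: bool) -> None:
    value = "1" if disabled else "0"
    for flag in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", flag, "scheme_current", "sub_processor",
             "IDLEDISABLE", value],
            check=True,
        )
    # Re-apply the current scheme so the change takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

if __name__ == "__main__":
    set_idle_disable(True)   # force C0; call with False to restore idle states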

Now, suggesting this is extreme is, I'd say, a bit dramatic itself. People who prefer latency over power savings will almost always choose lower latency, even if it means using more power to accomplish it.

I must also mention server builds as an example of why these settings even exist in the first place. Let's take a Lenovo server as an example. Depending on the application, low latency may be a requirement for it to operate as intended. Lenovo's server designers understand this requirement, and they also know their servers need to be tuned properly for low-latency applications. In their own guides it is clearly stated that SpeedStep, C-states, etc. should all be disabled for low latency / maximum performance if the application/service/operation requires it. There are countless professional low-latency use cases.
Server designers understand this, so they provide guides for their own systems that employ the same technologies (SpeedStep, C-states, etc.). These professionally written guides also typically give you two options for which settings to use: maximum efficiency, or maximum performance / low latency.

I get that what may seem extreme to you may actually be common practice in reality. These are home PCs we're talking about, and they're all designed to save power by default. A competitive FPS game is a low-latency application, and you have to tune your PC to get the best results. That's the reality of the matter.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: power saving settings in windows 10

Post by Chief Blur Buster » 04 Aug 2022, 04:40

Thatweirdinputlag wrote:
02 Aug 2022, 18:01
This is becoming a bit extreme; things like disabling idle states, throttling, or even forcing your GPU to run on high performance when the PC is doing nothing is just too much. Even things like disabling SpeedStep in the BIOS are becoming more of a religious act than an actual justified need. Turning on "Game Mode" on your PC will literally force your CPU to boost into its limits without compromising Intel's limitations (unless MCE is turned on). Please do your research on every feature and ask yourself whether you really need it turned off or not.
Game Mode is an excellent move, but sometimes we need to go beyond, because Microsoft never tested 500Hz.

Unfortunately this tweak isn't exactly extreme... if you own a 360Hz, 390Hz, or prototype 500Hz monitor, combined simultaneously with the Razer 8KHz mouse, combined simultaneously with strobing, combined simultaneously with certain non-VSYNC-OFF sync technologies, combined with a game (e.g. Quake) that is able to keep up with the refresh rate. The perfect storm.

At 60Hz, 1ms was unimportant, but at 360Hz+, 1ms has actually become human-visible. The refresh rate race removed a lot of weak links, making ever-tinier milliseconds even more visible. In some cases, a 0.5ms stutter or 0.5ms of motion blur is human-visible (if your MPRT is less than the stutter error margin in milliseconds). For example, a future prototype 4K 240Hz strobed display (0.3ms MPRT) can actually make quite a lot of sub-millisecond jitter human-visible where it used not to be.

It's part of the envelope-pushing of the Vicious Cycle Effect:

Power Management has created a lot of major problems (FreeSync stuttering and capped-framerate stuttering), because idle states sometimes take more than 0.5ms-1ms to wake up on some motherboards when you're not in High Performance.

Please see The Amazing Human Visible Feats Of The Millisecond

I am sorry to be the messenger, but this is actually far more legitimate (more common) than the voodoo EMI stuff (much rarer). Even though we are kind of tired of the EMI forums (especially forum members who ONLY sign up for the EMI forum, even though that issue is legitimate in many cases), this type of "force high power" tweak solves much more common issues we've discovered in the refresh rate race (e.g. 8000Hz mouse + 360Hz monitor + etc) when you've already whac-a-mole'd a lot of weak links, and are now running into problems (e.g. VRR microstuttering that isn't the game engine's fault) that were recently traced to flawed power management shenanigans in drivers, etc.

We are running headlong into a lot of weak links (even VSYNC OFF -- due to its microjitter issues -- needs to become obsolete because it partially masks 240Hz-vs-500Hz differences) -- we need lagless non-VSYNC-OFF technologies to become esports-popular in the next 5-10 years in our 1000Hz future.

The refresh rate race is creating perfect storms where weak links (LCD GtG, power management, etc.) create human-visible 1ms microjitters.

This tweak, however, is a double-edged sword -- it can also worsen things on some computers. On a computer, especially an overclocked one, enabling High Performance can trigger thermal-throttling microjitters worse than power-management microjitters. In that case, this becomes a pick-your-poison microjitter effect.

The bottom line is that as (Hz=increasing) AND (synctech=clearer motion quality than VSYNC OFF) AND (MPRT=super low) AND (game=keeping up) AND (display=getting closer to retina rez), we get the necessary perfect storm for ever-tinier milliseconds rising above the human-visibility noisefloor, because other sources of jitter were eliminated. Scientifically, this is a serious matter in the current refresh rate race towards retina refresh rates. As researchers recently published a peer-reviewed paper showing the Razer 8KHz is a legitimately human-visible improvement, we have to call this plausible, if you've already whac-a-mole'd other higher-priority microjitter weak links.

(Also, keep in mind sometimes the improvement is in microjitter, not in lag -- anything human-visibly-beneficial is fair game here, and that is my POV -- if a tweak is lag-worthless but jitter-improvement-worthy, that doesn't dismiss the tweak.)

Depending on how far you've pushed your high-end gaming rig, this is (on some systems) more common than rarer causes of microjitter (e.g. EMI).

240Hz-vs-360Hz looks nearly invisible (a 1.1x difference instead of 1.5x) largely because of weak links that also include microjitter. (High-frequency microjitter is additional motion blur above-and-beyond simple perfect square-stairstep sample-and-hold mathematics -- like a fast-vibrating guitar string blending into blur, microjitter/microstutter that vibrates 70 times per second is additional motion blur. You can see this effect at www.testufo.com/eyetracking#speed=-1 (SEE FOR YOURSELF!)... watch the bottom UFO for at least 20 seconds to understand this effect. This is applicable to motion that microjitters 70-200 cycles per second at 500fps 500Hz, diminishing 240Hz-vs-500Hz comparisons, above-and-beyond the slowness of LCD GtG.)

💡 DID YOU KNOW? 4ms stutters are invisible in 16.7ms MPRT of blur (60Hz), but 4ms stutters are human-visible at 2ms MPRT (strobed display, or the recent 500Hz prototype display!). The thickness of motion blur is no longer enough to hide the stutter vibration amplitude.

💡 DID YOU KNOW? A 1ms stutter error = a 4 pixel offset at 4000 pixels/sec, which is only one screen-width per second on a 4K monitor. This jump can be visible at low MPRTs (e.g. strobed 0.5ms MPRT at 4000 pixels/sec = 2 pixels of motion blur).
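Both tips above are the same one-line multiplication: pixels = speed × time. A quick sketch of the arithmetic (the 4000 pixels/sec tracking speed is the example figure used above):

Code: Select all

# The arithmetic behind the two tips above: both blur width (from MPRT)
# and stutter offset are just speed multiplied by time.
def px(time_ms: float, speed_px_per_sec: float) -> float:
    return speed_px_per_sec * time_ms / 1000

speed = 4000  # pixels/sec, about one screen-width per second on a 4K monitor
print(px(1.0, speed))    # 4.0  -> 1ms stutter error = 4 pixel offset
print(px(0.5, speed))    # 2.0  -> 0.5ms strobed MPRT = 2 pixels of blur
print(px(16.7, speed))   # 66.8 -> 60Hz sample-and-hold blur easily hides a 4ms (16px) stutter
print(px(4.0, speed))    # 16.0 -> the same 4ms stutter towers over 2ms MPRT (8px) blur

The rule of thumb: a stutter hides when its pixel offset is much smaller than the blur width, and pops out when it exceeds it.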

Power management effects can exceed 1ms, since Windows is not always fast at waking up CPUs on all motherboards on all implementations -- even a 20% slowdown of CPU cycles in an 8KHz mouse driver can peg a core and create an issue. A stationary mouse can cause the CPU to speedstep down; then you suddenly move the mouse, 8000 polls per second, and it overloads a CPU core (only 5000 polls processed at 100% CPU in the mouse driver) until Intel speedsteps it back up -- and Intel won't speedstep it up in 1/8000sec -- interfering with your mouse, since mouse drivers are CPU-bound and 8000 polls a second pushes the limits of one core of your CPU. Research has shown that even 1 missed poll (1/8000sec) shows as a human-visible missing-mouse-pointer gap in its phantom array effect, and this form of microjitter can be accidentally perceived/interpreted by some humans as 'lag'. This is yet another famous BlurBusteresque "Milliseconds Matter" example, far more relevant and scientifically measurable than random voodoo one-off EMI situations.
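To illustrate the backlog mechanic with the numbers from the scenario above (8000 polls/sec arriving, ~5000/sec serviced on a downclocked core) -- the 30ms ramp-up window is purely an illustrative assumption, since real P-state ramp times vary:

Code: Select all

# Toy model of the poll backlog described above: an 8000Hz mouse feeding
# a core that can only service 5000 polls/sec until the clock ramps back up.
def backlog_after_ramp(poll_hz=8000, capacity_hz=5000, ramp_ms=30):
    arrived  = poll_hz * ramp_ms / 1000      # polls arriving during the ramp
    serviced = capacity_hz * ramp_ms / 1000  # polls actually processed
    deficit  = arrived - serviced
    # Each unserviced poll represents 1/8000 sec of mouse motion.
    return deficit, deficit / poll_hz * 1000  # (missed polls, ms of motion delayed)

missed, lost_ms = backlog_after_ramp()
print(f"{missed:.0f} polls backlogged, ~{lost_ms:.1f} ms of mouse motion delayed")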

Related, see Example Blur Busters Success Story: Incubating Research on Mouse 1000Hz vs 8000Hz, which showed an excellent example of human-visible submillisecond differences:

Image

Observe that in the 1000 and 2000 rows, the gap-variances are submillisecond!

And this is applicable to mouseturns/pans in games too (they're stroboscopic too, like a mouse cursor, e.g. a fixed gaze on the crosshairs while the background stroboscopics past your fixed view -- see The Stroboscopic Effect of Finite Frame Rates, an article apparently widely spread internally at NVIDIA).

Now, a pollrate of 8000 pegs one of my CPU cores if the core is downclocked a bit, and in this case 8000Hz looks worse than 1000Hz in some software until I lower it to 2000-4000Hz, or force the CPU/GPU to never downclock. Since the clockspeed changes lag behind somewhat (by more than 1/8000sec), this is enough time to create microjitter havoc if you're unusually sensitive to jitter. (Everybody is sensitive to different things. Tearing or stutter or blur... sometimes all of the above!)

Unintended consequences, my ***...

<FuturamaBenderBiteMyAss.mp3>

As a well-credited display researcher, referenced in over 25 scientific peer-reviewed papers (www.blurbusters.com/area51 and www.blurbusters.com/research-papers) -- Blur Busters, already far beyond the "Humans Can't See 30fps-vs-60fps" debate, famously incubates/convinces researchers to study new outlier areas that have gained new legitimacy due to ever-rising Hz lifting the veil of formerly-invisible effects.

Also, when I tested raster-interrupt-style beam racing with NVIDIA/AMD graphics cards (successfully -- Tearline Jedi), even 10 microseconds moved a tearline down 1 pixel. This technique was adopted by RTSS Scanline Sync and Special-K Latent Sync -- sync technologies far superior to VSYNC OFF if you can sustain framerate=Hz at 50%-or-less GPU utilization. But these sync technologies are super-sensitive to power management settings. Using this specific Full Performance tweak is very useful to RTSS Scanline Sync users for improved reliability!
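The 10-microseconds-per-pixel figure falls out of scanout timing: one row of pixels is delivered roughly every refresh-period ÷ total-scanlines. A quick sketch (the 1125-line vertical total is an assumption borrowed from classic 1080p video timings; actual vertical totals vary by monitor):

Code: Select all

# Why ~10 microseconds can move a tearline ~1 pixel: scanout delivers one
# row of pixels roughly every (refresh period / total scanlines) seconds.
def us_per_scanline(refresh_hz: float, total_lines: int) -> float:
    return 1_000_000 / (refresh_hz * total_lines)

# Assumed 1125 total lines (1080 visible + blanking); real timings vary.
for hz in (60, 120, 240):
    print(hz, "Hz:", round(us_per_scanline(hz, 1125), 2), "us per scanline")
# 60Hz -> ~14.8us, 120Hz -> ~7.4us, 240Hz -> ~3.7us per line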

Also, different sync technologies with different microjitter timing sensitivities are continuously being invented, and some systems have more FreeSync stutter problems when running in Balanced mode (less of a problem with a native G-SYNC chip, since FreeSync is more software/driver-driven). Game Mode often fixes it, but this is a perfect example of how power management really screws things up already, even at 144Hz. And now 500Hz consumer monitors are hitting the market soon, and Game Mode doesn't quite whac-a-mole all weak links completely at that stratosphere...

This tweak is mostly worthless if you're on a 60Hz monitor, though.

Most users should stick to Game Mode, do their homework and research, and not test unknown tweaks -- but this is definitely not a "placebo tweak" in certain perfect-storm refresh-rate-race situations, especially on the many motherboard/CPU/driver combinations that don't have microsecond-accurate wakeup abilities.

Tweak Benefits Fact Check:
Scientifically Plausible due to refresh rate race side effects, with known caveats
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Thatweirdinputlag
Posts: 305
Joined: 27 Aug 2021, 14:09

Re: power saving settings in windows 10

Post by Thatweirdinputlag » 05 Aug 2022, 04:43

kokkatc wrote:
04 Aug 2022, 01:21
Chief Blur Buster wrote:
04 Aug 2022, 04:40
There's literally nothing dramatic about being cautious, hence my statement "your PC can handle it, if configured correctly". I never said I'm against those things; I said those features are a bit extreme and further research might tell the user whether they actually need them or not. Your average Joe here doesn't know how to run and maintain a server station, let alone own a 500Hz monitor and know how a missed poll out of 8000 is practically visible.

People take shortcuts, they love doing so! What mostly triggered me is calling on readers to switch thermal monitor off, and to disable SpeedStep and process idle states. If you can't see how this is problematic on a poorly configured system in the hands of an unknowledgeable user who doesn't even own any form of hardware monitoring software, then I don't know what is.

SpeedStep, for example, would only induce latency if it shifts processor speed, which is something that will never happen when Game Mode is activated and Intel's default limitations aren't compromised. Some games might trigger the AVX offset, but it's better to adjust that than to turn off SpeedStep completely. SpeedStep will only bring your CPU up to its max non-turbo frequency and not its full boost (that's Intel Turbo Boost 2.0 & 3.0). Even with SpeedStep off, your CPU might still downclock during a gaming session if Game Mode is off and you're running on the Balanced power plan, if your CPU decides there's not enough load to justify staying at full rated turbo boost frequency. To avoid that, just switch to the High Performance power plan, or adjust your minimum processor state to 100%.
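(For anyone who wants that last suggestion as a script rather than a Control Panel dive, a hedged sketch using powercfg -- PROCTHROTTLEMIN is, as far as I know, the powercfg alias for "minimum processor state", so confirm with `powercfg /aliases` first.)

Code: Select all

# Sketch: pin the Windows "minimum processor state" to 100% (run as admin).
# PROCTHROTTLEMIN should be the alias for that setting -- verify it exists
# on your machine with `powercfg /aliases` before relying on this.
import subprocess

def set_min_processor_state(percent: int) -> None:
    for flag in ("/setacvalueindex", "/setdcvalueindex"):
        subprocess.run(
            ["powercfg", flag, "scheme_current", "sub_processor",
             "PROCTHROTTLEMIN", str(percent)],
            check=True,
        )
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

set_min_processor_state(100)  # use e.g. 5 to restore downclocking at idle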

Disabling process idle states (not C-states in the BIOS) will force your CPU to run in the C0 state at all times. Yes, it will reduce latency related to cores switching states while gaming loads fluctuate, but it will also unjustifiably increase the stress on your entire system while it's idle. My processor, for example, rests at about 10W, 28°C, and 0.7V; if I disable process idle states, it will sit at approximately 75-80W, 52-54°C, and ~1.25V (I've set a generous LLC and hyper-threading is disabled). On a poorly configured system, where the BIOS is mostly running on Auto, you're looking at even higher temps and voltages considering lighter LLC levels. Turn thermal monitor off and watch your system fail. It's exactly why I encourage visitors to do their own research on every feature; I never said tweaking never yields any benefits. Besides, not everyone is willing to sacrifice a big chunk of their system's life just to gain a few percent more performance. Surely there are other non-intrusive ways to manage a slight performance increase -- things like Game Mode, and having shortcut icons on your desktop that trigger the High Performance power plan and the Balanced one on the fly.

Things like DWM.exe or Explorer.exe and adjusting their NVIDIA power settings to maximum performance have absolutely no research behind them -- none that I've come across, at least, when I looked them up. It's like asking someone to stand next to a chair while waiting for something instead of sitting; they can surely stand, but there's absolutely no reason to do so lol.

Summing things up: please do your own research. There's nothing I enjoy reading more than a well-put post that has context and reasoning, where its merits are applicable with full disclaimers on why it's needed or not -- which is something that obviously some people fail to understand in here. My whole point was to encourage tweaking with your system's longevity as a priority. Just because I'm able to run my chip at 5.4GHz with 1.44V while it's scorching hot at 92°C on moderate gaming loads doesn't mean that the extra 2-3% performance gains are worth the considerable amount of extra stress on my system.
Rog Strix Z79i - Intel 13700K - 4090 OC ROG Strix - 7200 Trident G.Skill - 1TB SK Hynix Platinum P41 - 1000W ATX3.0 Asus Tuf - 34'' Odyssey OLED G8 - FinalMouse Tenz S/Pulsar Xlite V2 Mini - Wooting 60HE - Sennheiser HD 560s - Shure SM7b - GoXLR Mini

User avatar
BTRY B 529th FA BN
Posts: 524
Joined: 18 Dec 2013, 13:28

Re: power saving settings in windows 10

Post by BTRY B 529th FA BN » 05 Aug 2022, 07:18

That's the great thing about choosing a power plan. I personally have shortcuts to all my power plans on my desktop. When gaming, I select the plan that I've configured with all power-saving features disabled.
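A hedged sketch of the same idea as a single toggle script instead of two shortcuts -- the GUIDs below are Windows' stock Balanced and High performance scheme GUIDs, which custom plans won't match:

Code: Select all

# Sketch: flip between the built-in Balanced and High performance schemes.
# Custom power plans have their own GUIDs (see `powercfg /list`).
import subprocess

BALANCED  = "381b4222-f694-41f0-9685-ff5ecb85f744"
HIGH_PERF = "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"

def toggle_power_plan() -> None:
    active = subprocess.run(["powercfg", "/getactivescheme"],
                            capture_output=True, text=True, check=True).stdout
    target = BALANCED if HIGH_PERF in active.lower() else HIGH_PERF
    subprocess.run(["powercfg", "/setactive", target], check=True)
    print("Switched to", "Balanced" if target == BALANCED else "High performance")

if __name__ == "__main__":
    toggle_power_plan()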

DPRTMELR
Posts: 165
Joined: 12 Apr 2022, 13:42

Re: power saving settings in windows 10

Post by DPRTMELR » 06 Aug 2022, 13:28

Sorry for a question that goes against the spirit of the forum:

I had power settings all unthrottled in the BIOS, but played for about a week without realizing the Windows power plan wasn't set up properly. It was "good enough" that I did not realize it was being capped. Do power settings still matter today in 2022 if I never alt-tab out of a game? I mean the Windows power plan specifically, because there's actually some BIOS setting for "max performance voltage up time" or something similar, so BIOS settings still do a lot.
Most adults need 7-8 hours of sleep each night. - US FDA

Post Reply