Eonds wrote: ↑24 Mar 2022, 06:59
our system is semi-stock. You're still on Windows, with a high standard deviation that can't measure sub-ms differences concretely. You're no doubt competent to some degree, so I won't be too harsh. The equipment I'm talking about costs hundreds of thousands of dollars, and it's mostly companies like AMD, Intel, and Nvidia that use it. I don't see the point of this post other than to look cool. It's true that most people have terrible systems. I think if you're not an actual expert on the topic you shouldn't be posting results that are inconclusive and inaccurate.

Every BIOS (basically) has SMIs. They do cause latency spikes. This is !ONE! out of THOUSANDS of factors which you cannot change UNLESS you're HIGHLY knowledgeable. You'll only see what they want you to see. They don't care about you or anyone. A 1000 FPS camera is not good enough either. Do you think a company that has spent billions of dollars on measuring latency would use a 1000 FPS camera to measure system latency ON WINDOWS?

"Gaming Tweak Communities," says the drone shitposting on blurbusters with useless information and inaccurate results. Please don't talk shit to me; I'm trying to be constructive. In reality we both know why you made this post, and it's cringe. Also, Game Mode itself has changed across different Windows versions. You have hundreds of power-saving features and hundreds of other useless clock-gating/power-gating things bogging down your GPU, plus thousands of other factors.

It's hilarious, because nothing I said was false and you still manage to turn into a Twitter tweaker because I pointed out your false measurements. This is the problem with science usually: EGO gets in the way. I don't care how you feel, I care about actual measurements. Leave your shitty attitude elsewhere. Talk about adding nothing; look at your post, dude....
I'll enlighten you: I'm not here to chase sub-millisecond input latency differences, so maybe this wasn't clear to you. But I'm under the impression that you do not entirely grasp the idea behind my tests. I was trying to see whether there was any noticeable or measurable difference in the normal millisecond range. (Considering that I mentioned I was using a 960 fps camera, this should have been obvious to you, yet you didn't even grasp that.)
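To put rough numbers on the camera's resolution (a back-of-the-envelope sketch, not part of the original tests): at 960 fps each frame spans about 1.04 ms, so a single measurement is quantized to roughly one frame interval. That is fine for millisecond-range comparisons and useless for sub-millisecond ones.

```python
# Back-of-the-envelope: time resolution of a high-speed camera.
# One frame every 1000/fps milliseconds; a single measurement cannot
# resolve differences much smaller than one frame interval.

def frame_interval_ms(fps: float) -> float:
    """Time between consecutive frames, in milliseconds."""
    return 1000.0 / fps

print(f"960 fps  -> {frame_interval_ms(960):.3f} ms per frame")
print(f"1000 fps -> {frame_interval_ms(1000):.3f} ms per frame")
```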
As for your claim that the 'MSI Mode' option is 'designed' to reduce latency (I assume you are referring to my tests that include 'MSI Mode'): that test obviously refers to the registry modification of a specific PCI device and its modified operation. If it were solely created to lower latency, why do you think certain devices go haywire when this registry addition is forced, causing blue screens or distorted audio (most sound cards, for example)? In fact, if a device is not using MSI by default, as decided by its driver, the whole registry entry is absent unless you either add it yourself or use one of these tools like "MSI Util". Or do you also accept that MSI mode was 'designed' to make systems unbootable or to introduce other quirks?
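For concreteness, here is a hedged sketch of the registry location such tools toggle. The subkey path follows Microsoft's documented scheme for enabling MSI on a PCI device; the device instance ID in the usage line is a made-up placeholder, and the actual winreg calls are shown only as comments since they require a Windows machine.

```python
# Sketch of the registry entry that tools like "MSI Util" flip.
# Per Microsoft's documentation, a driver (or a user forcing it) opts a
# PCI device into MSI via the MSISupported DWORD under this subkey.

MSI_SUBKEY = (r"Device Parameters\Interrupt Management"
              r"\MessageSignaledInterruptProperties")

def msi_key_path(device_instance: str) -> str:
    """Build the HKLM subkey holding MSISupported for one PCI device
    instance. The instance ID format is illustrative only."""
    return rf"SYSTEM\CurrentControlSet\Enum\{device_instance}\{MSI_SUBKEY}"

# On Windows you would read it roughly like this (not runnable here):
# import winreg
# key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
#                      msi_key_path(instance_id), 0, winreg.KEY_READ)
# value, _ = winreg.QueryValueEx(key, "MSISupported")  # 1 = MSI enabled
# If the MessageSignaledInterruptProperties key is absent entirely, the
# driver never opted in and Windows uses line-based interrupt emulation.

print(msi_key_path(r"PCI\VEN_8086&DEV_1234\3&11583659&0&FE"))  # placeholder ID
```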
If you are talking about message-signaled interrupts in general: they were most likely invented to get rid of the old physical interrupt lines and IRQ sharing. (Yes, indirectly there will be latency reductions.)
But the 'MSI Mode' you are referring to is part of Windows's driver system that lets manufacturers or drivers dictate whether to use native MSI or the emulated legacy line-based interrupt method. Suffice it to say, both travel over the same PCIe bus, as there are no physical IRQ lines anymore. And if your system is properly configured, no device will share interrupts. If one does (in emulated line-based interrupt mode), then yes, there might be a noticeable improvement to be seen, but even that depends entirely on which other device shares the same IRQ, its handler execution times, PCIe LTR, etc.
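To illustrate that last point with a toy model (the handler costs below are invented numbers, not real kernel timings): on a shared line-based interrupt, the kernel has to offer the interrupt to every handler registered on that line, so one device's worst-case service latency grows with the other handlers' execution times.

```python
# Toy model of shared vs. dedicated IRQ lines. Handler costs are
# made-up values in milliseconds, purely for illustration.

def worst_case_latency_ms(handlers: dict[str, float], target: str) -> float:
    """Worst case for `target`: every other handler sharing the line
    runs (and reports "not mine") before the target's handler fires."""
    others = sum(cost for name, cost in handlers.items() if name != target)
    return others + handlers[target]

dedicated = {"audio": 0.05}                          # audio alone on its IRQ
shared = {"audio": 0.05, "sata": 0.20, "usb": 0.10}  # three devices, one line

print(f"dedicated IRQ: {worst_case_latency_ms(dedicated, 'audio'):.2f} ms")
print(f"shared IRQ:    {worst_case_latency_ms(shared, 'audio'):.2f} ms")
```

With properly routed MSI every device gets its own vector, so the "others" term disappears, which is exactly why the improvement only shows up when something was sharing the line in the first place.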
You have zero knowledge about anything; stop wasting our time. Blocked.