- In Cyberpunk 2077, whenever the game autosaves, there's a micro-freeze, and the same happens when I take an in-game screenshot.
This should not happen: writing a 4 MB file (autosave) or a 300 KB image (screenshot) is trivial for modern I/O subsystems.
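To put some numbers behind that claim, here is a minimal Python sketch (the file name and temp-dir location are arbitrary choices, not from the game) that times a synchronous 4 MB write including an fsync. On any modern SSD this completes well under a 60 FPS frame budget of ~16 ms, which is why the autosave freeze looks like a scheduling/interrupt problem rather than raw disk throughput:

```python
import os
import tempfile
import time

def timed_write(payload: bytes, path: str) -> float:
    """Synchronously write payload to path, fsync it, return elapsed seconds."""
    start = time.perf_counter()
    with open(path, "wb") as f:
        f.write(payload)
        f.flush()
        os.fsync(f.fileno())  # force the data out of the OS cache
    return time.perf_counter() - start

# 4 MB of random data, roughly the autosave size mentioned above
payload = os.urandom(4 * 1024 * 1024)
path = os.path.join(tempfile.gettempdir(), "autosave_test.bin")
elapsed = timed_write(payload, path)
print(f"4 MB write + fsync: {elapsed * 1000:.1f} ms")
```

Of course this only measures the write itself; a game's autosave also serializes state on the CPU first, but that work should be done off the render thread.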
- In AC Valhalla, the framerate fluctuates wildly and repeatedly drops close to 0 while the game is clearly streaming in new textures / effects / environment (I have a video demonstrating it).
Looking at the default MSI settings (Message Signaled Interrupts, not the company): the I/O controllers are set to "High" priority by default and occupy lots of low IRQ numbers (from -2 to -37, then the NIC from -39 to -63, and finally the graphics card at -65); their message limits are high too.
Unfortunately, tweaking those settings does not improve the situation, and can even cause the NVMe to stop working properly.
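For anyone who wants to check their own machine: the per-device MSI settings I was tweaking live under each device's registry key. A sketch of the layout in .reg syntax, with the device instance path left as a placeholder and the values purely illustrative (not a recommendation — as noted above, changing them can break the NVMe):

```
; Per-device MSI configuration (Windows registry)
; <instance path> is device-specific; find it in Device Manager under
; Details -> Device instance path. Values shown are examples only.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\<instance path>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
"MSISupported"=dword:00000001

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\<instance path>\Device Parameters\Interrupt Management\Affinity Policy]
; DevicePriority: 0 = undefined, 1 = low, 2 = normal, 3 = high
"DevicePriority"=dword:00000002
```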
Any ideas or suggestions on how to prevent I/O from completely interrupting the CPU?
Additional note: this occurs in all games, whether CPU-intensive or not, GPU-intensive or not. Neither the CPU nor the GPU is overheating or throttling in simple games, yet the micro-stutters still occur.
Specs:
13900K with a Noctua NH-D15, about 70 °C max package temperature while gaming
Asus Z790 Hero
RTX 4070 Dual at 1440p with a native G-Sync screen
PS: Is there a way to post images? I have several screenshots illustrating these observations.
* For those interested, a non-exhaustive list of things and combinations I tried, none of which solved the issue:
- BIOS: HT on/off, Resizable BAR on/off, E-cores on/off, etc.
- CPU: affinity settings, underclocking, setting power limits...
- Windows 10 and 11: HAGS, VRR, Game Mode, power settings (disabling core parking, etc.)
- NVIDIA: with/without frame limit, V-Sync, G-Sync, low latency...
- Extensive hardware testing of RAM, CPU, and GPU via stress tests, plus long-run stability testing
- Completely fresh Windows install, with only the drivers and one game