> To be clear, you don't need to manually end all processes in the Task Manager; you just need to make sure you don't have other particularly heavy windows/programs and/or background processes (in the taskbar) running, and only IF you're experiencing sync error messages in TestUFO. If not, you can leave whatever you have open, open.

Cool, I'll try that again under those conditions then. Thank you!
> If your system "posts" (aka starts up) and you can use it for hours on end without BSODing, then it's probably fine. Severe enough static damage would likely prevent the system from starting at all.

It does indeed post. The longest I've left it on is 24-48 hours, since I sometimes leave it running tests and benchmarks like MemTest86 overnight. I want to run stress tests like FurMark overnight too, but that'd likely be disastrous if the temps got too high without me there to stop the test.
> I don't know of any modern games with zero stutter on PC. Games with minimal stutter usually pre-compile shaders, though, like Horizon Zero Dawn, for instance.

That's good to know. I don't currently have Horizon, but I'll buy it to test that out, especially since it's 67% off right now on Steam. Do any older games run with minimal stutter on PC? I thought a game from 2013 (Arkham Origins) would work fine even at 4K 140-141 FPS, but perhaps I'm overestimating the 6800 XT's capabilities. Should a 6800 XT be able to run Arkham Origins and other games that are 9-10 years old at 4K 140-141 FPS, or at least at 1440p 140-141 FPS upscaled to 4K with RSR or FSR? How far back do I have to go to find games with minimal stutter at 4K 140-141 FPS? I suppose it depends on how well-optimized the game is, and that you can't go too far back to stuff like Deus Ex from 2000 or F.E.A.R. from 2005, since those aren't optimized for modern systems. What's the sweet spot for minimum stutter then? Would something like MGSV from 2015 or Minecraft Bedrock work? I do want to test open-world games since they're obviously giving me the most trouble, so open-world suggestions in general would be good.
Neon White is from last year and works great for the most part, but it's not intensive at all, perhaps by design, since it's meant for speedrunning. Valorant also has minimal stutter, though I have to wonder if Riot Vanguard is causing issues since it's kernel-level... and anticheat programs in general, such as BattlEye, which I also have through Rainbow Six Siege. I'll try uninstalling Vanguard, Valorant, Siege, BattlEye, and any other anticheat services I have.
I do notice that Neon White, Valorant, and Siege as well as some other games drop to 139 FPS when something "new" happens. By "new," I mean something that I haven't seen yet in that particular play session. For example, it happens if I see a new enemy in Neon White or use a new gun card. If I replay that same level, it runs nearly flawlessly at a completely consistent 140 FPS. I wonder if this is related to the shader precompilation matter you mentioned. Since it's already compiled, maybe it works flawlessly the second time? I don't know though. That doesn't happen in every game, just a few games.
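If it helps to picture it, that pattern is basically compile-on-first-use caching: the first time an effect appears, its shader has to be compiled (a one-time hitch); every later use hits the cache. A purely illustrative sketch with made-up names and timings, not how any actual engine or driver implements it:

```python
import time

# Toy compile-on-first-use shader cache. The names and the 50 ms
# "compile" delay are invented for illustration only.
shader_cache = {}

def get_shader(effect):
    """Return a compiled shader, compiling it the first time it's seen."""
    if effect not in shader_cache:
        time.sleep(0.05)                      # stand-in for a compile hitch
        shader_cache[effect] = f"compiled({effect})"
    return shader_cache[effect]               # cache hits are near-instant

# A "new" effect stalls its frame once; replays reuse the cached result.
for frame, effect in enumerate(["enemy_glow", "enemy_glow", "gun_card_fx"]):
    start = time.perf_counter()
    get_shader(effect)
    print(f"frame {frame}: {(time.perf_counter() - start) * 1000:.1f} ms")
```

The replayed level running flawlessly would then just be every lookup hitting the warm cache.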
> Just load times.

That's good to hear. Save times don't really matter anyway, except in games where autosaves happen during gameplay. If I can get my stuttering down to happening rarely, plus whenever an autosave fires, I'd be perfectly fine with that. If I see the autosave symbol, I can just stop playing for a few seconds. And at least that's an actual, identifiable reason for stuttering, as opposed to who knows what.
> I found this capture of what is supposedly your monitor model's OSD menu. Is what is highlighted in green below the same as the fluctuating readout on your display, or is it separate?

Yes, that is indeed the same as the fluctuating readout on my display. When my refresh rate changes, the readout changes to whatever the current refresh rate is. For example, I just opened Arkham Origins. For some reason, the intro sequence only displays at 30-40 FPS even though it displayed at 140 FPS before, and if I open the OSD, it fluctuates between 70-85 Hz. This intro sequence was also another reason I was concerned about FreeSync. If the FPS drops to 30-40, shouldn't my refresh rate drop to 48 Hz, since I believe my FreeSync range is 48-144 Hz according to my monitor's specs and AMD Adrenalin (if I remember correctly)? Or should it reset back to 144 Hz? Instead, it stays in the 70-85 Hz range, which seems like a completely random number. Additionally, is the refresh rate displayed in the OSD slightly behind? It sure seems like it's a second or two behind, since, say, a drop to 130 FPS is often accompanied by a change in the readout to about 134 Hz two seconds later, but I don't know for sure.
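Could this be Low Framerate Compensation? My understanding, and this is an assumption about the driver rather than something I've confirmed, is that when FPS falls below the 48 Hz FreeSync floor, the driver repeats each frame so the effective refresh stays inside the supported range. A toy sketch of that arithmetic (the constants and the lfc_refresh function are made up for illustration, not anything AMD actually exposes):

```python
VRR_MIN, VRR_MAX = 48, 144  # FreeSync range per the monitor specs

def lfc_refresh(fps):
    """Repeat frames until the effective refresh lands inside the VRR range."""
    multiplier = 1
    while fps * multiplier < VRR_MIN:
        multiplier += 1
    return min(fps * multiplier, VRR_MAX)

for fps in (30, 35, 40):
    print(f"{fps} FPS -> {lfc_refresh(fps)} Hz")  # 60, 70, 80 Hz
```

Doubling a 35-42 FPS signal gives 70-84 Hz, which is suspiciously close to the 70-85 Hz range on the readout.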
> Monitor firmware versions aren't reverted when you do a clean install of Windows or swap out PC hardware components, since the update is done directly on the monitor itself.
>
> As for downgrading to the previous version, that's model-dependent, but 99% of the time the answer is no.

Nice, I won't go through that process again then. Gigabyte's update process is notoriously buggy; it took me 2-3 hours to download OSD Sidekick and then the monitor firmware, since the update kept failing for no reason... glad I don't have to do that again and infect my PC with bloatware I can't delete. It's good that the firmware is independent of my PC too. If any more firmware updates release in the future, I'll use my terrible laptop. It is terrible, but that's why I'm willing to break it even more haha.
> If you go that route, post here:
>
> Fair warning, I have 0 interest in exploring that subject.

Thanks, I'll post there then! Frankly, I also have 0 interest in exploring this subject, but I'll do what I must. Maybe I'll learn more about how electricity works and all...
> Yes, but there's typically (though not always) less, because: 1) consoles don't have shader-compilation stutter, since a console is a fixed platform and all shaders can be pre-compiled automatically, whereas shader compilation on PC depends entirely on your particular combination of hardware, so shaders have to be compiled on demand unless the devs implement pre-compilation (either at the title menu when you first launch the game, or asynchronously as you play); and 2) the PS5 specifically, unlike PC currently, utilizes the full capability of higher SSD speeds to reduce asset-streaming-related stutter.
>
> PC is still stuck at legacy HDD-era I/O speeds, even on SSDs. This will hopefully be remedied soon by DirectStorage, but it remains to be seen how effective it will be.

I was wondering why consoles have more consistent performance even though they're weaker; that makes sense. I guess the modular nature of PCs has its disadvantages too. And it's interesting that the PS5 can use SSDs better than PCs can. Is that also because the PS5 is a fixed platform while PCs are not? As for DirectStorage, will all SSDs be compatible with it, or do I need an SSD specifically designed with DirectStorage in mind? I currently have a Samsung 970 Evo Pro 2TB and a Crucial P5 Plus 2TB. I'd imagine PCIe 3.0 SSDs would not be compatible, but perhaps PCIe 4.0 SSDs would? The P5 Plus was only $20 more including a heatsink, even though it has almost double the read and write speeds and is PCIe 4.0 instead of 3.0, so I decided to get it. The only problem is that it runs hotter... indeed, I'm pretty sure I melted part of the damn thing after only four days, so that is an issue. I assumed the mobo's included M.2 heatsink would be good enough, but I'll buy another P5 Plus and actually use the included heatsink to see if it's a me problem or a Crucial problem.
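For what it's worth, here's a toy picture of why HDD-era I/O hurts asset streaming even on fast SSDs. This is not the actual DirectStorage API (that's a C++ Windows interface); it just contrasts blocking one-at-a-time reads with keeping many requests in flight, which is the batched model DirectStorage-style I/O is meant to enable:

```python
import asyncio
import time

async def read_asset(name):
    await asyncio.sleep(0.01)   # stand-in for one storage request's latency
    return f"{name} loaded"

async def legacy_style(assets):
    for asset in assets:        # each read waits for the previous to finish
        await read_asset(asset)

async def batched_style(assets):
    await asyncio.gather(*(read_asset(a) for a in assets))  # all in flight

assets = [f"texture_{i}" for i in range(32)]
for loader in (legacy_style, batched_style):
    start = time.perf_counter()
    asyncio.run(loader(assets))
    print(f"{loader.__name__}: {time.perf_counter() - start:.2f}s")
```

The drive is equally fast in both cases; only the request model changes, and that gap is roughly what separates today's PC path from what the PS5 (and hopefully DirectStorage) can exploit.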