Comparing bare-bones Win7 vs 10 install
Posted: 02 Feb 2019, 20:41
Been running two bare-bones Windows installations for a while now and doing all kinds of testing.
Stripping them of useless junk and/or disabling whatever can be disabled without affecting said games (just for the sake of consistency and no useless background stuff going on).
I can sense anywhere from 5 to 10 ms of extra input lag (either that, or mouse smoothing going on) when tracking in-game, and so far Windows 7 has always come out on top when it comes to input latency. Now for flicks and trigger reactions, I clearly cannot tell the difference at such a small number. But when it comes to tracking in games and keeping the laser pinpointed on a specific pixel area whilst moving, the delay becomes quite noticeable.
All testing is consistently done with an in-game cap, fullscreen, and G-SYNC without V-SYNC (I don't mind the small tearing, and I know why I prefer it: the frames above the tear get shown on screen a tad faster this way. The CryEngine game I play has a quite consistent fps cap, which can be further tweaked by changing the timer from 1 ms to 0.5 ms, or vice versa).
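To illustrate why the timer setting matters for a frame cap, here's a minimal sketch (an assumed model, not CryEngine's actual limiter code): a sleep-based limiter can only wake up on timer-interrupt boundaries, so its worst-case oversleep is roughly one timer period, and halving the period from 1 ms to 0.5 ms halves that worst case.

```python
# Hypothetical model of a timer-driven frame limiter: the thread asks to
# sleep until the target frame time, but can only be woken at timer-tick
# boundaries, so each frame may run long by up to one timer period.
def worst_case_frame_time(target_ms: float, timer_period_ms: float) -> float:
    """Target frame time plus the maximum oversleep of a timer-based wait."""
    return target_ms + timer_period_ms

for period_ms in (1.0, 0.5):
    worst = worst_case_frame_time(8.33, period_ms)  # ~120 fps cap
    print(f"{period_ms} ms timer -> up to {worst:.2f} ms per frame")
```

This is only a back-of-envelope bound; real limiters often spin for the last fraction of a millisecond, which is exactly why a finer timer reduces cap jitter.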
Frametimes on both OSes are butter-smooth throughout the whole course I run each and every time, identical every run (8-12 ms consistently for a few minutes); I've eliminated all stutter and shader precompiling.
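For reference, converting those frametimes to framerates:

```python
# Frametime (ms) to fps, for the measured 8-12 ms range.
for ft_ms in (8, 12):
    print(f"{ft_ms} ms -> {1000 / ft_ms:.0f} fps")
# 8 ms -> 125 fps, 12 ms -> 83 fps
```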
Some things I've tested (currently on 1809 LTSC, only essential services running, pretty much all drivers MS in-box) which do not affect it:
- FSO (Fullscreen Optimizations): verified and working. Shouldn't actually have any impact versus 'real' exclusive fullscreen.
- Game Mode: can't feel a difference. No clue whether the game I test with atm actually benefits from it or not. I'm sceptical about this one to begin with: how can a mechanism reduce input latency that much, unless it's compensating for all the background junk running in the default installation of most Windows 10 SKUs? Much is unclear in this aspect.
- Same Nvidia driver, the one I found least problematic on either OS with the majority of my games; I did also try 2 other branches, no effect.
- DisableDynamicTick: assuming this does anything at all, it would probably affect jitter more than latency, but no idea.
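For anyone wanting to try that last one: the dynamic-tick toggle is a boot-configuration option, set from an elevated command prompt and applied after a reboot:

```
bcdedit /set disabledynamictick yes
    (revert with: bcdedit /deletevalue disabledynamictick)
```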
Any other Microsoft fluff like GameDVR (the injection overlay, anyway), Focus Assist, and you name it is non-existent in these custom builds I use. Just as close to old-school W7 as you can possibly make it.
I know MS changes up stuff all the time, even the system timers keep getting changed continuously, so I'll have to try 1607 LTSB again and see how that fares. Any ideas? Sadly I have no millisecond-accurate capture hardware, and I don't think my 240 fps camera is accurate enough.
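A quick sanity check on the camera idea (assuming latency would be read off by counting camera frames between moving the mouse and seeing the on-screen reaction): a single reading is only accurate to about one frame interval, which at 240 fps is close to the 5 ms difference in question, though averaging many readings helps.

```python
import math

camera_fps = 240
frame_interval_ms = 1000 / camera_fps  # time between captured frames
print(f"frame interval: {frame_interval_ms:.2f} ms")  # ~4.17 ms

# One reading is uncertain by roughly a frame interval; averaging N
# independent readings shrinks the random error by about sqrt(N).
n_samples = 30
averaged_error_ms = frame_interval_ms / math.sqrt(n_samples)
print(f"error after {n_samples} samples: ~{averaged_error_ms:.2f} ms")
```

So a single 240 fps clip can't resolve a 5 ms gap, but a few dozen repeated measurements might get the noise below 1 ms.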
Gotta try:
- Interrupt steering
- Fiddling around with certain HID devices, though other than my mouse and keyboard all my other peripherals are on a different USB controller that is disabled by default.
I'll have to figure out other stuff to test later.
Both OSes report roughly 2 to 20 µs DPC latency at max core clock when idle.
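Putting that number next to the perceived difference (using the figures above):

```python
# Worst measured idle DPC latency vs the smallest perceived input-lag gap.
dpc_worst_us = 20      # microseconds, from the idle measurement
perceived_gap_ms = 5   # milliseconds, lower end of the felt difference

ratio = (perceived_gap_ms * 1000) / dpc_worst_us
print(f"perceived gap is ~{ratio:.0f}x the worst idle DPC latency")
# -> 250x, so idle DPC latency alone is far too small to explain a
#    5-10 ms difference (though DPC behavior under load may differ).
```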