Blur Busters Forums


Comparing bare-bones Win7 vs 10 install

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.


Postby MT_ » 02 Feb 2019, 20:41

Been running two barebones Windows installations for a while now and doing all kinds of testing.

Stripping them of useless junk and/or disabling whatever can be disabled without affecting said games (just for the sake of consistency, and to keep useless background stuff from going on).

I can sense a difference of anywhere from 5 to 10ms of input lag (either that, or mouse smoothing going on) when tracking in-game, and so far Windows 7 has always seemed to come out on top when it comes to input latency. Now for flicks and trigger reactions, I clearly cannot tell the difference at such a small number. But when it comes to tracking in games and keeping a laser pinpointed on a specific pixel area whilst moving, the delay becomes quite noticeable.

All testing is consistently done with an in-game cap, fullscreen, and G-SYNC without V-SYNC (I don't mind the small tearing, and I know why I prefer it: the frames above the tear get shown on screen a tad faster this way. The CryEngine game I play has a quite consistent FPS cap, and it can be further tweaked by changing the timer from 1 to 0.5ms, or vice versa).

Frametimes on both OSes are butter-smooth throughout the whole parkour course I run each and every time, the same every time
(8-12ms consistently for a few minutes), having eliminated all stutter and shader precompiling.

Some things I've tested (currently 1809 LTSC, only essential services running, pretty much all drivers MS in-box), which do not affect it:

- FSO (Fullscreen Optimizations), verified and working. Should not actually have any impact vs 'real' fullscreen.
- Game Mode; can't feel a difference. No clue whether the game I'm testing with atm actually benefits from it or not. I'm sceptical about this one to begin with: how can a mechanism reduce input latency that much, unless it's there to compensate for all the background junk going on in the majority of Windows 10 SKUs with their default installation? Much is unclear in this respect.
- The same Nvidia driver, which I found least problematic on either OS with the majority of my games; I did also try 2 other branches, with no effect.
- DisableDynamicTick, which, assuming it works at all, would probably affect jitter more than latency, but no idea.

Any other Microsoft fluff like GameDVR (the injection overlay, anyway), Focus Assist, and you name it is non-existent in these custom builds I use. Just as close to old-school W7 as you can possibly make it.

I know MS changes stuff up all the time; even the system timers keep getting changed continuously. I'll have to try 1607 LTSB again and see how that fares. Any ideas? Sadly I have no millisecond-accurate capture hardware, and I don't think my 240fps camera is accurate enough.

Gotta try:
- Interrupt steering
- Fiddling around with certain HID devices; but other than my mouse and keyboard, all my other peripherals are on a different USB controller which is disabled by default.

I'll have to figure out other stuff to test later.

Both OSes report roughly 2 to 20us DPC latency at max core clock while idle.
MT_
 
Posts: 42
Joined: 17 Jan 2017, 15:39

Re: Comparing bare-bones Win7 vs 10 install

Postby i fast reaction » 03 Feb 2019, 21:23

barebones win7 is more responsive than even a stripped 1809 LTSC


win10 1809 currently has some stupid new timer functionality, here is what you should do to fix it + responsiveness (might worsen latency on Ryzen):

hpet disabled in bios
bcdedit /set useplatformclock true
bcdedit /set disabledynamictick true
bcdedit /set tscsyncpolicy Legacy (you can try Enhanced too if you want)

forces use of the CPU's timestamp counter

/// might not matter with useplatformclock on (or at all anyway), but this is what my setup is:

bcdedit /set x2apicpolicy Disable
bcdedit /set uselegacyapicmode Yes
disabled the programmable interrupt controller in devmgmt.msc


1809 seems to default to x2APIC instead of what was used in the past (LAPIC)

interestingly, disabling x2APIC and enabling legacy APIC with the above commands, with useplatformclock off, will still use x2APIC



i've never ever ever ever gotten any install of windows 10 to be more responsive than 7.


pm me if you want to join my discord with a few friends where we talk about this stuff
i fast reaction
 
Posts: 6
Joined: 25 Jan 2019, 03:27

Re: Comparing bare-bones Win7 vs 10 install

Postby MatrixQW » 04 Feb 2019, 06:03

https://forums.blurbusters.com/viewtopic.php?f=10&t=4936
Give it a try with both systems if you don't have equipment.

You have to differentiate performance from input lag.
Most things you touch won't affect input lag; drivers can, especially the video card's.
Doing tests by feel can be misleading; placebo is dangerous and can drive you insane.
MatrixQW
 
Posts: 99
Joined: 07 Jan 2019, 10:01

Re: Comparing bare-bones Win7 vs 10 install

Postby i fast reaction » 04 Feb 2019, 07:44

MatrixQW wrote:https://forums.blurbusters.com/viewtopic.php?f=10&t=4936
Give it a try with both systems if you don't have equipment.

You have to differentiate performance from input lag.
Most things you touch won't affect input lag; drivers can, especially the video card's.
Doing tests by feel can be misleading; placebo is dangerous and can drive you insane.



I'd like to stop you here, as I'm rather sick of the flood of input lag tests done by first-on-screen reaction. While the number of milliseconds the original poster claimed to feel between the operating systems is rather high, and would easily be picked up by this type of test on a high refresh rate monitor, I'd like to address this reply nonetheless.

This is not a valid way of testing for this type of system or I/O latency.
This is not a valid way of testing for this type of system or I/O latency.
This is not a valid way of testing for this type of system or I/O latency.

This is only good for testing exactly what the test is for: seeing how long it takes for one single input to become visible through an entire chain of input AND output at a SINGLE POINT in time. Yes, the test can be repeated for more data points, but that still does not make it representative of what we feel during system usage.

Our own hands and mind are more accurate at feeling these things than a test of a single input appearing on a screen one time. Much of what we describe as "input lag" is other traffic on the system/processor pushing our device interrupts around, resulting in our human movements not being represented in-game as close to "real time" as possible.

----

The accurate way to test for what the original poster and I are "feeling" would be to set up a system that can accurately send input polls at a static frequency OVER TIME and record the screen with a high speed camera while having something in the system set up as a "line zero" to compare against.

That was a butchered explanation, so I will lay out an overthought idea in better detail:

- A low-latency game engine set up with a blank map containing nothing but vertical lines, spaced apart, surrounding the "player"/first-person camera. These lines would be placed where the input should line up on a zero-latency system, according to the imitated mouse speed of the device below.

- An input device set to imitate mouse movement along a flat axis at a static speed, polling the computer at a set frequency.

- A high-speed camera recording the screen (preferably a very high refresh rate display for more accurate data, e.g. 480Hz; resolution may be an issue).

- The frames from the recording are analyzed, and system latency is determined by how far the crosshair/line/first-person view's center point falls behind the line where it should be as it spins around, or by measuring the amount of time it takes to fall one line back.


This is a (probably more complex than necessary) way of testing for system latency externally. The design could most certainly be simplified. Other factors would have to be taken into account, one example: memory timings/sub-timings in the BIOS should all be set manually and settings to prevent memory retraining on boot should be enabled.

The goal would be to have an accurate way to reproduce input and measure the effects of changing a setting over time as this is the latency we feel. Not one singular input appearing once on a display that has many milliseconds of latency anyway. The display should be as little a factor as possible in testing for system latency, only a means to view the latency graphically.
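As a rough illustration of the arithmetic in that last analysis step, here is a hypothetical sketch (all function and variable names are mine, not from any real tool): given per-frame view angles extracted from the camera footage and the known constant speed of the imitated mouse, the angular error behind the zero-latency reference line converts directly into milliseconds of latency.

```python
# Hypothetical sketch: recover system latency from high-speed-camera frames.
# Assumes a constant simulated mouse spin (deg/s) and a measured view angle
# per frame, extracted from the recording. Names are illustrative only.

def estimate_latency_ms(frame_times_s, measured_deg, speed_deg_per_s):
    """Average lag between where the view IS and where a zero-latency
    system WOULD be, expressed in milliseconds."""
    lags = []
    for t, measured in zip(frame_times_s, measured_deg):
        expected = speed_deg_per_s * t        # zero-latency reference line
        angular_error = expected - measured   # how far the view trails behind
        lags.append(1000.0 * angular_error / speed_deg_per_s)
    return sum(lags) / len(lags)

# Toy run: a view trailing a 90 deg/s spin by a constant 0.9 degrees -> 10ms.
times = [0.1, 0.2, 0.3]
measured = [90 * t - 0.9 for t in times]
print(round(estimate_latency_ms(times, measured, 90), 2))  # 10.0
```

The same per-frame errors could also be plotted over time, which would expose jitter (a wobbling error) separately from constant latency (a flat offset).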

----

This is not placebo. Please do not pass these things off as placebo where it's obvious a person has gone to the effort to create two test environments and can very well feel the difference between the two. An operating system and the software environment a person plays a game in is just as important to a competitive player as the mouse and mouse surface they use. Saying a change to this environment is placebo is akin to telling a figure skater skating with different blades that any change they feel between the two pairs is "just placebo".




System latency in modern operating systems, as a result of more background services running, probable changes to the OS core, and whatever else cannot be disabled, will drive me insane. Not placebo. I do not believe in placebo where a system can be dissected and analyzed. Even if we do not have the tools to measure what we feel, we can still understand what the changes we make to our computers do, and how they likely affect device input or other latency.


Thanks. I may edit this slightly. It's messy, sorry.
i fast reaction
 
Posts: 6
Joined: 25 Jan 2019, 03:27

Re: Comparing bare-bones Win7 vs 10 install

Postby MT_ » 04 Feb 2019, 09:10

i fast reaction wrote:barebones win7 is more responsive than even a stripped 1809 LTSC


win10 1809 currently has some stupid new timer functionality, here is what you should do to fix it + responsiveness (might worsen latency on Ryzen):

hpet disabled in bios
bcdedit /set useplatformclock true
bcdedit /set disabledynamictick true
bcdedit /set tscsyncpolicy Legacy (you can try Enhanced too if you want)

forces use of the CPU's timestamp counter

/// might not matter with useplatformclock on (or at all anyway), but this is what my setup is:

bcdedit /set x2apicpolicy Disable
bcdedit /set uselegacyapicmode Yes
disabled the programmable interrupt controller in devmgmt.msc


1809 seems to default to x2APIC instead of what was used in the past (LAPIC)

interestingly, disabling x2APIC and enabling legacy APIC with the above commands, with useplatformclock off, will still use x2APIC



i've never ever ever ever gotten any install of windows 10 to be more responsive than 7.


pm me if you want to join my discord with a few friends where we talk about this stuff


Thx! Might do!

Ya, my prebuilt ISOs actually have those bcdedit entries by default, not really sure if it matters. Haven't tried the APIC stuff!

Like many newer (gaming) mainboards, mine can no longer forcibly disable HPET in the BIOS (at least not my Z270G Gaming), although I think W10 uses the invariant TSC by default. Last time, forcing HPET slowed my Windows to a crawl; I think HPET is bugged on the Kaby Lake platform to begin with.

Tried LTSB 2016 (again -.-) and it comes close, yet not quite. Perhaps I'm just looking at differences in mouse input between both OSes, and the actual rendering chain is just about the same in latency.

One thing is for sure: in pretty much all CPU-bound games I've tested, 7 always outdid 10 in max FPS. Usually by 1-5%, and I secretly hoped this was for a reason (CPU reservation or otherwise), to guarantee something.

The only reason I want to run 10 to begin with is security for the next 10 years on the long-term branch, and having 1 OS for everything, as some stuff is starting to become exclusive/DX12.

I even got a slimmed build with the Store working, where the 'crap' services only ignite when starting the Store. Doesn't matter for lag though. It has nothing to do with background services, well, not the stuff I'm after. It's way closer to the core of the OS.

I can't live on an OS which has objectively (to me) worse mouse tracking and more 'junk' in between my mouse and the actual visual response.

I don't believe what I'm after is of any relevance to a CSGO twitch-response player and possibly human reaction times, but tracking is a whole different beast altogether.

Sorry for the ramble, it just had to get out.
MT_
 
Posts: 42
Joined: 17 Jan 2017, 15:39

Re: Comparing bare-bones Win7 vs 10 install

Postby MatrixQW » 04 Feb 2019, 12:39

For how you should do the input lag test, I will leave that answer to Chief or someone else.

i fast reaction wrote:This is not placebo. Please do not pass these things off as placebo where it's obvious a person has gone to the effort to create two test environments and can very well feel the difference between the two. An operating system and the software environment a person plays a game in is just as important to a competitive player as the mouse and mouse surface they use. Saying a change to this environment is placebo is akin to telling a figure skater skating with different blades that any change they feel between the two pairs is "just placebo".

I didn't mean there isn't a difference between the two systems. I do not know.
How do you know it's the system and not something hardware or driver related?
I was referring to changes made to the system.

When people start disabling things, applying tweaks, and changing settings without understanding what they are for or how they work, they usually come away with wrong impressions and conclusions, thinking it got better when it actually does nothing or even makes things worse.
I said it can be misleading, not that it actually is.
When someone believes a certain game setting improves things, a blind test will show whether it's placebo or not.
9 correct out of 10 attempts shows someone is right.
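To put a number on that: under the null hypothesis of pure guessing, each blind attempt is a 50/50 coin flip, so the chance of scoring 9 or more out of 10 by luck alone falls straight out of the binomial distribution. A quick standard-library sketch:

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Probability of k or more successes in n trials by pure chance."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
               for i in range(k, n + 1))

# 9-or-better out of 10 blind trials happens by guessing ~1% of the time.
print(round(p_at_least(9, 10), 4))  # 0.0107
```

So 9/10 is strong evidence the difference is real, while 6/10 or 7/10 would still be well within coin-flip territory.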
MatrixQW
 
Posts: 99
Joined: 07 Jan 2019, 10:01

Re: Comparing bare-bones Win7 vs 10 install

Postby Chief Blur Buster » 04 Feb 2019, 19:55

MatrixQW wrote:When people start disabling things, applying tweaks, and changing settings without understanding what they are for or how they work, they usually come away with wrong impressions and conclusions, thinking it got better when it actually does nothing or even makes things worse.

It is indeed a big risk. So that's why I wanna see yummy, delicious data.

If anyone is using RTSS or another statistics recorder to track the effects in a controlled manner, I'd love to see graphs with repeated A/B tests (A->B->A->B, and/or blind tests): seeing things like increases in 0.1% frametimes, increases in the standard deviation of framerate fluctuation, or other variables that change reliably with the toggling ON/OFF of various settings. If anyone is willing to do these kinds of tests in such depth, I'm interested in poring over those charts.
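For anyone logging frametimes with RTSS (or any tool that can dump per-frame data), the metrics above are easy to compute offline. A minimal sketch, assuming you already have the frametimes as a list of milliseconds; the function and key names here are illustrative, not any RTSS API:

```python
import math
import statistics

def frametime_stats(frametimes_ms):
    """Mean, population standard deviation, and 99.9th-percentile frametime
    (the '0.1% low FPS' figure, expressed as a frametime)."""
    ordered = sorted(frametimes_ms)
    # Nearest-rank 99.9th percentile: the frametime only 0.1% of frames exceed.
    idx = min(len(ordered) - 1, math.ceil(0.999 * len(ordered)) - 1)
    return {
        "mean_ms": statistics.fmean(frametimes_ms),
        "stdev_ms": statistics.pstdev(frametimes_ms),
        "p99_9_ms": ordered[idx],
    }

# Toy run: nine smooth 10ms frames plus one 30ms spike.
print(frametime_stats([10.0] * 9 + [30.0]))
# {'mean_ms': 12.0, 'stdev_ms': 6.0, 'p99_9_ms': 30.0}
```

An A/B/A/B comparison is then just running the same in-game route once per setting and comparing these three numbers across the logs.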

Sometimes single-millisecond effects have a way of reliably cascading into something bigger (a simplified example is those "0.5ms misses of VSYNC" cascading into an extra 16.7ms of frametime during VSYNC ON). There are still many under-researched weak links being revealed in the Refresh Rate Race to Retina Refresh Rates.

Blur Busters welcomes this kind of exploration of the weak links. More surprises may be lurking under the hood that need to be experimentally revealed as signal above the noise. Ninety percent of the time it might indeed be placebo; then the remaining ten percent surprises unexpectedly as yet another "Blur Busters Millisecond Surprise".

Keep exploring,
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors
User avatar
Chief Blur Buster
Site Admin
 
Posts: 5839
Joined: 05 Dec 2013, 15:44

Re: Comparing bare-bones Win7 vs 10 install

Postby Notty_PT » 04 Feb 2019, 22:02

Well, I always knew mouse response/input is totally different in Windows 7 and 8.1 (not only 7) compared to 10. But I never imagined 10 had more latency by itself. Interesting :)
Notty_PT
 
Posts: 297
Joined: 09 Aug 2017, 02:50

Re: Comparing bare-bones Win7 vs 10 install

Postby MT_ » 06 Feb 2019, 09:01

MatrixQW wrote:For how you should do the input lag test, i will leave that answer to Chief or someone else.

i fast reaction wrote:This is not placebo. Please do not pass these things off as placebo where it's obvious a person has gone to the effort to create two test environments and can very well feel the difference between the two. An operating system and the software environment a person plays a game in is just as important to a competitive player as the mouse and mouse surface they use. Saying a change to this environment is placebo is akin to telling a figure skater skating with different blades that any change they feel between the two pairs is "just placebo".

I didn't mean there isn't a difference between the two systems. I do not know.
How do you know it's the system and not something hardware or driver related?
I was referring to changes made to the system.

When people start disabling things, applying tweaks, and changing settings without understanding what they are for or how they work, they usually come away with wrong impressions and conclusions, thinking it got better when it actually does nothing or even makes things worse.
I said it can be misleading, not that it actually is.
When someone believes a certain game setting improves things, a blind test will show whether it's placebo or not.
9 correct out of 10 attempts shows someone is right.


One relief in that respect is the fact that the OS is mostly closed source, so only a limited amount of 'tweaking' can be done on it!

Well, after months I give up. I'm gonna stick with W7 and pray that one day W10 will mature enough (and find a damn solution for forced windowed V-SYNC, another reason to avoid the OS imho for older games like CSGO).

It's just frustrating, as one expects a new OS to be better than the previous one, or at least equal. But I guess it's just too complex. I keep asking myself: what really improved, if my CPU-bound games even win on an older OS? :-(

If there's a roughly 5% performance diff, that might very well correspond to part of the rendering/input chain code having been altered.
MT_
 
Posts: 42
Joined: 17 Jan 2017, 15:39

Re: Comparing bare-bones Win7 vs 10 install

Postby MatrixQW » 06 Feb 2019, 15:10

Chief Blur Buster wrote:If anyone is willing to do these kinds of tests in such depth, I'm interested in poring over those charts.

I would if I had the tools.
'I fast reaction' could volunteer; he seems to have it all covered. If he is willing, of course.
I think QuakeWorld could be a good game for these tests. It has a built-in server, so ping is 0. And you can create the environment you need with this game.

MT_ wrote:One relief of that aspect is the fact that the OS is mostly closed source, and only a limited amount of 'tweaking' can be done on it!

Most things people change just affect CPU overhead and the impact is minimal; only too much of it can cause a spike which will be noticeable. I really doubt it can increase input lag.
You shouldn't need to touch system timers, for example. Windows decides which one to use based on the hardware you have.
'I fast reaction' recommends disabling HPET in the BIOS but then enables it in the system. Makes no sense.
Last time I checked, Windows used HPET+TSC at boot and then only the TSC in the system. So you don't need to disable it in the BIOS, and you shouldn't enable it in the OS as the only timer.
Games/applications that use QueryPerformanceCounter could get reduced performance if HPET is enabled in the OS.
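One cheap sanity check along these lines: Python's `time.perf_counter_ns()` wraps QueryPerformanceCounter on Windows, so you can probe the effective tick granularity of whichever timer the OS has chosen without any special tools. This is a rough sketch, not a substitute for a proper latency measurement; the function name is mine:

```python
import time

def smallest_observed_tick_ns(samples=200_000):
    """Smallest nonzero step seen in the high-resolution counter,
    a rough lower bound on the active timer's granularity."""
    best = None
    prev = time.perf_counter_ns()
    for _ in range(samples):
        now = time.perf_counter_ns()
        if now != prev:
            step = now - prev
            if best is None or step < best:
                best = step
            prev = now
    return best

print(smallest_observed_tick_ns())  # system-dependent; smaller = finer timer
```

Comparing this number before and after a `bcdedit` timer change (TSC-backed systems typically show far finer steps than HPET-backed ones) at least confirms the setting took effect.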

MT_ wrote:Well, after months I give up. I'm gonna stick W7 and pray one day W10 will mature enough (And find damn solution for forced windowed vsync, another reason to avoid the OS imho for older games like CSGO)

Why don't you play in true fullscreen?
Doesn't the input lag come from that windowed mode?
There seem to be a lot of CSGO players. Do they play on W7 too?
MatrixQW
 
Posts: 99
Joined: 07 Jan 2019, 10:01
