Is the following test method for input lag valid?

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
deama
Posts: 370
Joined: 07 Aug 2019, 12:00

Is the following test method for input lag valid?

Post by deama » 13 Jun 2020, 23:46

So basically I'd like to start doing my own input lag tests, and I want to test the entire pipeline: from the moment the user presses a key to the moment it registers on screen.

So I was thinking of doing the following: get the highest-Hz monitor available, so I guess 240Hz for now, and get a high-speed camera that runs at around 1000fps.
I would then take an LED, solder it to a wire, and stick the wire on a key on a keyboard. Then I'd make a finger glove that connects to a battery and has a contact exposed at the end, so if I put the glove on and press the key with the LED on it, the LED lights up.
So essentially I would film myself pressing the key, and from the moment the LED lit, I would count how long it takes for an action to register on screen.

Does that sound valid? Or is there a problem with the above method? I'd like to get a higher-Hz monitor eventually, but for now 240Hz should be enough for starters.

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: Is the following test method for input lag valid?

Post by Sparky » 14 Jun 2020, 03:59

The problem with this method is that it's a human pressing the button. You need something more repeatable, like an object falling from a consistent height.

Kulagin
Posts: 37
Joined: 27 Feb 2016, 08:17

Re: Is the following test method for input lag valid?

Post by Kulagin » 14 Jun 2020, 05:30

deama wrote:
13 Jun 2020, 23:46
I would then get an LED, solder it to a wire, and then stick the wire on a key on a keyboard, then I would basically make a finger glove that connects to a battery and has a connection exposed at the end, so if I put the finger glove on and press the key with the LED on it, the LED would light up.
So essentially I would film myself pressing the key, and the moment the LED lit, I would start counting how long it would take to register an action on screen.

Does that sound valid? Or is there a problem in the above method?
Not really, because you'd power the LED before you actually press the button: the LED would light the moment you touch the key. That introduces a variable that affects your results, namely how fast you actually press the button. It's only a few milliseconds, but when at 240 Hz and 500+ FPS you're already down to ~10ms total latency, do you really want a ±3ms tolerance on top of that? So that without changing anything, your average is sometimes 10ms, sometimes 13ms, and sometimes 16ms? No, you don't want that variable.

So what you want to do is solder your button and LED in series: solder the anode to one of the button's legs, and solder the cathode to the pad on the board where the button's pin normally goes. This way, when you press the button, current flows through the button and the LED (so the LED lights up), and that is the moment you want to consider the button pressed.

Also, there's a much better method than this: automated tests using an Arduino:
[video]


What the Arduino does is send a button input over USB, usually LMB down, and record the time when it was pressed. At the start, the player is looking at a dark (black) spot, which the Arduino's photosensors observe. The game is configured so that when the button is pressed, the player's viewing angle snaps so that the area the Arduino observes becomes light (white). The Arduino records the time when the area became light and then calculates the time between the events: timeWhenAreaBecameLight - timeWhenButtonPressWasSent. Then it sends the data to the PC over a COM port. That's one test. You do 1000 of those, which takes less than 2 minutes. With a camera it would take you a week to make 1000 tests, or maybe even a month!
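To make the idea concrete, here's a minimal C++ sketch of that timing logic (my own illustration, not the actual firmware; the names `Reading`, `latencyUs` and the threshold are made up, and real firmware would use the board's micros clock instead of pre-recorded samples):

```cpp
#include <cstdint>
#include <vector>

// Illustration of the measurement: the board records the time it sends
// the click (sentUs), polls the photosensor until a reading crosses a
// "light" threshold, and reports litTime - sentUs over the serial port.

// One polled sensor reading with its timestamp in microseconds.
struct Reading {
    uint32_t timeUs;
    int value;
};

// Latency in microseconds, or -1 if the area never became light.
long latencyUs(uint32_t sentUs, const std::vector<Reading>& readings,
               int lightThreshold) {
    for (const Reading& r : readings)
        if (r.value >= lightThreshold)
            return static_cast<long>(r.timeUs) - static_cast<long>(sentUs);
    return -1;
}
```

So if the click went out at t=1000µs and the first bright reading arrives at t=11500µs, the reported latency is 10500µs, i.e. 10.5ms.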

The Arduino doesn't just work at 1000 Hz or 10000 Hz like your camera would. An Arduino Leonardo runs at 16 MHz, and you can go much higher than that with another model. Timing tolerance on the Arduino is ±2µs, and you can get as low as ±0.25µs. Compare that to the 1000 FPS camera, which is ±0.5ms: at 0.25µs, that's a 2000 times smaller tolerance. The tests are also automated: you press one button and it performs the test, records everything, calculates everything, and writes it down for you in an easily readable format.
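The "calculates everything" step is just basic statistics over the recorded samples. A small self-contained C++ sketch of what that summary might look like (again an illustration, not the tester's actual code):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Illustration: summarize a batch of latency samples (in ms) the way an
// automated tester might report them: mean, min, max, standard deviation.
struct Summary {
    double mean, min, max, stdev;
};

Summary summarize(const std::vector<double>& samplesMs) {
    double sum = 0.0;
    for (double s : samplesMs) sum += s;
    const double mean = sum / samplesMs.size();

    double var = 0.0; // population variance
    for (double s : samplesMs) var += (s - mean) * (s - mean);
    var /= samplesMs.size();

    return { mean,
             *std::min_element(samplesMs.begin(), samplesMs.end()),
             *std::max_element(samplesMs.begin(), samplesMs.end()),
             std::sqrt(var) };
}
```

For example, the three averages from the LED discussion above, {10, 13, 16} ms, summarize to mean 13ms with a spread of about ±2.4ms, which is exactly the kind of scatter you want the automated rig to eliminate.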

Flood made a map for CS:GO for this, called map_flood:
[video]


I use this config for my tests:

Code:

bind "mouse1" "+lagTest"

alias +lagTest "setang 1 1"
alias -lagTest "setang 1 179"

cl_showfps "0"
cl_showpos "0"
fps_max "500"
Tests look like this, except there are photoresistors on the monitor measuring light:
[video]


I do 1000 measurements per thing I test. 1000 measurements is enough, and it's super precise: I recently tested USB polling rates, and it can spot the difference between 1000 Hz, 2000 Hz, and 8000 Hz polling rates, which is sub-1ms! Here are my results:
[image]

I made one in UE4 to test UE4-related things like r.OneFrameThreadLag, r.FinishCurrentFrame, and many other vars:
[image]
https://github.com/KulaGGin/Lag-Tester-Shipment

If you decide to build this kind of tester: that Rocket Science guy has a Discord server with a whole channel dedicated to his input lag tests and to building this kind of tester, with electronics guys who can help with building the circuit, programming the Arduino, flashing it, and assembling it all together. So you don't need to be an electronics and programming professional to build this device on your own from scratch: the circuit schematics and code are already done, you just have to assemble the thing and flash your Arduino.
