Haste wrote:
btw jorimt, what's your take on that video from LinusTechTips that has almost 800K views?
https://www.youtube.com/watch?v=MzHxhjcE0eQ
His results are really weird.
Any idea what happened there?

Sparky wrote:
What happened there is that Linus used a very tedious and time-consuming method of testing, which hurt the sample size (he had what, 4 samples per test?). He also didn't implement a proper control (v-sync off, and a CRT as a comparison point on both GPUs, or at least SOME reference monitor that was tested on both systems). Then there's the question of how he controlled for framerate.

Exactly, no control group whatsoever.
Take the v-sync off 45 fps results, for example: Nvidia went 73, 73, 72, 73, right? Looks super consistent, right? Well, no. That right there is proof the sample size is far too small, because at that framerate you get plus or minus 11 ms of latency from frame timing alone: a frame at 45 fps takes ~22 ms, so depending on where within the frame your input lands, latency can swing by half a frame in either direction. Four near-identical readings from a distribution that wide is luck, not consistency.
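To make that concrete, here's a minimal Monte Carlo sketch in Python (purely illustrative, not Linus's method; the 60 ms base latency is an assumed figure, not anything measured) of how much 4-shot averages can swing when v-sync off at 45 fps adds up to one frame of uniform timing jitter:

```python
# Hypothetical sketch: v-sync off at 45 fps adds up to one frame
# (1000/45 ≈ 22.2 ms) of uniform timing jitter on top of a fixed
# base latency, so small-sample averages swing widely.
import random

BASE_LATENCY_MS = 60.0          # assumed fixed pipeline latency (illustrative)
FRAME_TIME_MS = 1000.0 / 45.0   # ~22.2 ms per frame at 45 fps

def one_shot():
    # A shot lands at a random point within the current frame, so the extra
    # wait is uniform in [0, frame time), i.e. roughly ±11 ms around the mean.
    return BASE_LATENCY_MS + random.uniform(0.0, FRAME_TIME_MS)

def run_trial(samples):
    return sum(one_shot() for _ in range(samples)) / samples

random.seed(1)
for samples in (4, 50):
    means = [run_trial(samples) for _ in range(1000)]
    print(f"{samples:3d} samples: mean estimates span "
          f"{min(means):.1f} to {max(means):.1f} ms across 1000 trials")
```

The 4-sample means scatter several times more widely than the 50-sample ones, which is exactly why a run like 73, 73, 72, 73 proves nothing on its own.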
Then there was the time they tried to test latency on the Steam Link: http://forums.blurbusters.com/viewtopic.php?f=15&t=2831
They were aiming to test input latency, but what they actually recorded was the difference between the system and monitor configurations. AMD and Nvidia cards have very different frametime performance, for instance.
Pushing each system to its limit at 45 fps in Crysis 3 effectively meant that whenever frametime varied, each system had nowhere to go but down. They should have tested with CS:GO or the like, so they could sustain a far higher framerate and cap it at the desired limit.
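For illustration, here's a bare-bones frame-limiter sketch in Python (a stand-in for what an actual in-game cap does; the 10 ms render time is made up) showing why capping well below what the system can sustain keeps frame pacing stable:

```python
# Sketch of a frame limiter: when the cap sits well below what the system
# can sustain, a slow frame just eats into idle time instead of pushing
# the framerate (and latency) down.
import time

CAP_FPS = 45
FRAME_BUDGET = 1.0 / CAP_FPS    # ~22.2 ms per frame

def render_frame():
    time.sleep(0.010)  # pretend rendering takes 10 ms, well under budget

next_deadline = time.perf_counter()
for _ in range(10):
    render_frame()
    next_deadline += FRAME_BUDGET
    sleep_for = next_deadline - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)   # the headroom absorbs frametime variance
    else:
        next_deadline = time.perf_counter()  # missed the deadline: no headroom left
```

Crysis 3 at the systems' limit is the `else` branch every frame: there is no headroom, so every frametime hiccup shows up directly in the measurements.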
What many don't understand is that even with v-sync off at 300+ fps, there are often large input latency variations from shot to shot: one shot can have latency as high as double-buffer v-sync, and the next can be nearly instant. It's the averages over dozens of samples that count, and you must have a control group or the results mean nothing.
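As a back-of-envelope check on "dozens of samples": assuming shot-to-shot latency has a standard deviation of around 6 ms (an assumed figure, roughly in line with the half-frame jitter above), the standard error of the mean tells you how many shots you need before two setups a few milliseconds apart become distinguishable:

```python
# Back-of-envelope sample-size check (the 6 ms spread is an assumption).
import math

PER_SHOT_STD_MS = 6.0   # assumed shot-to-shot latency spread
TARGET_CI_MS = 2.0      # want the 95% CI of the mean within ±2 ms

# 95% CI half-width ≈ 1.96 * std / sqrt(n)  ->  solve for n
n = math.ceil((1.96 * PER_SHOT_STD_MS / TARGET_CI_MS) ** 2)
print(f"~{n} shots needed for a ±{TARGET_CI_MS:.0f} ms confidence interval")
# -> ~35 shots; with only 4 samples the interval is ±5.9 ms, so differences
#    of a few milliseconds between GPUs vanish into the noise.
```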