TestUFO Display Lag Test coming! (SMTT accurate)

Everything about latency: tips, testing methods, mouse lag, display lag, game engine lag, network lag, the whole input lag chain, VSYNC OFF vs VSYNC ON, and more!
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Chief Blur Buster » 26 May 2020, 13:08

flood wrote:
25 May 2020, 11:59
i feel like pretty much all cameras nowadays have rolling shutter.
I have already visualized a rolling-shutter-detection technique (which the TestUFO Lag Tester may enforce).

Just like my inventions of the pursuit camera and the frame skipping detector (which may also be built into the TestUFO lag tester, to more easily discard bad lag numbers), I'm able to invent various kinds of accuracy-verification mechanisms usable by a web browser.

TestUFO is shockingly brilliant because of the accuracy-verification techniques I've invented (including heuristics to self-detect whenever Chrome Browser stutters) -- it is the world's most trusted browser-based display test.

With such a position in Blur Busters comes extreme responsibility in launching a TestUFO lag test that is held to at least the same rigor as SMTT -- including enforcing a rolling-shutter verification test built into the lag test, preferably in the same photograph as the lag test itself.

(so publicly posted photographs can be quickly discarded if they show any one of the following: the scanout-distortion error indicator, the exposure-too-long error indicator, or the frameskipping error indicator ... I intend to build ALL indicators into the SAME photograph! Not an easy engineering feat, but my brain has successfully come up with all the error indicators)

I am able to mentally emulate new TestUFO tests in my brain (just like some people have photographic memory, I have the ability to emulate a motion test in my head long before it has been developed. That's how I invented Sync Track, as well as the optical illusions at www.testufo.com/eyetracking and www.testufo.com/persistence ... amongst other tests)

To protect Blur Busters' reputation,

Blur Busters does not want to pollute the Internet with incorrect latency tests.
There shall be no exceptions.
PERIOD. FULL STOP.


That's why TestUFO is stupendously strict about error-margin-embedding-into-photos (stutter detector indicator, etc) -- it's part of the TestUFO Magic Sauce.
flood wrote:
25 May 2020, 11:59
one solution is to have the camera rolling shutter direction perpendicular to the monitor's scanout direction, and stack the monitors vertically
Not necessary!

Simply putting the monitors side by side, photographing landscape, and backing away from the monitor so that both monitors only fill 1/4th the height of the photograph, allows a 1/240sec rolling shutter to become a ~1/1000sec rolling shutter. That's because it takes 1/4th the time to rolling-shutter 1/4th picture height.
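The "back away" trick above is simple proportionality, sketched here in Python (the numbers are illustrative, not from any specific camera):

```python
# Hypothetical illustration of the "back away from the monitors" trick.
# A camera whose rolling shutter sweeps the full sensor height in 1/240 sec
# only needs a proportional fraction of that time to sweep the smaller
# portion of the frame that the monitors occupy.

def effective_readout_time(full_readout_s: float, fraction_of_height: float) -> float:
    """Time the rolling shutter spends traversing the monitors' image."""
    return full_readout_s * fraction_of_height

full = 1 / 240                                     # full-sensor readout time
quarter = effective_readout_time(full, 0.25)       # monitors fill 1/4 of height
print(f"{quarter * 1000:.2f} ms")                  # ~1.04 ms, roughly a 1/1000 sec sweep
```

So a monitor image occupying a quarter of the photo height turns a 1/240 sec rolling shutter into an effective ~1/960 sec sweep across the monitors.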

The scanout error-verifier needs to embed itself into the resulting TestUFO Lag Test photograph -- so if a user photographs too close, it will show. The user can simply back further away from the monitor and photograph again. Voila.

The error margin is measurable, so

IF photo doesn't have browser skipping (existing TestUFO stutter alarm)
AND IF photo doesn't have frameskipping (single-refresh version of existing test)
AND IF photo doesn't have exposure-too-long indicator (new TestUFO error verifier)
AND IF photo doesn't have scanout distortion indicator (new TestUFO error verifier)
THEN TestUFO Lag Test is likely valid (with caveats)
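The IF/AND chain above boils down to a single predicate. A minimal sketch, assuming hypothetical boolean flags (the real TestUFO error indicators are embedded visually in the photograph, not exposed as booleans):

```python
# Hedged sketch of the photo-validity logic described above.
# All flag names are hypothetical stand-ins for TestUFO's visual indicators.

def photo_likely_valid(browser_skipped: bool,
                       frame_skipped: bool,
                       exposure_too_long: bool,
                       scanout_distorted: bool) -> bool:
    """All four error indicators must be clear for a lag photo to count."""
    return not (browser_skipped or frame_skipped
                or exposure_too_long or scanout_distorted)

print(photo_likely_valid(False, False, False, False))  # True: photo accepted (with caveats)
print(photo_likely_valid(False, True, False, False))   # False: frameskip detected, discard
```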

All embedded in the SAME photograph (thanks to my mental ability to emulate TestUFO test in my brain long before I create them) :D

I can then easily dismiss those bad photographs. TestUFO stays trustworthy as a result!

My visualization of a modified frameskip test is a single-frame detector (not a multiframe detector), since the photograph will overlap only 2 refresh cycles, and I just need to verify that they're both adjacent refresh cycles on both displays. This may miss frameskips that occur in non-displayed refreshes for large lag differentials (e.g. monitor A showing fragments of frames #1/#2 and monitor B showing fragments of frames #4/#5), but as long as portions of all contiguous frames (#1/#2/#3/#4) appear, fragmented across either display, a frameskip almost certainly didn't happen -- and repeat photographs will easily reconfirm consistent lag differentials for sample-and-hold displays.
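The contiguity check described above can be sketched as follows (a hypothetical simplification; the real test reads frame counters out of the photographed image):

```python
# Sketch of the single-photo frameskip check: gather the frame numbers
# visible across both displays in one photograph, then verify they form
# a contiguous run (e.g. {1, 2, 3, 4}). A gap in the run means a refresh
# cycle was skipped somewhere between the two exposures.

def frames_contiguous(visible_frames: set[int]) -> bool:
    """True if every frame number between min and max is present."""
    lo, hi = min(visible_frames), max(visible_frames)
    return visible_frames == set(range(lo, hi + 1))

print(frames_contiguous({1, 2, 3, 4}))  # True: no skip apparent
print(frames_contiguous({1, 2, 4, 5}))  # False: frame 3 is missing
```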

There are other error margins, such as latency induced by mirroring (which may be an active repeater operation, or an active GPU driver operation), but most of this can be caught by simply swapping the two physical monitor connections afterwards and re-testing, making sure that the multimonitor numbers correctly invert (2 becomes 1, and 1 becomes 2) when you re-plug before re-mirroring. Typically, most of the "mirroring lag error" can be caught this way.

There are additional black-box limitations, like if you're comparing an HDMI-only display versus a DisplayPort-only display, but even the GPU mirror lag can be pre-verified with a couple of duplicate DP displays or HDMI displays -- the mirror lag is often less than 1ms in many cases nowadays, but more testing is needed to verify whether that's consistent across many high-performance GPUs. We're mainly interested in lag differentials (A versus B), so a fixed mirror lag for both displays is fine, since it becomes an equal lag offset.

This is in addition to my new SMTT-matching algorithm, which allows replicating SMTT with simple VSYNC ON (framerate=Hz motion, rather than 1000fps VSYNC OFF). It's still important to remind users that this is mainly a latency-differential test (A versus B), rather than an absolute-latency test, though a CRT is frequently used as a zero-lag reference to determine absolute lag with a lag-differential algorithm like SMTT.

However, I prefer to release the commercial Blur Busters lag-testing hardware accessory before I release this TestUFO lag test. For now, we use the accessory as part of testing monitors internally for the Blur Busters Approved programme (kind of an internal private beta test).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Thomas Thiemann
Posts: 1
Joined: 03 Jun 2020, 09:24

Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Thomas Thiemann » 03 Jun 2020, 09:57

Hello!

I just found this interesting discussion about a new input lag test. Nice work and I highly encourage the work.
Especially as I stopped development and sales of SMTT many years ago, it's good to keep that topic alive, and it is also a good idea to investigate new ideas.

Just some information:
When cloning displays within the driver, the outputs of the graphics card are not just delayed by a fixed margin. Each monitor performs its own synchronization with the graphics card, so you may see different delays just by "unplugging" and "replugging" a monitor, without swapping anything.
Just taking the average is not the right approach, as these delays have nothing to do with input lag; they are based on the temporal delay with which the data is sent to the monitors (if vsync is enabled).

Theory:
You have two identical monitors -> Your measured difference should be 0.
You take your first measurements: 8 ms lag
You swap the monitors on the outputs and measure again: 2 ms lag.
Taking the average: (8+2)/2 = 5 ms.

Then you get 5 ms instead of 0 ms.

That's a systematic error, as you would most likely see values between 0 and 16 ms on screens with 60 Hz refresh rates and identical lag.
If you have a real input lag difference of 5 ms (still 60 Hz monitors), you will see measurements between 5 and 21 ms.

Just taking an average is not the answer to systematic errors that add on top of the real value. Taking averages is reasonable if you have a normal distribution, i.e. if your measurement itself has some sort of variation *around* the real value.
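Thomas's point can be demonstrated with a quick simulation (assumed model: each re-plug draws a fresh, uniformly random sync phase offset within one 60 Hz refresh period):

```python
# Averaging removes zero-mean noise *around* the true value, but not a
# one-sided systematic offset such as a random sync phase between two
# cloned 60 Hz outputs. Two identical monitors (true difference 0 ms)
# still average out to roughly half a refresh period, not to zero.

import random

random.seed(1)
true_diff_ms = 0.0            # two identical monitors
frame_ms = 1000 / 60          # 60 Hz refresh period, ~16.7 ms

# Each measurement adds a uniform phase offset in [0, frame_ms).
samples = [true_diff_ms + random.uniform(0, frame_ms) for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(f"mean ~ {mean:.1f} ms")   # ~8.3 ms, not the true 0 ms
```

The average converges to half a frame (~8.3 ms), exactly the systematic bias Thomas describes, no matter how many swaps you average.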


There are other effects as well: sometimes you have slight differences in refresh rate. I'm not talking about 60 Hz vs 75 Hz or 60 Hz vs 59 Hz; I'm talking about digits like 1/10th or even 1/100th of 1 Hz.
This results in one monitor "overtaking" the other from time to time. There are quite a few screenshots of oscilloscope measurements in the old SMTT 2.0 article on PRAD.de (the English version seems to be no longer available :-( but the pictures should be enough to understand the issue).
Using two separate outputs of the graphics card and just cloning the screen is not a reliable way to get synchronized output. You get neither synchronized V-Sync pulses nor exactly matching refresh rates.
So you will have to compensate for that, or you will always get higher values than SMTT -- especially if you just do the swap-and-average approach. You will get systematically higher values than the true input lag.
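The scale of that drift is easy to work out. A rough sketch (the refresh rates are illustrative):

```python
# Arithmetic for the refresh-rate drift described above: even a 0.01 Hz
# mismatch between two "identical" cloned outputs makes one monitor
# overtake the other by a full frame on a predictable beat period.

def beat_period_s(hz_a: float, hz_b: float) -> float:
    """Seconds until the two scanouts drift apart by one whole frame."""
    return 1 / abs(hz_a - hz_b)

print(f"{beat_period_s(60.00, 60.01):.0f} s")   # 100 s per frame of drift
print(f"{beat_period_s(60.0, 59.9):.0f} s")     # 10 s per frame of drift
```

So a lag photo taken at a random moment samples a constantly sliding phase offset, which is why unsynchronized cloned outputs need compensation.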

Oh, and if you do the measurements with unsynchronized low-latency monitors but with "v-sync", you may - no, you will - even see negative values from time to time. That's not a wrong measurement, and it's not something to "average out"; it can happen because of delay and/or slightly different framerates, so that you compare old vs. new, new vs. old, or new vs. new frames.


Disclaimer: I do NOT encourage the use of smartphones with SMTT - no matter which orientation.

I'd really like to know how you do the output within the browser. WebGL? And how do you circumvent all the problems with vsync? On Linux you usually can't disable vsync. In all modern versions of Windows (2D), you can't disable it either.
So I am really interested in how you solved this tricky part while using v-sync, as the picture will be rendered at one time and then stored in the output buffer of the graphics card - unchanged - until it has been completely displayed and then replaced by a new picture. There is no update of the output buffer of the graphics card when v-sync is enabled. So I am really curious about the solution and hope to learn some new tricks - even though I won't use them. (Development of SMTT will not start again.)


Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Chief Blur Buster » 30 Jun 2020, 20:34

Thomas Thiemann wrote:
03 Jun 2020, 09:57
I just found this interesting discussion about a new input lag test. Nice work and I highly encourage the work.
Especially as I stopped development and sales on SMTT many years ago it's good to keep that topic alive and it is also a good idea to investigate new ideas.
Welcome, author of SMTT!
It's an honor to have you around. I was able to clone SMTT results using VSYNC ON (framerate=Hz), while still being able to measure sub-millisecond latency differentials.
Thomas Thiemann wrote:
03 Jun 2020, 09:57
Theory:
You have two identical monitors -> Your measured difference should be 0.
You take your first measurements: 8 ms lag
You swap the monitors on the outputs and measure again: 2 ms lag.
Taking the average: (8+2)/2 = 5 ms.
Yes, I have mentioned that quite a few times already -- this is true. Mirroring latency asymmetries are an issue, and that will be a best-practice recommendation.
Thomas Thiemann wrote:
03 Jun 2020, 09:57
There are other effects as well: Sometimes you have slight differences in the refresh rate. I don't talk about 60 Hz vs 75 Hz or 60 Hz vs 59 Hz, I am talking about digits like 1/10th or even 1/100th of 1 Hz.
Indeed, I am very familiar with this. It actually becomes visible in the TestUFO Refresh Rate Test, www.testufo.com/refreshrate
Thomas Thiemann wrote:
03 Jun 2020, 09:57
Using two separate outputs of the graphics card and just cloning the screen is not a reliable way to get synchronized output. You get neither synchronized V-Sync pulses nor exactly matching refresh rates.
This is also part of the reason why I haven't released the SMTT-cloning TestUFO Test.
Thomas Thiemann wrote:
03 Jun 2020, 09:57
So you will have to compensate that or you will always get higher values than SMTT.
The numbers are identical to SMTT! They're not higher.

Unsynchronized is no longer necessary to get SMTT-accurate results, thanks to a new algorithm I've invented.
Thomas Thiemann wrote:
03 Jun 2020, 09:57
I'd really like to know how you do the output within the browser.
It's just plain old VSYNC ON, already standardized in HTML5, but with a bit of proprietary heuristics to self-detect stutters and automatically invalidate bad test results (which improves TestUFO's scientific trust, and has led to peer-reviewed conference papers).

But any DirectX executable can do VSYNC ON too, and can now replicate SMTT numbers with just framerate=Hz without needing 1000fps VSYNC OFF.

What I came up with was an algorithm to clone 1000fps VSYNC OFF latency results with just plain old VSYNC ON at framerate=Hz (60fps for 60Hz, or 120fps for 120Hz, etc), while still getting ~1ms accuracy with a 1/1000sec SLR camera shutter. It doesn't matter whether it's HTML5 or DirectX or OpenGL; my algorithm successfully manages subrefresh accuracy without needing high frame rates.

Tom, I'd rather tell you my algorithm by email, please email mark[at]blurbusters.com
EDIT: I just emailed you at your smtt@ address.

Spencer
Posts: 2
Joined: 09 Jul 2020, 04:33

Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Spencer » 09 Jul 2020, 05:34

Hello, I have a question for you two Thomas and Mark!

Display lag tests among the various websites and now YT channels show a significant distinction:
  • those that report the middle bar, or average of the three, or an equivalent (typical Leo Bodnar or Time Sleuth users, photodiode)
  • those that report the top, or wherever the scan begins at the first drawn pixel or scanline of a frame (typical camera users with timer or SMTT/blurb)
- The commonly invoked justification for reporting only the middle or center measurement is that "the center of the display is where we stare the most".
- Camera-type test users, who pioneered the testing, then only ever compared the actual difference or lag against a lagless reference display (usually a CRT).

My problem is that I feel this first and currently popular habit of focusing on the center of the frame is gamer's bias, which has led most people wondering about display lag to forget, or generally ignore from the start, that this middle/center measurement is a composite figure: the sum of the display's actual processing lag (before or right at the start of the frame), of response time, maybe other minor things, and also a portion of the frame's already-drawn/elapsed time.

So we have websites like rtings or displaylag reporting average or center equivalent, and monitors testing websites like tftcentral and pcmonitors reporting top, only actual difference vs. a lagless reference display.

My problem with that is that it creates a discrepancy that people are not aware of. For instance, it is very common to read users making statements of faith like "flat panel displays always have some lag while CRTs don't, it can't be helped", or "TVs always have some lag left while monitors don't, just look at the measurements!"

Elsewhere I've read about some engineer who's about to produce yet another testing device, and who decided to call the total sum of actual lag + response element + scanout... "true lag". The inevitably much higher figures and terminology people will read about are likely to add some more confusion to an already confused internet opinion on the topic of display lag.

It would be great that the experts who shape people's opinions could clarify for the world the reality of lag testing, and agree on the terminology, while it is still somewhat feasible.
Or are you, gentlemen, alright with the present displays enthusiast demographics already being confused about what actually deserves to be called 'display lag' or 'input lag', and the future guaranteed to become an even more undecipherable mess where popular opinions rule, not science ?

IMHO, the responsibility of at the very least the most influential among you representatives of the hard-science-led side of display tech enthusiasm would be to get together in a conference and streamline the topic, methods, and terminology, so the general public would know how to find knowledge references more reliable than social-media noise and streaming chaos.
Because, you know, these days random YT joes with barely enough knowledge can actually build more influence than actual expert engineers on forums (a format that has unfortunately been rendered practically obsolete by social media), so if one very popular YT streamer ever states that lag is increased by glasses or 5G towers, many will actually believe it, and I don't think I am exaggerating.


TL;DR: Websites, even the most prominent, do not agree on what to call a display's lag or delay. There are obvious discrepancies of several milliseconds between reports, due essentially to some always taking the measurement at the top, and others always at the middle (or averaging three spots), therefore a difference of +/- half a frame. It is splitting "the internet of display lag reports" in two (only two... so far!) and confusing people. Therefore I am asking you experts: what are you going to do about it?

Sorry if this post sounds harsh, but I think this is a real issue that's growing big. I love what you do anyway, of course! :mrgreen:


Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Chief Blur Buster » 09 Jul 2020, 16:38

Spencer wrote:
09 Jul 2020, 05:34
Sorry if this post sounds harsh, but I think this is a real issue that's growing big. I love what you do anyway, of course! :mrgreen:
I am very familiar with this.

VSYNC ON lag test methods (VBI stopwatch start, center photodiode stop, framerate=Hz) are good for measuring a display's suitability for gaming consoles.

VSYNC OFF lag test methods (frame present stopwatch start, or top-edge monitoring, both usually produce identical numbers), are good for measuring a display's suitability for CS:GO esports.

Latency gradients can also change after repeated measurements:

(100 measurements, averaged, for displays with cable scanout syncing to panel scanout)
Non-strobed VSYNC ON can create top < center < bottom
Strobed VSYNC ON can create top = center = bottom
Non-strobed VSYNC OFF can create top = center = bottom
Strobed VSYNC OFF can create top > center > bottom
(Excludes latency-gradient distortions such as Quick Frame Transport)

We are working on a new latency measurement standardization.
Much of my talk about lag measurement standardization is in this thread. At the right & appropriate time, we'll be making some announcements.


Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Spencer » 12 Aug 2020, 04:05

Chief Blur Buster wrote:
09 Jul 2020, 16:38
VSYNC ON lag test methods (VBI stopwatch start, center photodiode stop, framerate=Hz) are good for measuring a display's suitability for gaming consoles.

VSYNC OFF lag test methods (frame present stopwatch start, or top-edge monitoring, both usually produce identical numbers), are good for measuring a display's suitability for CS:GO esports.
Well! :D That is precisely what I do not agree with. Center/middle measuring does not make any sense and misleads people, since that measurement is a composite of both the actual real input lag (the time between the monitor's input and the start of frame drawing) and a portion of the frame.

Gamers'/e-sports cognitive bias ("we mostly look at the center, so we decree this is where lag is") doesn't make any sense technically speaking. Yet this is what most clueless reviewing websites base their lag tests on in reviews, and what they use lag tester devices for; it is all the fault of people following popular opinions instead of pure science, which has led them to misunderstand.

Like the widespread belief that monitors are always less laggy than TVs no matter what, when what they are actually seeing is some websites correctly measuring the real lag at the very beginning of the frame, and others measuring the middle because they've been influenced by gamers' biased opinion.

Gamers do the stupidest things. Playing without vsync is frankly medieval nowadays, your work proves it well enough, yet many, many still play with vsync off and no compensatory method against tearing, just because they hold on to the undying belief that it must always be off, because that's "the aware gamer thing to do".
It's lame. And since when does the earliest drawing of the frame happening in a different place make any more sense, or any rational, measurable difference?
The time between the input and the start of the frame, with the same source signal, only with vsync off, doesn't change the display's properties just because people follow a popular opinion that provides them with placebo benefits.

A display produces a specific amount of delay for a given input X and input signal Y, and it shouldn't be misinformed people's beliefs and biases that get to bend reality at will by stating it is otherwise.

As I said, the main issue is how the internet 'influencers' these masses follow deal with that. People these days only trust popular icons, and if the actually serious reviewers doing the right thing, like tftcentral, pcmonitors and a number of others, are ignored and don't do anything to straighten things up, then the 'fake news' wins.

I've seen people on YT talking tech but refusing to go against the popular opinions, because they are afraid of possible relationship issues with other popular influencers, and of a negative impact on their views.
We live in a pathetic era; the real heroes we need are people who don't kneel to that.

EDIT: There is no 'pc gaming' on one side with one truth and method of measuring, and 'console gaming' on the other with a different set; nor is there any fundamental technology difference that sets TVs apart from monitors. Both can be as lagless as CRTs with the proper design and/or features.
When it comes to measuring the input delay of a display, it's the time from the X type of video input to the first pixel of the Y frame that is drawn that should be measured and called lag or delay; any deviation from that, whether in method of measuring or in terminology, is just extra information or people adding their own bias to the real thing.
Rtings, displaylag, several other websites, and a now vast bunch of YT channels and community influencers, who look only at the center/middle results and call that the input lag of a display, are all wrong, being off by several milliseconds, and people should stand against all that misinformation.
Chief Blur Buster wrote:
09 Jul 2020, 16:38
We are working on a new latency measurement standardization, Strobed VSYNC OFF can create top > center > bottom
Much of my talk about lag measurement standardization is in this thread. At the right & appropriate time, we'll be making some announcements.
Well, thumbs up to that, but as I said, I just hope it'll make things clear with the terminology, so the world doesn't get yet one more mislabelling, and therefore misleading, lag test.
Apologies for the harshness again, but you video tech gurus do not actively fight against a myth that has infected pretty much the majority of people's understanding, and they have accepted it as the flawed truth.
Rtings and displaylag have done the most damage before, and today it's largely YT clowns spreading and reinforcing the myth.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by RealNC » 16 Aug 2020, 05:17

Spencer wrote:
12 Aug 2020, 04:05
Gamers do the stupidest things. Playing without vsync is frankly medieval nowadays, your work proves it well enough, yet many, many still play with vsync off and no compensatory method against tearing, just because they hold on to the undying belief that it must always be off, because that's "the aware gamer thing to do".
It's lame. And since when does the earliest drawing of the frame happening in a different place make any more sense, or any rational, measurable difference?
The time between the input and the start of the frame, with the same source signal, only with vsync off, doesn't change the display's properties just because people follow a popular opinion that provides them with placebo benefits.
Vsync lag has nothing to do with display lag, and it's certainly not placebo. Vsync lag is huge.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.


Re: TestUFO Display Lag Test coming! (SMTT accurate)

Post by Chief Blur Buster » 17 Aug 2020, 12:37

RealNC wrote:
16 Aug 2020, 05:17
Vsync lag has nothing to do with display lag, and it's certainly not placebo. Vsync lag is huge.
Well... it's not as simple as a yes or no.

Sync technology lag is very integrated with display lag. This is especially true with the boom of co-operative behaviors between drivers and monitors, thanks to variable refresh rate technologies.

FreeSync! G-SYNC! VESA Adaptive-Sync! Fast Sync! Enhanced Sync! VSYNC ON! VSYNC OFF! Some of these sync technologies actually co-operate with the monitor more, and treat Present()-to-Photons as one co-operative monolithic block that can't really be easily demarcated -- and the demarcation point can vary (closer to the drivers, closer to the monitor) based on different sync technologies!

Because of this, I've long refactored my perspective of display lag to essentially be Present()-to-Photons lag, because the demarcation point between driver-and-display varies hugely with different sync technologies and strobe technologies. Even the pixel visibility sequence can change, and latency gradients can invert, with different settings.

Basically, the drivers and the display are acting as one co-operative machine; thanks to the emergence of VRR, drivers are now controlling display behaviors much more than 20 years ago. So my new preferred lag demarcation point for real-world latency is the boundary between the application and the display driver -- aka Present() or glXSwapBuffers() or whatever method the app uses to send the framebuffer to the hardware. It greatly simplifies the real-world relevance of latency tests.

But in the real world, we've got a special situation where not all pixels refresh at the same time on the two-dimensional plane of a computer screen, and that various settings tweaks to various sync technologies, ultimately affects the Present()-to-Photons black box. Since the computer software has to present a screen, and the photons have to hit the human eyes, the sync and strobe technology changes can actually do weird things like invert the latency gradient (bottom edge having less lag than top edge, or vice-versa -- on the same screen -- on the same GPU).

That said, it is true that VSYNC ON and VSYNC OFF classically have nothing to do with whatever display processing occurs, so you could just demarcate then and there. But now you've got display settings such as strobing, which add a layer of strange interactions that are separate and layered on top. And changing some of these settings changes unexpected variables on some models and not on others -- e.g. scanout velocity, lag differentials between top and bottom edge, etc.

Thus, fundamentally, for real-world human-lag-useful latency numbers, I now have a preference towards Present()-to-Photons measurement methodologies rather than synthetic display-only latency.

TL;DR: I'm no longer a fan of synthetic lag benchmarks that exclude sync technologies. There is a reason I now call most display-lag-only benchmarks "synthetic" lag benchmarks, much like CrystalDiskMark is a synthetic disk benchmark. They have less bearing on esports performance than Present()-to-Photons latency tests -- aka display lag tests that include the driver-side co-operative behaviors.

Nonetheless, the new lag measurement standards will accommodate all of the above (driver-to-display, as well as display-only).
