Is eye tracked motion blur measurable?

Advanced display talk for display hackers, advanced game programmers, scientists, display researchers, display manufacturers, and vision researchers, plus Advanced Display Articles on Blur Busters.
User avatar
Chief Blur Buster
Site Admin
Posts: 10731
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Is eye tracked motion blur measurable?

Post by Chief Blur Buster » 24 Oct 2022, 18:15

Discorz wrote:
24 Oct 2022, 14:44
Chief Blur Buster wrote:
24 Oct 2022, 10:52
The answer is yes.
Yes? But what about overlapping?
I'm a little confused by your question -- can you rephrase?
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
To support Blur Busters - see Multiple Lists of Best Gaming Monitors
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
Discorz
VIP Member
Posts: 900
Joined: 06 Sep 2019, 02:39
Location: Europe, Croatia
Contact:

Re: Is eye tracked motion blur measurable?

Post by Discorz » 25 Oct 2022, 01:47

Chief Blur Buster wrote:
24 Oct 2022, 18:15
I'm a little confused by your question -- can you rephrase?
The last chart explains the overlapping nicely. Pure GtG (red) is disrupted by MPRT (light green). So if I'm thinking right, a single curve cannot be extracted.


EDIT:
The last chart is not accurate.

Just figured it out. GtG actually overlaps throughout the full MPRT time - the image below shows each pixel column changing color while GtG is only one third of MPRT. In this case GtG is overlapped by itself 15 times, and by MPRT. I imagine it as multiple transparent image layers, each layer shifted by 1 pixel to make it look motion-blurry, if that makes any sense.

[Attachment: GtG-MPRT overlapping.png, x4 zoom]
Compare UFOs | Do you use Blur Reduction? | Smooth Frog | Latency Split Test
Alienware AW2521H, Gigabyte M32Q, Asus VG279QM, Alienware AW2518HF, AOC C24G1, AOC G2790PX, Setup


Re: Is eye tracked motion blur measurable?

Post by Chief Blur Buster » 26 Oct 2022, 20:07

Discorz wrote:
25 Oct 2022, 01:47
Just figured it out. GtG actually overlaps throughout the full MPRT time - the image below shows each pixel column changing color while GtG is only one third of MPRT. In this case GtG is overlapped by itself 15 times, and by MPRT. I imagine it as multiple transparent image layers, each layer shifted by 1 pixel to make it look motion-blurry, if that makes any sense.
Actually, you are right, and that's true.
GtG and MPRT overlap each other, so it's essentially a composite of GtG and MPRT combined.

I forgot to mention that you can subtract MPRT from GtG mathematically, by subtracting a straight-line synthetic MPRT100% curve from the GtG data. Then the oscilloscope data syncs up. We can do this because we know the ideal MPRT100% "curve" (actually a straight line, when using linear nits rather than gamma-corrected values), and it can be mathed out of the GtG in the photograph if you're careful about the phase of MPRT and the phase of GtG -- you can align to the very first pixel of the blur to do that, assuming you have a good tracking error margin.
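As a rough illustrative sketch of that subtraction (this is my illustration, not official Blur Busters tooling; the measured edge profile and the 9-pixel refresh spread are made-up numbers):

```python
# Sketch: subtract a synthetic MPRT100% ramp from a measured
# pursuit-camera edge profile to isolate the GtG residual.
# All values are hypothetical, not real hardware measurements.

def ideal_mprt_ramp(n_pixels):
    """Ideal 0ms-GtG sample-and-hold edge: photons accumulate linearly
    across one refresh cycle of blur, so the profile is a straight ramp."""
    return [i / (n_pixels - 1) for i in range(n_pixels)]

def gtg_residual(measured, ramp):
    """Pointwise difference: whatever deviates from the perfect ramp
    is attributed to nonzero GtG, per the reasoning in this thread."""
    return [m - r for m, r in zip(measured, ramp)]

ramp = ideal_mprt_ramp(9)          # one refresh cycle spread over 9 pixels
measured = [0.0, 0.05, 0.15, 0.30, 0.47, 0.63, 0.78, 0.90, 1.0]  # hypothetical
residual = gtg_residual(measured, ramp)
print([round(x, 3) for x in residual])
```

On a hypothetical 0ms-GtG display the residual would be all zeros; any sag below the ramp is the GtG contribution.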

Now, that being said, a proper unified motion blur measurement metric is a composite of GtG and MPRT together. It will need a new name, separate from "GtG" and "MPRT", but it will effectively be our superior Blur Busters answer to VESA.

Future benchmarks might display more data, like GtG90%, MPRT90%, GtG99%, MPRT99%, and the composite metric.

Yeah, I need to add a pursuit camera sync track to this.


Re: Is eye tracked motion blur measurable?

Post by Discorz » 27 Oct 2022, 03:21

Chief Blur Buster wrote:
26 Oct 2022, 20:07
GtG and MPRT overlap each other, so it's essentially a composite of GtG and MPRT combined.
Exactly.
Chief Blur Buster wrote:
26 Oct 2022, 20:07
I forgot to mention that you can subtract MPRT from GtG mathematically, by subtracting a straight-line synthetic MPRT100% curve from the GtG data. Then the oscilloscope data syncs up. We can do this because we know the ideal MPRT100% "curve" (actually a straight line, when using linear nits rather than gamma-corrected values), and it can be mathed out of the GtG in the photograph if you're careful about the phase of MPRT and the phase of GtG -- you can align to the very first pixel of the blur to do that, assuming you have a good tracking error margin.
"Eye curve" is actually a straight line? This is good information. Makes things little less complicated.

One pure/un-overlapped GtG is definitely hidden somewhere in there, but as far as I'm aware this mathematical extraction process has never been done before, so we have no instructions/formulas to start with. VESA probably tried to come up with something along these lines; it being so complicated, they decided to stick with camera footage as the reference for their new Clear Motion Ratio (CMR) certification. But who knows. A software approach where we import photodiode GtG data would be ideal, as you said.
Chief Blur Buster wrote:
26 Oct 2022, 20:07
Now, that being said, a proper unified motion blur measurement metric is a composite of GtG and MPRT together. It will need a new name, separate from "GtG" and "MPRT", but it will effectively be our superior Blur Busters answer to VESA.

Future benchmarks might display more data, like GtG90%, MPRT90%, GtG99%, MPRT99%, and the composite metric.
Perhaps name it Clear Motion Deviation (CMD). :)


Re: Is eye tracked motion blur measurable?

Post by Chief Blur Buster » 27 Oct 2022, 07:26

Discorz wrote:
27 Oct 2022, 03:21
"Eye curve" is actually a straight line? This is good information. Makes things little less complicated.
I'm not talking about an eye curve. MPRT is not an eye curve -- I don't know where you got that (can you tell me)?

Also, for convenience's sake, we are syncing up a motion metric to a photograph. Any nonlinearities and eye curve (and differences between humans) affect a photograph just as much as the real thing, and different people have different responses to displays. The motion blur interpreted by eyes on a pursuit photograph versus a real display would be the same (as long as both had the same dynamic range; i.e. the photograph captured everything, and the photograph's gamut was displayed in the same way as the original display's).

The major "curve" error margin that many people are seeing in measurements is caused by gamma correction. Remember in 8-bit color space (0-255), RGB(25,25,25) isn't one-tenth brightness of RGB(255,255,255). Because of gamma 2.2, it takes RGB(88,88,88) to get the 10% number of photons as RGB(255,255,255) .... an important fact that wild goose red herring away from any subtler "eye curve" considerations, or some accidental invention of an "eye curve" by some middleman.

Eye response differences are important, but something certainly got lost in translation, since both GtG and MPRT are electronic measurements that have nothing to do with eye curves. I wonder how "eye curve" got injected into this conversation?

__________________

Or is it a translation mistake by Google Translate? i.e., "eye curve" as a synonym for "gamma curve"?

If that's the case, I've got news for you:
The gamma formula is really simple, where 255 is the maximum linear digital brightness of an 8-bit colorspace:

y = 255 * ((x / 255) ^ gamma)

And the inverse, to un-gamma-correct and get the original value of x again:

x = 255 * ((y / 255) ^ (1 / gamma))

Either way, it's ALWAYS a straight line without the gamma curve. The fully bidirectional gamma correction formula turns the curve into a line, and vice versa.
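As a quick sanity check, here is a minimal Python sketch of the two formulas exactly as written above (illustrative only):

```python
# The two gamma formulas above, verified as exact inverses
# (within floating-point rounding) over the 8-bit range.

GAMMA = 2.2

def apply_gamma(x, gamma=GAMMA):
    """Digital RGB value -> linear light, scaled back to 0..255."""
    return 255 * (x / 255) ** gamma

def undo_gamma(y, gamma=GAMMA):
    """Linear light -> digital RGB value (the inverse formula)."""
    return 255 * (y / 255) ** (1 / gamma)

print(apply_gamma(128))   # ~56: digital mid-grey emits well under half the light
print(undo_gamma(127.5))  # ~186: the RGB value that outputs 50% of the photons
```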

Also, MPRT and GtG are measured lumens-based (aka gamma 1.0, no curve).

Bottom line: the ideal MPRT100% squarewave (or the photon-accumulation diagonal line from the refresh cycles of a perfect squarewave display) can easily be subtracted from a pursuit camera measurement, to get at the unfiltered GtG data. Not all researchers realize it's that simple.

Coincidentally, MPRT(0->100%) is the super simple Blur Busters Law formula too! Every 1ms of MPRT100% = 1 pixel of motion blur per 1000 pixels/sec (if GtG=0). Since MPRT is photon-based, and we assume a constant number of photons per nanosecond emitted from a perfect squarewave-refresh sample-and-hold display, the photon accumulation is a perfectly straight diagonal line, if we're 100% ignoring the gamma curve.
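The Blur Busters Law arithmetic can be sketched in a few lines (illustrative numbers; 16.7ms is the approximate MPRT100% of a 60Hz sample-and-hold display):

```python
# Blur Busters Law as stated above: at GtG=0, blur width in pixels is
# 1 pixel per 1ms of MPRT100% per 1000 pixels/sec of motion.

def blur_width_pixels(mprt_ms, speed_pps):
    """Motion blur width in pixels for a given MPRT (ms) and speed (px/s)."""
    return mprt_ms * speed_pps / 1000.0

# A 60 Hz sample-and-hold display (MPRT100% ~16.7 ms) at 960 pixels/sec:
print(blur_width_pixels(16.7, 960))   # ~16 pixels of motion blur
```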

Gamma correction is simply a convenience to accommodate the nonlinear human response to brightness, but it's just a simple math formula. However, GtG and MPRT have been specified as photon-based measurements since their original invention.

But some reviewers make the mistake of measuring GtG or MPRT gamma-corrected, meaning GtG 10%->90% of black-to-white would capture the transition of RGB(25,25,25) to RGB(230,230,230), when it should be roughly RGB(90,90,90) to RGB(243,243,243) if you're going by photon-intensity-based measurements (as with a sensor).

Also, remember that photographs can be gamma-corrected by the camera, which can distort things. However, good cameras (SLRs) have fairly linear photon response, and the result is merely a photograph of a gamma-corrected display. As long as you know the gamma of the photodiode, the gamma of the pursuit photograph, and the gamma of the display itself, you can correct for all of it. Generally, photographs from an SLR within its dynamic range are linear, since a digital camera sensor is simply an array of millions of "tiny photodiodes", in a manner of speaking. As long as the dynamic range of the photo exceeds the dynamic range of the display (no black or white clipping), you can get the white/black reference, equalize carefully (black-level shift, white-level shift), and gamma-correct as needed to get a 1:1 mapping between a photograph (a sensor of many photodiodes) and a photodiode sensor (a defacto 1-pixel monochrome camera). See?

Now, cameras can add unwanted gamma correction, which you have to disable (an SLR or a manual-everything camera app is useful here). As long as you know the gamma of everything (the screen, the camera, the photodiode), as well as the black/white references, it can all be synced up mathematically.

If one is a curve and the other is a straight line, you've got a gamma correction problem in your syncing. (Sorry, I forgot to mention you need to gamma-correct when syncing up pursuit camera measurements.) You also have to re-calibrate the zero reference before you apply or undo the gamma correction. But you can also simply sync up the flat-line reference before/after the curves, by assuming a specific color X in the photograph maps to what a photodiode tester measured on a static image of the same color X.

There are also times where you correctly measure GtG and MPRT, but then decide to map to gamma-corrected values. Then MPRT shows up as a curve. Sure. But that forgets the simple formulaic fact that the curve can easily be corrected back to a straight line.

Besides, display gamma correction needed a math formula for the display to function correctly at all! The gamma correction is easily done as a lookup table, which has classically been done ever since the first color LCD displays capable of being connected to a computer: each pixel has to have digital gamma math (or a LUT) applied to it, every single refresh cycle.

For analog, back in the CRT era, it was simply the RAMDAC that created the curve. Back in the analog days of analog cameras and analog tape, it was an attempted linear mapping of signal level to brightness.

But when computers had to digitally represent true color, it was discovered that we needed gamma correction to spread values along the human vision response curve (perhaps this is why you said "eye curve"), in order to compress color representation into only 24 bits, with improved distribution of color shades over the average human eye response.

The world standardized on the standard gamma correction formula everywhere (in Photoshop, in display engineering, etc). Otherwise, getting good quality from 24-bit truecolor would not have been possible, given the banding that would show up in some parts of the color spectrum if we had stuck with 1.0 gamma.

Anyway, regardless of the confusion, it's just a universal math formula, with 2.2 as a boilerplate average of all humans (much like the R,G,B primaries are boilerplates, despite 12% of people being colorblind, and many more humans having slightly shifted primaries -- e.g. primaries "X.XX" nm different from yours).

Ideally, cameras and photodiodes are easiest to sync up if they both have 1.0 gamma (no gamma correction). If either has a gamma, it needs to be known, and it needs to be corrected for. Thankfully, SLR pursuit-camera photos, as well as very good "manual camera" smartphone apps, generally have very linear response and can be more easily 1:1 mapped to a photodiode. Hidden, undocumented gamma-correcting behavior of a camera can be annoying -- e.g. cheap smartphones and the default camera app. That's why I always recommend people download a manual-camera app for their smartphones, if using a pursuit camera, anyway...

You can also approach this differently: photograph a static swatch of 256 grey shades, and then compute your camera sensor's gamma correction (or lack thereof). If it syncs up to the display gamma (i.e. you're getting a 2.2 gamma from the swatch), then your camera is correctly behaving as a 1.0-gamma-correction device. If you're getting 2.4 in the photo and the display is 2.2, then you've got a +0.2 gamma-correction behavior in your camera logic. But this static-photo verification is often unnecessary, since many cameras have no gamma correction in their manual, filterless mode.

If the camera is doing that, find out why, and disable it if possible -- turn off all filters. Worst comes to worst, it is theoretically possible to simply correct for it (camera-specific) if you're stuck with that camera. Easiest is to save time with a good camera with correctly linear response, to capture a display's original gamma with no gamma-curve modification. My experience with the latest iPhone and Galaxy sensors is that they are linear, as long as you use a manual app to treat the phone like an SLR camera, disabling the smartypants automatic image-enhancement features, and setting camera exposure/settings to capture the whole display's gamut. Properly done, it is easiest to treat your camera as a no-gamma-correction device (1.0 gamma). That way, your array of photodiodes (digital camera sensor) is linearly mapped 1:1 to a single photodiode (oscilloscope), with no worrying about curves. It's just photodiodes of a different kind!
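The grey-swatch check can be sketched numerically. This is a hypothetical illustration: the `fit_gamma` helper and the synthetic "measured" data (a perfect gamma-2.2 display captured by a perfectly linear camera) are made up purely to show the math:

```python
import math

# Sketch of the grey-swatch gamma check: photograph patches of known
# RGB values, read the sensor's reported (linear) luminance for each,
# and fit the effective gamma via a log-log least-squares slope.

def fit_gamma(rgb_values, luminances):
    """Least-squares slope of log(luminance) vs log(rgb/255)."""
    xs = [math.log(v / 255) for v in rgb_values]
    ys = [math.log(l) for l in luminances]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

swatch = [32, 64, 96, 128, 160, 192, 224, 255]
measured = [(v / 255) ** 2.2 for v in swatch]   # synthetic linear-camera data
print(round(fit_gamma(swatch, measured), 3))    # -> 2.2: camera added no gamma
```

If the fitted value came out as 2.4 against a 2.2 display, that would be the +0.2 hidden camera gamma described above.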

Bottom line:
(1) Watch out for your camera's unwanted automatic gamma behaviors; try to disable them (use manual-everything SLR-like modes).
(2) Gamma curves can be mathed out, but you must know the gamma value of both the electronic photon emitter (the display) and the electronic photon receiver (the camera).


Re: Is eye tracked motion blur measurable?

Post by Discorz » 28 Oct 2022, 05:23

Chief Blur Buster wrote:
27 Oct 2022, 07:26
I'm not talking about an eye curve. MPRT is not an eye curve -- I don't know where you got that (can you tell me)?
It's a word for the hold blur. In the case of a moving edge transition, like in the first charts where a series of frames overlay, the hold blur looked like a curve in the RGB world. That's where it came from. Scanout-wise, it's just the amount of time a frame is visible for. I've mentioned "eye curve" is not a good term; only unread people like me use it :). Correct me if I'm wrong, but that blur does not actually exist -- our eyes blur/fill the empty space in between.

I'll have to do some more research on gamma in general.


Re: Is eye tracked motion blur measurable?

Post by Chief Blur Buster » 29 Oct 2022, 00:27

Discorz wrote:
28 Oct 2022, 05:23
I'll have to do some more research on gamma in general.
Gamma correction curves are why RGB(120,120,120) is not exactly half the brightness (in nits) of RGB(240,240,240).

That's why some people program an LCD GtG analyzer incorrectly, thinking the start threshold of GtG10% for black->white is RGB(25,25,25) instead of the correct RGB(~89,~89,~89) -- or was it 90? Some reviewers mess this up, as VESA GtG 10->90% is based on actual nits (photons), not digital RGB values.

So if you're measuring the pixel response of a black-to-white transition, aka an RGB(0,0,0) to RGB(255,255,255) transition, the sensor -- whether a single photodiode or a matrix sensor (camera sensor) -- will be keyed to voltage levels that map directly to brightnesses.

RGB(25,25,25) is an ultra-dim grey, less than 1% the brightness of full white RGB(255,255,255).

You need to go all the way to RGB(~89,~89,~89) to be ~10% as bright as full white RGB(255,255,255).
The exact gamma-corrected RGB value of 10% grey, on a gamma 2.2 display, is:

255 * (1/10) ^ (1/2.2)

= 89.535389

A solid dark-grey field that is 10% as bright as maximum white -- in 8-bit color space, rendered at gamma 2.2 -- is either RGB(89,89,89) or RGB(90,90,90), depending on whether you round down or up.

Note that the gamma formula (for gamma 2.2) to convert from a photon value to an RGB value is normally 255 * (x/255) ^ (1/2.2) ... the "x/255" part is a percentage! So it can easily be replaced with any fixed percentage you want, like "1/10" for 10% and "9/10" for 90%; calculate it in floating point, and round off only at the final snap-to-digital-RGB stage. (Always keep maximum precision during intermediary calculations, where possible, and only round off at the final step. Otherwise you end up with things like the RGB(88,88,88) I wrote earlier, rather than the more accurate RGB(90,90,90) when rounding to nearest...)
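In Python, keeping full precision until the final rounding step (a minimal sketch of the formula above):

```python
# GtG threshold RGB values on a gamma-2.2 display, computed in full
# floating-point precision and rounded only at the end.

def rgb_for_fraction(fraction, gamma=2.2):
    """8-bit RGB value whose light output is `fraction` of full white."""
    return 255 * fraction ** (1 / gamma)

print(round(rgb_for_fraction(1 / 10)))   # 90  -> GtG10% threshold
print(round(rgb_for_fraction(9 / 10)))   # 243 -> GtG90% threshold
```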

Regardless, study up on gamma-correction curves -- they're an important part of color science, with gamma correction serving as a tool to spread out banding visibility across the 24-bit color space. Back in the old days, early digital video compression did not do a very good job of gamma-correcting the compression priority, and that's why dark images had more compression artifacts than bright images. This is no longer a problem with modern codecs like H.264 and newer, since they better compensate for the nonlinear sensitivity of human vision via gamma correction in the compressor engine's prioritization. But remember the very old early DVDs and VCDs that had ugly macroblocking in dark scenes? Bingo.

Anyway, gamma correction is a conveniently symmetric math formula that can apply/undo gamma correction (within reason: rounding errors can build up, but that's generally not a big problem if done only once, for a calculated purpose). And we can do that if we're trying to equalize gamma-mismatched light measurements. But here, it's not needed -- assuming the camera is not adding gamma correction (and thus preserves the original display gamma in the resulting photograph). If the photo is relatively accurately WYSIWYG, without looking oddly dark or oddly overexposed, it's probably pretty linear.

Anyway, all this is irrelevant if you're doing it properly (a non-gamma-corrected camera photograph, preserving the display's gamma as-is): from the photography perspective, on a perfect squarewave sample-and-hold display (0ms GtG), MPRT becomes a perfect straight line, with no curves at the ends of the line either. So no gamma correction is needed.

That's viewed from linear/analog/photon space, and you can easily subtract a reference MPRT from the curve you plotted from a pursuit camera photograph, in order to get the unfiltered GtG that syncs up to a photodiode oscilloscope. For the photodiode itself, measure a flashing solid rectangle -- no side-scatter from different pixels -- always use a non-moving, completely solid patch with the 1-pixel camera called a photodiode.

We only got sidetracked onto the gamma curve because you said "eye curve", but digital MPRT (for GtG=0) is never a curve in lumen/nit-speak, the same space in which VESA specifies GtG/MPRT measurements: light intensity (photon-intensity space, or nit space, not RGB-value space).

The beauty is that the difference between the perfect MPRT and the imperfect curve you measured is exactly the GtG left over. GtG is why MPRT is never perfect, so the difference is the unfiltered GtG curve, pulled from a pursuit camera photograph. Voila!

It's beautifully simple mathematically. For getting GtG curves comparable to a photodiode oscilloscope, just make sure camera tracking accuracy is as perfect as possible, since camera tracking error will blur the curve -- like reducing the sample rate of a photodiode oscilloscope even further.

<More technical rambley sidetrack for optional reading>
Consider the equivalent sample rate of a pursuit camera photograph: the sample rate is spread over the frame step. If an 8-pixel step is spread over 200 camera sensor pixels, and your focus is tack sharp, you've got a sample rate of 200 samples per refresh cycle. (At 100Hz, that's an equivalent photodiode oscilloscope sample rate of 20,000 samples per second -- just by using a macro zoom on a pursuit photograph!) It's very hard, however, to pursuit very fast on very zoomed images, so you will probably only be able to spread the pixel-per-frame movement step over roughly 1/20th to 1/50th of the horizontal resolution of your camera sensor, and Bayer filtering adds its own losses. Maybe undersample the resolution slightly for a more accurate photodiode-oscilloscope-equivalent sample per horizontal pixel in a pursuit camera image.

So if your camera is 4000 pixels wide and you manage to successfully pursuit a 16-pixel-per-frame moving edge over 1/20th the width of the camera frame, that would be 4000/20 = 200 pixels = equivalent to ~200 photodiode oscilloscope samples per refresh cycle. And if you stack 4 refresh cycles, it's like doing 4 photodiode oscilloscope runs and averaging the curves together. So the more refresh cycles you're able to stack, the more noise-free your pursuit camera measurement becomes (if your camera sensor has sufficient dynamic range not to clip the blacks/whites). If you're an expert at pursuiting a camera rail at roughly 1920 pixels/sec, with a very good macro zoom you can get essentially the equivalent of a 5-digit or 6-digit photodiode oscilloscope sample rate from a single camera frame! Pretty impressive without owning an oscilloscope. You plot it along the curve, then measure the distance between ticks in the sync track (always 4 refresh cycles apart), and divide by 4 to get how many horizontal pixels in the photograph equal 1 refresh cycle.

Then you subtract the perfect MPRT line from it (over that computed 1-refresh-cycle spread), beginning at the very first GtG transition pixel (the very edge of the blur), to get at the unfiltered GtG data from the pursuit photograph -- data that can sync up to a real photodiode oscilloscope. Simple math magic, if you've followed along! There are error margins involved, e.g. from image scaling, Bayer filtering, camera tracking, etc., but it's pretty clear that a good manual pursuiter can easily get a quadruple-digit-or-better equivalent photodiode oscilloscope sample rate from a pursuit camera photo, even with these error margins. But it degrades very quickly if there's no pursuit camera sync track.
</More technical rambley sidetrack for optional reading>
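The sample-rate arithmetic in the sidetrack above, written out in a few lines (the inputs are the same hypothetical numbers used in the text):

```python
# Equivalent photodiode-oscilloscope sample rate of a pursuit photo:
# the blur of one refresh cycle is spread over some fraction of the
# camera sensor's width, and each sensor pixel acts as one sample.

def equivalent_sample_rate(sensor_width_px, frame_fraction, refresh_hz):
    """Samples/sec when one refresh cycle of blur spans
    sensor_width_px * frame_fraction camera pixels."""
    samples_per_refresh = sensor_width_px * frame_fraction
    return samples_per_refresh * refresh_hz

# 4000-pixel-wide sensor, blur step spanning 1/20th of the frame, 100 Hz:
print(equivalent_sample_rate(4000, 1 / 20, 100))   # -> 20000.0 samples/sec
```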

Yes, I must add a pursuit camera sync track to www.testufo.com/blurtrail -- it would be mandatory for knowing the error margins when calculating LCD GtG.

And yep, theoretically this could be done as an app: an "LCD GtG Measuring App". You'd wave the smartphone in continuous video measuring mode (via 4K 30fps, like 30 photographs/sec of 1/30sec exposures); a trained AI neural network in the app recognizes the sync track, automatically saves only the perfect freeze-frame(s), automatically confirms accuracy, and automatically pulls out & computes the GtG curve. An app becomes your display photodiode oscilloscope.

<Breakthrough idea, albeit difficult-and-niche>
It's definitely mathematically possible to turn an iPhone/Galaxy sensor into a very accurate 10,000+ samples/sec display photodiode oscilloscope that automatically displays separated GtG and MPRT numbers, as well as GtG curves (with MPRT filtered out of the curve). As long as we automatically compensate for scan skew (detecting the tilt of the blur edge in the resulting photo), we can average multiple pixel rows of the camera image, subpixel-shifting as necessary, and use a superresolution algorithm to get MORE sample rate at even lower noise (16-bit precision is possible -- the Tektronix class). It would require a superlatively good developer (or significant investment funding to Blur Busters), but it's indeed mathematically possible to turn an existing iPhone 10-13 camera into a 16-bit, 100,000 samples/sec GtG display photodiode oscilloscope, with this multilayered stacking trick (multiple camera exposures + a sync track to stack refresh cycles) AND (using multiple pixel rows of the resulting pursuit photo as the defacto equivalent of multiple oscilloscope runs embedded in a single pursuit photograph).
</Breakthrough idea, albeit difficult-and-niche>

[I wonder if I'm the world's first person to suggest this GtG measuring app idea -- I bet almost everyone says it's impossible, when it really isn't. It kind of requires the "Blur Busters Display Temporals Brain" to conceptualize how it's mathematically possible.]

If any researchers/investors are interested in turning this automatic "GtG graphing" smartphone app into reality, please inquire within (www.blurbusters.com/contact ...). I imagine it is very niche and probably won't be big-money stuff, so it needs unorthodox funding such as researcher grant funding (government/bountysource) + open-source collaboration to become reality.


Re: Is eye tracked motion blur measurable?

Post by Discorz » 05 Nov 2022, 10:20

Are these fading in-out vertical bands pure GtG? It sure looks like it. I captured this with a stationary camera. I don't know how to measure the time because I don't have any reference, and the camera shutter affects GtG length significantly. Sorry for throwing this one in randomly.

Transition RGB 64-191-64
[Attachments: OFF IMG_20221105_154605_1.jpg, Balance IMG_20221105_154704_1.jpg, Speed IMG_20221105_154743_1.jpg]
M32Q Overdrives @160Hz/fps, ∼5000pps, s1/400

A tint of red on the trailing edge is real. I can also see it when eye tracking. Is it KSF phosphor?


Re: Is eye tracked motion blur measurable?

Post by Chief Blur Buster » 05 Nov 2022, 20:02

Discorz wrote:
05 Nov 2022, 10:20
Are these fading in-out vertical bands pure GtG? It sure looks like it. I captured this with a stationary camera. I don't know how to measure the time because I don't have any reference, and the camera shutter affects GtG length significantly. Sorry for throwing this one in randomly.
Yes, that is almost pure GtG down the vertical dimension*

*With a caveat:
It's very hard to measure the reference point, because the camera sensor's scanout velocity is even more unknown than the display's scanout velocity. So you can't always know how many milliseconds it is without knowing the camera's sensor-scan-readout velocity, but you can still plot a GtG curve by going vertically down the axis -- just without a known clock reference.

A slow smartphone camera may scan out in 1/60sec while the display scans out in 1/240sec, creating interesting artifacts when trying to photograph LCD GtG with a stationary camera. Also, it's a single-pass GtG measurement, which will always be much noisier than a multi-pass GtG measurement (the multiple-refresh-cycle exposure of a pursuit camera).
Discorz wrote:
05 Nov 2022, 10:20
A tint of red on the trailing edge is real. I can also see it when eye tracking. Is it KSF phosphor?
Yes, though much fainter than via strobing.

KSF phosphor is known to 'affect' GtG measurements in a weird way. Only by a few percent (sub-millisecond), but it does affect them.
