Discorz wrote: ↑27 Oct 2022, 03:21
"Eye curve" is actually a straight line? This is good information. Makes things little less complicated.
I'm not talking about an eye curve. MPRT is not an eye curve. I don't know where you got that (can you tell me?).
Also, for convenience's sake, we are syncing up a motion metric to a photograph. Any nonlinearities and eye curves (and differences between different humans) affect a photograph just as much as the real thing, and different people have different responses to displays. The motion blur interpreted by eyes on a pursuit photograph versus the real display would be the same, as long as both had the same dynamic range (i.e. the photograph captured everything, and the photograph's gamut was displayed in the same way as on the original display).
The major "curve" error margin that many people are seeing in measurements is caused by gamma correction. Remember, in 8-bit color space (0-255), RGB(25,25,25) isn't one-tenth the brightness of RGB(255,255,255). Because of gamma 2.2, it takes RGB(88,88,88) to get 10% of the photons of RGB(255,255,255) .... an important fact that can send people on a wild goose chase away from any subtler "eye curve" considerations, or lead to the accidental invention of an "eye curve" by some middleman.
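To make that gamma arithmetic concrete, here's a minimal Python sketch (my own illustrative function name, and it assumes a simple power-law gamma of 2.2 rather than the exact piecewise sRGB curve):

```python
def linear_fraction(code_value, gamma=2.2):
    """Fraction of maximum photon output for an 8-bit grey level,
    assuming a simple power-law gamma (not the piecewise sRGB curve)."""
    return (code_value / 255) ** gamma

print(round(linear_fraction(25), 3))   # far below one-tenth brightness
print(round(linear_fraction(88), 3))   # roughly 0.1 (10% of full white)
```

Running this shows RGB(25,25,25) emits well under 1% of the photons of full white, while RGB(88,88,88) lands at roughly the 10% photon mark.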
Eye response differences are important, but something certainly got lost in translation, since both GtG and MPRT are electronic measurements that have nothing to do with eye curves. I wonder how "eye curve" got injected into this conversation?
__________________
Or is it a translation mistake by Google Translate? i.e., is "eye curve" being used as a synonym for "gamma curve"?
If that's the case, I've got news for you:
The gamma formula is really simple, where 255 means the max linear digital brightness of an 8-bit colorspace:
y = 255 * (x / 255) ^ gamma
And the inverse to un-gamma-correct, to get the original value of x again!
x = 255 * (y / 255) ^ (1 / gamma)
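Here's a quick Python sketch of both directions of that formula (illustrative function names), showing the round trip is lossless:

```python
def apply_gamma(x, gamma=2.2):
    # y = 255 * (x / 255) ^ gamma
    return 255 * (x / 255) ** gamma

def undo_gamma(y, gamma=2.2):
    # x = 255 * (y / 255) ^ (1 / gamma)
    return 255 * (y / 255) ** (1 / gamma)

# Fully bidirectional: un-gamma-correcting recovers the original value.
print(round(undo_gamma(apply_gamma(88.0)), 6))   # 88.0
```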
Either way, it's ALWAYS a straight line once the gamma curve is removed. The gamma correction formula is fully bidirectional: it turns the line into a curve, and its inverse turns the curve back into a line.
Also, MPRT and GtG are measured lumens-based (aka gamma 1.0, no curve).
Bottom line, the ideal MPRT100% squarewave (or the photon accumulation diagonal line from the refresh cycles of a perfect squarewave display) can be easily subtracted from a pursuit camera measurement, to get at the unfiltered GtG data. Not even all researchers realize it's that simple.
Coincidentally, MPRT(0->100%) is also the super simple Blur Busters Law formula! Every 1ms of MPRT100% creates 1 pixel of motion blur per 1000 pixels/sec of motion (if GtG=0). And since MPRT is photon-based, a perfect squarewave-refresh sample-and-hold display emits a constant number of photons per nanosecond, so the photon accumulation is a perfectly straight diagonal line, as long as we're 100% ignoring the gamma curve.
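The Blur Busters Law arithmetic is tiny enough to sketch in a few lines (my own illustrative function name):

```python
def motion_blur_px(mprt_ms, motion_speed_px_per_sec):
    # Blur Busters Law: 1ms of MPRT100% = 1 pixel of motion blur
    # per 1000 pixels/sec of motion, assuming GtG=0.
    return mprt_ms * motion_speed_px_per_sec / 1000.0

# A 60 Hz sample-and-hold display (MPRT100% ~= 16.7ms) at 960 px/sec:
print(motion_blur_px(16.7, 960))   # ~16 pixels of motion blur
```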
The gamma correction is simply a convenience to help with the nonlinear human response to brightness, but it's just a simple math formula. However, GtG and MPRT are specified to be photon-based measurements from their original invention.
But some reviewers make the mistake of measuring GtG or MPRT gamma-corrected, meaning GtG10%->100% of black to white would capture the transition of RGB(25,25,25) to RGB(240,240,240) in code values, whereas a photon-intensity-based measurement (as with a sensor) corresponds to RGB(88,88,88) to RGB(248,248,248).
Also, remember that photographs can be gamma-corrected by the camera, which can distort things. However, good cameras (SLRs) have a fairly linear photon response, and the result is merely a photograph of a gamma-corrected display. As long as you know the gamma you're using for the photodiode, the gamma you're using for the pursuit photograph, and the gamma of the display itself, you can correct for all of that. Generally, photographs from an SLR are linear within its dynamic range, since a digital camera sensor is simply an array of millions of "tiny photodiodes", in a manner of speaking. As long as the dynamic range of the photo exceeds the dynamic range of the display (no black and white clipping), you can get the white/black reference, equalize carefully (black-level shift, white-level shift), and gamma-correct as needed to get a 1:1 mapping between a photograph (a sensor of many photodiodes) and a photodiode sensor (a de facto 1-pixel monochrome camera). See?
Now, cameras can add unwanted gamma correction, which you have to disable. (An SLR or a manual-everything camera app is useful here.) As long as you know the gamma of everything (the screen, the camera, the photodiode), as well as the black/white references, it can all be sync'd up mathematically.
If one is a curve and the other is a straight line, you've got a gamma correction problem in your syncing. (Sorry, I forgot to mention you need to gamma-correct when syncing up pursuit camera measurements.) You also have to re-calibrate the zero reference before you gamma-correct or undo the gamma correction, though. But you can also simply sync up the flat-line reference before/after the curves, by assuming a specific color X in the photograph maps to what a photodiode tester measured on a static image of the same color X.
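A hedged sketch of that sync-up procedure in Python (hypothetical helper names, and it assumes a simple power-law model of the camera's gamma):

```python
def normalize(sample, black_ref, white_ref):
    # Re-calibrate against the zero/white references FIRST,
    # before doing any gamma math.
    return (sample - black_ref) / (white_ref - black_ref)

def linearize(fraction, camera_gamma=1.0):
    # Undo the camera's gamma behavior to get photon-proportional units.
    # camera_gamma=1.0 means the camera is already linear (no correction).
    return fraction ** camera_gamma

# Map a raw pursuit-photo pixel into the same units as a photodiode reading:
photo_sample = normalize(140, black_ref=12, white_ref=252)
photon_fraction = linearize(photo_sample, camera_gamma=1.0)
print(round(photon_fraction, 3))   # ~0.533 of full white
```

The ordering matters: shift the black/white references first, then apply or undo the gamma, exactly as described above.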
Now, there are also times where you correctly measure GtG and MPRT, but then decide to map to gamma-corrected values. Then MPRT shows up as a curve. Sure. But that forgets the simple formulaic fact that the curve can easily be corrected back to a straight line.
Besides, display gamma correction needed a math formula for the display to function correctly at all! Also, gamma correction is easily done as a lookup table, which has classically been the case ever since the first color LCD displays capable of being connected to a computer: each pixel had to have digital gamma math (or a LUT) applied to it, every single refresh cycle.
For analog, back in the CRT era, it was simply the RAMDAC that created the curve. Back in the analog days of analog cameras and analog tape, it was an attempted linear mapping of signal level to brightness.
But when computers had to digitally represent true color, it was discovered that we needed to use gamma correction to spread things out along the human vision response curve (perhaps this is why you used "eye curve"), in order to compress color representation into only 24 bits, for improved distribution of color shades over the average human eye response.
The world standardized on the standard gamma correction formula everywhere (in Photoshop, in display engineering, etc). Otherwise, getting good quality from 24-bit truecolor would not have been possible, given the banding that would show up in some parts of the color spectrum if we had stuck with 1.0 gamma.
Anyway, regardless of the confusion, it's just a universal math formula, with 2.2 as a boilerplate average of all humans (much like the R,G,B primaries are boilerplates, despite a significant fraction of people being colorblind, and many more humans having slightly shifted primaries -- e.g. primaries "X.XX"nm different from you).
Ideally, cameras and photodiodes are easiest to sync up if they both have 1.0 gamma (no gamma correction). If either has a gamma, it needs to be known, and it needs to be corrected for. Thankfully, SLR pursuit camera photos, as well as very good "manual camera" smartphone apps, will generally have a very linear response, and can be more easily 1:1 mapped to a photodiode. Hidden, undocumented gamma-correcting behavior of a camera can be annoying -- e.g. cheap smartphones and default camera apps. That's why I always recommend people download a manual-camera app for their smartphones if using a pursuit camera, anyway...
You can also approach this differently: photograph a static swatch of 256 grey shades, then compute your camera sensor's gamma correction (or lack thereof). If it syncs up to the display gamma (i.e. you're getting a 2.2 gamma from the swatch), then your camera is correctly behaving as a 1.0 gamma correction device. If you're getting 2.4 in the photo and the display is 2.2, then you've got a +0.2 gamma correction behavior in your camera logic when doing the static-photo verification of a swatch of greys. But this is often unnecessary, since many cameras have no gamma correction in their manual, filterless mode.
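Here's a minimal sketch of that swatch verification step (illustrative function name, power-law assumption): fit a gamma exponent to the swatch measurements via the slope in log-log space.

```python
import math

def estimate_gamma(code_values, measured_fractions):
    """Least-squares fit of gamma in measured = (code/255)^gamma,
    via the slope of log(measured) vs log(code/255)."""
    xs = [math.log(c / 255) for c in code_values]  # code 0 excluded (log)
    ys = [math.log(m) for m in measured_fractions]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# If the photographed swatch reads back as gamma 2.2 (the display's own
# gamma), the camera itself is behaving as gamma 1.0 (no hidden correction).
swatch = [32, 64, 96, 128, 160, 192, 224]
readings = [(c / 255) ** 2.2 for c in swatch]   # simulated ideal readings
print(round(estimate_gamma(swatch, readings), 2))   # 2.2
```

If this fit comes back at, say, 2.4 against a 2.2 display, the difference is your camera's own gamma behavior.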
Find out why the camera is doing that, and disable it (if possible) -- turn off all filters. Worst comes to worst, it is theoretically possible to simply correct for it (camera-specific) if you're stuck with that specific camera. Easiest is to save time with a good camera with a correctly linear response, to capture a display's original gamma with no gamma curve modification. My experience is that the latest iPhone and Galaxy sensors are linear, as long as you use a manual app to treat the phone like an SLR camera, disabling the smartypants automatic image-enhancement features and setting the camera exposure / settings to capture the whole display's gamut. Done properly, it is easiest to treat your camera as a no-gamma-correction device (1.0 gamma). That way, your array of photodiodes (digital camera sensor) is linearly mapped 1:1 to a single photodiode (oscilloscope), with no worrying about curves. It's just photodiodes of a different kind!
Bottom line:
(1) Watch out for your camera's unwanted automatic gamma behaviors, try to disable it (use manual-everything SLR-like modes).
(2) Gamma curves can be mathed out, but you must know the gamma value of the electronic photon emitter (the display) and the electronic photon receiver (the camera).