What are the actual UFO panning speeds?

NEW for 2017: Discussion about the testufo.com Blur Busters Motion Tests. Widely used by enthusiasts, display tweakers, YouTube reviewers, monitor manufacturers and VR headset makers!
Discorz
VIP Member
Posts: 999
Joined: 06 Sep 2019, 02:39
Location: Europe, Croatia

What are the actual UFO panning speeds?

Post by Discorz » 11 Sep 2021, 14:43

Since the UFO must be panned an integer number of pixels per frame to look smooth, should these be the actual pixels-per-second speeds?

Why doesn't TestUFO state actual speeds but only pixels per frame?

Attachment: must be integer.png
Compare UFOs | Do you use Blur Reduction? | Smooth Frog | Latency Split Test
Alienware AW2521H, Gigabyte M32Q, Asus VG279QM, Alienware AW2518HF, AOC C24G1, AOC G2790PX, Setup

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: What are the actual UFO panning speeds?

Post by Chief Blur Buster » 14 Sep 2021, 17:30

Discorz wrote:
11 Sep 2021, 14:43
Since the UFO must be panned an integer number of pixels per frame to look smooth, should these be the actual pixels-per-second speeds?

Why doesn't TestUFO state actual speeds but only pixels per frame?
For informative purposes, I'll add an interim mouseover tooltip-style box that shows more statistics ("Target Pixels Per Second" vs "Actual Pixels Per Second") every time someone hovers over or clicks the PPF/PPS numbers. 960 pps at 144 Hz actually ends up being 1008 pixels per second (5% deviation) because of TestUFO's PPF-priority behavior. This will provide useful information to TestUFO users.
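For illustration, here is a minimal sketch of that PPF-priority behavior (my own reading of the description above, not TestUFO's actual source code), showing how a 960 pps target at 144 Hz becomes 1008 pps:

```typescript
// PPF-priority rounding sketch: snap the target speed to an integer number of
// pixels per frame, then report the speed that actually gets displayed.
function ppfPrioritySpeed(targetPps: number, refreshHz: number) {
  const ppf = Math.round(targetPps / refreshHz);          // integer pixels per frame
  const actualPps = ppf * refreshHz;                      // speed actually rendered
  const deviationPct = 100 * (actualPps - targetPps) / targetPps;
  return { ppf, actualPps, deviationPct };
}

console.log(ppfPrioritySpeed(960, 144)); // { ppf: 7, actualPps: 1008, deviationPct: 5 }
```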

Original Reason:

When TestUFO was first beta tested around 2012, the only refresh rates available were multiples of 60 (60 Hz and 120 Hz) -- the 144 Hz VG278HE was released after the internal TestUFO beta was started.

(A) Pixels per frame is extremely useful because it often corresponds to the reference number of pixels of motion blur there should be on an ideal theoretical 0ms-GtG sample-and-hold display.

(B) Fractional pixels per frame involves either blended motion (blurry pixels from a subpixel-positioned object) or jitter (stutter from rounding error), which adds an error margin to pursuit camera photography.

This made eyeball MPRT estimation easier (for people experienced with motion testing) -- see the link at the bottom on how the UFO graphic was designed as a test pattern (before it became the Blur Busters corporate logo!). Going into a showroom or a convention of hundreds of displays, I can visually estimate the MPRT to within one octave (e.g. MPRT 0.5ms, 1ms, 2ms, 4ms, 8ms, 16ms) just by glancing at how the UFO degrades -- without always needing measuring equipment, and completely unmarred by the jitter/blur error margin of non-exact pixels per frame.
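For anyone curious what that eyeball estimate is anchored to, here is a back-of-envelope sketch (my own illustration, not a Blur Busters tool): on a sample-and-hold display the eye-tracked blur trail is roughly motion speed multiplied by persistence (MPRT), which is why an integer pixels-per-frame number doubles as a reference blur width.

```typescript
// Rough blur-width estimate: blur (pixels) ≈ speed (px/s) × persistence (s).
function blurWidthPixels(speedPps: number, mprtMs: number): number {
  return speedPps * (mprtMs / 1000);
}

// Ideal 0ms-GtG full-persistence 144 Hz display (MPRT ≈ 1/144 s ≈ 6.9 ms) at 960 pps:
// ≈ 6.7 px of blur -- the same figure as the pixels-per-frame number.
console.log(blurWidthPixels(960, 1000 / 144).toFixed(1)); // "6.7"
```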

However, Motionspeeds Will Become More Flexible for 2022:

Yes, improvements are coming.

A change is being added to TestUFO next year to give more motionspeed flexibility, and better motionspeed calculations.

I agree that it should be more flexible, with accurately displayed calculations for odd refresh rates, eventually supporting all possible motion speeds (both integer and fractional) with two modes of operation (subpixel rendering and snap-to-nearest-pixel rendering), i.e. PPS-priority and PPF-priority rendering modes.

The UX needs to be designed carefully without complicating the easy-mode TestUFO. Plus, the unified TestUFO rendering engine is integer-based, which means I have to edit dozens of source code files to refactor it to support both integer and floating-point operations. Needless to say, it's not a simple code change, due to well-intentioned architecture decisions made a decade ago -- partly because hardware of the era didn't yet allow good subpixel rendering performance.

Some browsers did not even support subpixel rendering at the time, and TestUFO was the world's first website to take advantage of the VSYNC support added to web browsers in 2012, which made browsers usable for motion tests for the first time ever without needing to download an application. I also needed to support full frame rates on low-performance netbooks (Intel Atom!). And users who were using <barf> MICROSOFT INTERNET EXPLORER </barf>. Thankfully, I am no longer constrained by any of these!

This "Advanced Speed Settings" mode of TestUFO requires a significant rearchitecturing of the TestUFO engine, which may not occur until next year due to a physical move (both personal and business), so priority is supporting Blur Busters Approved clients (for now). The long term goal is to allow flexible custom motion speeds for all relevant speed-adjustable TestUFO tests, through a unified motionspeed configurator which may be hidden in a expandable panel (to hide complexity that only interests advanced users).

This minor deviation was chosen scientifically because a few-percent speed error produces a smaller error margin than the extra blur would (either scaling-style blur when rendering with subpixel positioning, or jitter blending into blur in the camera exposure when rendering snap-to-pixel).

However, as refresh rates get higher and odder (e.g. 280 Hz, 390 Hz, etc.), the error margin of this 2012 decision keeps growing, so there is an expanding need for more flexible motion speeds that let the tester pick their preferred poison.

Nominally, it will potentially be simplified down to two hidden settings that appear only when selecting "Advanced..." under "Speed" (a rough sketch of the resulting settings shape follows the list below):
Custom Speed:
...Slider/Textbox: Exact motion speed (decimal point allowed)

Speed Preference Select:
...Round Off To Nearest Integer Pixels Per Frame
...Exact Motion Speed, Subpixel Rendered
...Exact Motion Speed, Jitter Allowed
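As a rough idea of what such a hidden panel might boil down to, here is a hypothetical settings shape (the names below are my own placeholders for the options just listed, not a committed TestUFO API):

```typescript
// Hypothetical shape of the proposed "Advanced..." speed settings.
type SpeedPreference =
  | "round-to-integer-ppf"  // Round Off To Nearest Integer Pixels Per Frame
  | "exact-subpixel"        // Exact Motion Speed, Subpixel Rendered
  | "exact-jitter";         // Exact Motion Speed, Jitter Allowed

interface AdvancedSpeedSettings {
  customPps: number;        // exact motion speed, decimal point allowed
  preference: SpeedPreference;
}

const example: AdvancedSpeedSettings = { customPps: 962.5, preference: "exact-subpixel" };
```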

Estimated Error Margin
Possibly with an automatically calculated error margin displayed (it's calculable!) for each of the 3 Speed Preference selections -- see the sketch after the list below. Most of the time, the smallest blur error margin (relative to the display's own true MPRT) will come from the current default behavior of constant integer pixels per frame.

For non-integer-divisible speeds:
Error margin for "Round Off To Nearest Integer Pixels Per Frame" would be the percentage difference between target and actual PPS.
Error margin for "Exact Motion Speed, Subpixel Rendered" would be the worst-case calculated scaling blur as a % of PPF.
Error margin for "Exact Motion Speed, Jitter Allowed" would be the worst-case 1/PPF (1 pixel of jitter).
(Low-frequency jitter visibly vibrates, while high-frequency jitter blends into additional motion blur.)

For integer-divisible speeds:
Error margin is always 0 when PPS divides into an exact integer PPF (e.g. 960 pps = 4 ppf at 240 Hz).
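A sketch of how those three error margins could be computed (based on my reading of the definitions above, not actual TestUFO code; the half-pixel worst case for subpixel rendering is my own assumption):

```typescript
// Error-margin estimates for the three Speed Preference modes, as fractions (0..1).
function speedErrorMargins(targetPps: number, refreshHz: number) {
  const exactPpf = targetPps / refreshHz;
  const roundedPpf = Math.round(exactPpf);
  const integerDivisible = Number.isInteger(exactPpf);
  return {
    // Round Off To Nearest Integer PPF: percentage difference between target and actual PPS
    roundToIntegerPpf: Math.abs(roundedPpf * refreshHz - targetPps) / targetPps,
    // Exact Speed, Subpixel Rendered: assumed worst-case half-pixel scaling blur, as a fraction of PPF
    exactSubpixel: integerDivisible ? 0 : 0.5 / exactPpf,
    // Exact Speed, Jitter Allowed: worst-case 1 pixel of jitter, as a fraction of PPF
    exactJitter: integerDivisible ? 0 : 1 / exactPpf,
  };
}

console.log(speedErrorMargins(960, 240)); // all zero: 960 pps is exactly 4 ppf at 240 Hz
console.log(speedErrorMargins(960, 390)); // ~18.75% PPS error when rounded to 2 ppf (780 pps)
```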

Behind The Scenes Article!

This article tells the scientific story of why 960 pixels per second was standardized:
Making Of: Why Are TestUFO Display Motion Tests 960 Pixels Per Second?
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Discorz
VIP Member
Posts: 999
Joined: 06 Sep 2019, 02:39
Location: Europe, Croatia

Re: What are the actual UFO panning speeds?

Post by Discorz » 15 Sep 2021, 11:09

You must have already seen Ashun's (Aperture Grille) latest video where he mentioned all this stuff; he is the reason I posted this in the first place. He was probably aware of it for a long time. The lack of actual panning speed indication can easily confuse people when comparing UFOs at different refresh rates, since slower speeds result in sharper motion and vice versa. 360 Hz and 390 Hz at 960 pps were a good example: 360 Hz was actually panning at a higher 1080 pps and 390 Hz at a lower 780 pps, which is a big 300 pps difference. I also used to think 280 Hz was a step up over 240 Hz because the UFO steering wheel was a bit more pronounced in motion.

Adding the tooltip-style box would be good for now. Another quick solution would be to either put an approximate (∼) symbol or just state the actual pps speed, leaving the drop-down list presets as they are (or add "∼" there too). When the speed exactly matches the preset, the approximation symbol could be removed.

[Image]
[Image: approximate]
[Image: exact]

Another thing I was wondering: should TestUFO round to the next higher or lower integer in cases where both deviate by the same amount? For example, with the 1080 pps preset at 240 Hz: 5 ppf deviates by +11.1% (1200 pps) and 4 ppf deviates by -11.1% (960 pps). I checked this and it rounds to the higher integer.
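That behavior is consistent with a plain round-half-up on exact midpoints, which is what JavaScript's Math.round does; a quick check of the numbers (my own sketch, not TestUFO source):

```typescript
// Tie case: 1080 pps at 240 Hz lands exactly between 4 and 5 ppf.
const targetPps = 1080;
const refreshHz = 240;
const exactPpf = targetPps / refreshHz;       // 4.5 ppf -- an exact midpoint
const roundedPpf = Math.round(exactPpf);      // Math.round(4.5) === 5 (rounds half up)
console.log(roundedPpf * refreshHz);          // 1200 pps (+11.1%), not 960 pps (-11.1%)
```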

I like the idea of custom settings for advanced users a lot and hope some of it is implemented in future versions. VRR support would also be great. That might take a lot of work.
Compare UFOs | Do you use Blur Reduction? | Smooth Frog | Latency Split Test
Alienware AW2521H, Gigabyte M32Q, Asus VG279QM, Alienware AW2518HF, AOC C24G1, AOC G2790PX, Setup

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: What are the actual UFO panning speeds?

Post by Chief Blur Buster » 17 Sep 2021, 12:15

All good stuff -- very good implementation of pursuit testing.

I totally agree that TestUFO rounding errors are unacceptable on a 390 Hz monitor -- I may expedite an adjustment for this. I wish I could hire Ashun part-time to help me out here, as there are so many projects at Blur Busters competing for my attention, and he ticks the checkboxes of skills I need at Blur Busters.
Discorz wrote:
15 Sep 2021, 11:09
I like the idea of custom settings for advanced users a lot and hope some of it is implemented in future versions. VRR support would also be great. That might take a lot of work.
VRR is easy to do with executable files. The problem is that web browsers do not natively support VRR.

Now, Ashun's sync-track implementation is easier with some cameras, but it is not as camera-flexible and display-flexible as the Blur Busters sync track (with trained interpretation). Fewer stackings, with no whole-sequence alignment verification, carry more error margin than the lines pattern. The Blur Busters lines version is more camera-universal and more accurate (for experienced users) because it gives much more camera-debugging information:

The pursuit camera lines temporal pattern is superior to flicker-based temporal test patterns because:
- It is more display-universal (e.g. DLP projectors that spread temporal color dither over more Hz);
- Camera debugging is easier;
- Salvaging imperfect pursuit camera exposures is easier;
- Counting how many refresh cycles were stacked is easier;
- Measuring the total accumulated horizontal motion blur error margin is easier. The Blur Busters PCST can easily differentiate 2, 3, 4, 5, 6 or 7 stacked refresh cycles (all valid). I will probably make the number of tickmarks configurable in a future version, to optionally allow stacking more Hz for ultra-Hz use with temporally-dithered displays;
- At higher Hz, stacking more refresh cycles is recommended to match human vision integration (and to overcome camera noise during short exposures).

Ideally, I'd like to see Ashun add this sync track and credit the research paper (academic etiquette); it can even be a ResearchGate/Academia.edu link instead of a link directly to Blur Busters (if he has a thing against linking to other businesses).

Crossposting a YouTube comments reply:
BlurBusters wrote:The lines version is more camera-type universal & more accurate /for experienced users/, because it gives much more camera debugging information:
https://blurbusters.com/wp-content/uplo ... elines.png
The Ashun version is easier for many smartphones and end-user pursuiting, but advanced reviewers familiar with line artifacts prefer the lines because of better visual pursuit debugging capabilities, and easier motion blur error margin measurement:
https://blurbusters.com/wp-content/uplo ... tation.png
Also, I should remind everyone that everyone is allowed to freely use the Blur Busters version of the 'lines' pursuit pattern, perhaps as a series of multiple selectable sync tracks. If Ashun wants, I'll eventually add support for the Ashun method as a pulldown list of multiple kinds of sync tracks, if people would like that. That said, it is not as visual-debugging-accurate as the lines version.
P.S. Yes, I'm aware of the speed inflexibility of TestUFO -- viewtopic.php?f=19&t=9075&p=71784
(This wasn't a problem in year 2012, so it sorely needs to be upgraded for sure. Reviewers have also asked me to upgrade this, and a change is now in the pipeline.)

My fear is that a lot of inaccurate pursuits (with good Ashun squares) may be redistributed throughout the Internet, to demonstrate how terrible a display is -- because of some quirky behavior like stacked-refresh-cycle temporal dithering. So information about the pluses / limitations of various sync tracks should be publicly disclosed, since reviewers could post less-accurate pursuit photography because of sync track limitations.

Also, multiple sync tracks on the same screen are recommended because the camera can rotate and tilt during pursuit (smartphone hand-wave pursuiting without a rail) -- which is why I like putting separate sync tracks at top and bottom: hand-wave pursuits can create inaccuracies near the top but not the bottom, or vice versa (due to things like accidental camera rotation and tilting, which create accurate pursuit speeds for only one edge of the photograph).

Also, since the camera shutter is not synchronized with refresh cycles, it's fairly important to fight this error margin by:
(A) Stacking MORE refresh cycles (e.g. 4 instead of 2), for improved WYSIWYG where pursuit accuracy is feasible (small motion blur error margin over 4 exposures). Many LCDs look the same at just 2 exposures, especially 60 Hz LCDs, but as refresh rates go higher, stacking at least 4 is recommended (sometimes even more).
(B) Trying to make the camera exposure an even-numbered integer multiple of refresh cycles (see the exposure arithmetic sketch below). That way it will catch equal parts of the trailing edge of the first refresh cycle and the leading edge of the final refresh cycle, including whatever temporals the display does (scanout, dither, etc). Most dithering algorithms and inversion algorithms use an even-number multiple, but once you stack about 3-4 refresh cycles, it largely stops being an error margin (for LCD). For DLP you may need more refresh cycle stackings, depending on how much the DLP spreads temporal dither over multiple refresh cycles. One could just use a fixed human-integration exposure (e.g. 1/30 sec) regardless of Hz, and use a higher number of tickmarks for higher Hz and a lower number for lower Hz. I standardized on 4 because of all the compromises fighting against each other.
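As a tiny illustration of the exposure arithmetic in (B) (just the arithmetic, as my own sketch): the exposure time for stacking N refresh cycles is simply N divided by the refresh rate.

```typescript
// Exposure time for stacking a given number of refresh cycles.
function pursuitExposureSeconds(refreshCycles: number, refreshHz: number): number {
  return refreshCycles / refreshHz;
}

console.log(pursuitExposureSeconds(4, 240)); // 1/60 s  -- the 240 Hz example mentioned below
console.log(pursuitExposureSeconds(2, 60));  // 1/30 s  -- roughly human integration time
```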

This is why I say the lines version is more display-tech-universal, including technologies that spread temporal dithering over many refresh cycles (such as ultra-high-Hz DLP). For example, the 240 Hz Optoma looks more accurate in a 4-refresh-cycle-stacking pursuit camera photograph than in a 2-refresh-cycle-stacking one, and DLPs pretty much need an integer number of refresh cycles exposed (e.g. a 1/60 sec camera exposure for 240 Hz) because of their temporal color behaviors: ideally you want to capture the same number of red/green/blue flashes in a camera exposure, as well as whatever temporal color depth is being spread over multiple repeat instances of the color flash, since high-speed color wheels only do a few bits per color revolution, and fewer color bits per Hz at higher single-chip DLP refresh rates. Even at 480 Hz, the Christie 3-chip starts to show some temporal issues with pursuits that stack only 2 refresh cycles.

A pursuit camera becomes the most WYSIWYG when it matches an average slice of human integration time (e.g. 1/30 sec). Although I did choose a somewhat arbitrary number of 4 refresh cycles, you can see in the above images that the Blur Busters sync track works very accurately from a 2-refresh exposure through about a 6-7-refresh exposure. LCD is relatively tolerant of fractional refresh cycle exposures, as long as at least 2 refresh cycles are exposed, since LCD temporals are extremely minor (flicker from the LCD inversion algorithm is a shallow 1%-league flicker), while certain DLP chips are much less tolerant.

Almost a decade ago, when Blur Busters was just a hobby of mine, I thought deeply about multiple methods of pursuiting, and I actually came across Ashun's method (flickering squares) in one of my own prototype sync tracks -- not exactly this type. There were huge pros, but at the time I felt the cons outweighed the pros in terms of camera-debugging capability and display-universality. Long term, I think there should be multiple selectable sync tracks in TestUFO, Frog Pursuit, and all other implementations -- with pros/cons clearly illustrated. Perhaps a team-up article about it, who knows? There's a lot of knowledge gained over the years about manual pursuit cameras for display motion blur capture -- that may warrant a new paper. Maybe Ashun wants to team up and join as a co-author?
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

