Many Past Good Researchers Incorrectly Test Human Hz Detection Ability -- NEW Best Practices HERE:


Post by Chief Blur Buster » 14 Feb 2022, 21:54

Best Practices: Hz versus Hz Testing and Research
For new readers who've been arriving at Blur Busters after reading recent peer-reviewed scientific papers... (I'm cited in more than 20 now!)

High Hz No Longer Just for Gamers Only
Smooth scrolling benefits from higher Hz, such as scrolling web pages! Obviously, experienced people (experienced computer optimizers who eliminate mouse microstutter with 2000Hz+ mice, etc., and experienced esports athletes) will more easily tell apart smaller Hz differences. But even for average iPad users who only barely notice 60Hz-vs-120Hz iPad scrolling, 60Hz-vs-240Hz web scrolling (1/4th the motion blur) is much more visible.

High Hz Becomes Cheaper/Free Over Long Term
Price differentials of entry-level high Hz will disappear (e.g. 60Hz vs 120Hz will become as minor as 720p vs 1080p vs 4K). Color screens were once rare in phones; now phones are all retina resolution, and 90Hz-120Hz is increasingly common in new models. Even the latest PlayStation and Xbox consoles now support 120Hz, and a larger number of 2022-model televisions include native 120Hz for free. So high Hz becomes more of a free feature. Over the coming years, decades and century, the dominoes will fall (240Hz mainstream, 480Hz mainstream, etc), as there is visible humankind benefit.

GPUs Will Catch Up With New Frame Rate Amplification Technologies
GPUs are the obvious weak link, but inexpensive 1000fps+ will eventually be made possible by future Frame Rate Amplification Technologies -- NVIDIA is already working on it.

Retina Refresh Rate Depends on Display Resolution/Size/FOV, but Can Be >10,000Hz
The weak link for the "retina refresh rate" (where human-visible differences between refresh rate increases disappear) will be the highest number of all the above, aka 10,000Hz+. Refresh rate limitations below 10,000fps 10,000Hz are only human-visible in extreme cases, like a Holodeck -- essentially a retina-resolution VR headset, e.g. a wide-FOV 8K display -- because of the Vicious Cycle Effect (where higher resolutions and wider FOV amplify refresh rate limitations), a chapter in the article Blur Busters Law: Amazing Journey To Future 1000Hz Displays.

Geometric Upgrades in Hz Are Mandatory for Human-Visible Differences
Sensitivity to refresh rate differences diminishes rapidly; geometric upgrades of ~2x to 4x are needed for human-visible differences (e.g. 360Hz-vs-1000Hz) for average non-gamer humans. Much like how resolutions needed to be geometrically upgraded to be really visible to the "I can't see VHS vs DVD" or "I can't see DVD vs HDTV" Average Joe User crowds. So you need GtG=0ms (or a tiny fraction of a refresh cycle) AND a ~2x-to-4x refresh rate increase to be really blatantly human-visible (assuming no source material limitations, as explained in the Ultra HFR article).

Once you reach the stroboscopic and motion blur weak links, larger Hz differences are required to see a difference, and those differences are easier to see at higher resolutions (4K 240Hz is much more visible than 1080p 240Hz at the same size/FOV). While you might not see 144Hz-vs-165Hz well, you'll see 240Hz-vs-1000Hz much more easily on a relative basis (assuming framerate=Hz) -- a far bigger refresh rate difference ratio.
  • ~2.0x Upgrades: 60 ➜ 120 ➜ 240 ➜ 480 ➜ 1000 Hz
  • ~2.5x Upgrades: 60 ➜ 144 ➜ 360 ➜ 1000 Hz
  • ~4.0x Upgrades: 60 ➜ 240 ➜ 1000 Hz
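These geometric steps map directly onto blur arithmetic. As a rough illustration -- a sketch using the sample-and-hold rule of thumb quoted later in this thread (1ms of persistence is about 1 pixel of motion blur per 1000 pixels/sec), with the speed value chosen purely as an example:

```python
# Sketch: relative eye-tracking motion blur on an ideal sample-and-hold display
# (GtG = 0ms, framerate = Hz), using the rule of thumb that 1ms of persistence
# is ~1 pixel of motion blur per 1000 pixels/sec of motion speed.

def blur_px(hz: float, speed_px_per_sec: float) -> float:
    """Approximate motion blur trail length, in pixels."""
    frametime_ms = 1000.0 / hz
    return frametime_ms * speed_px_per_sec / 1000.0

speed = 3840  # px/s: an example fast-but-trackable panning speed
for low, high in [(60, 120), (144, 165), (240, 1000)]:
    ratio = blur_px(low, speed) / blur_px(high, speed)
    print(f"{low}Hz vs {high}Hz: {blur_px(low, speed):.1f}px vs "
          f"{blur_px(high, speed):.1f}px trail ({ratio:.2f}x difference)")
```

The 144Hz-vs-165Hz pair yields only a ~1.15x blur difference, while 240Hz-vs-1000Hz yields ~4.2x -- which is why small Hz bumps are so hard to see.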
Variables for Maximizing Human Visible Hz Differences

Many weak links diminish the differences between refresh rates, including framerates not matching the refresh rate, as well as microstutter. To maximize visible differences between refresh rates, these are required:
  • Framerate = Hz (very critical)
    (Frame rates below Hz will add more persistence blur and/or stroboscopic effects and/or stutter; frame rates above Hz will add jitter or tearing due to motion-aliasing between frame rate and Hz. Even high-frequency judder can blend into extra motion blur -- akin to a fast-vibrating string turning into blur. 3:2 pulldown judder at 1000fps will generally have ~50% more motion blur than 1:1 pulldown. So always keep framerate=Hz perfectly, to eliminate this extra-blur error margin from the stutter-to-blur continuum.)
  • GtG = 0ms
    (Eliminate GtG blur from affecting Hz differences. As long as GtG is less than half a refresh cycle, GtG will usually have a negligible visible effect. This is not the case for older 33ms 60Hz LCDs or for 5ms 360Hz LCDs, though -- the higher the Hz, the harder it is for GtG to be fast enough to avoid diminishing Hz differences. This is often mis-blamed on human inability to see high Hz, a scientific misinterpretation. We can't count how often Blur Busters has had to re-educate longtime researchers to avoid incorrect untested assumptions that have since been tested elsewhere (remember: yesterday's CRTs couldn't do 8K color, so certain weak links couldn't be tested). So don't make the mistake of publishing a flawed paper that gets called out for not acknowledging error margins -- refusal to acknowledge error margins is a big researcher no-no.)
  • MPRT = frametime
    (This assumes MPRT 0%->100%, rather than the standard MPRT 10%->90%, as explained in the Pixel Response FAQ: GtG vs MPRT)
  • Source-based-blur = none
    (Camera blur obscures Hz differences. If you're using camera material, please follow the important Ultra HFR guidelines)
  • Compression-artifacts = none
    (compression artifacts obscure Hz differences)
  • FOV = as wide as comfortable
    (not a keyhole FOV like holding a tiny smartphone at full arm-length extension)
  • Stutter = none
    (including game stutter, and mouse microstutter from older gaming mice running at only 1000Hz or less. A new scientific paper shows that jitter between frame rate, mouse poll rate, and display Hz can be an issue)
  • Motion-speed = significantly faster in pixels per second than the Hz number
    (Framerate=Hz motion at 240 pixels/sec makes 60Hz-vs-240Hz human-visible but 240Hz-vs-1000Hz essentially invisible, for both motion blur and stroboscopic effects. This is because 240 pixels/sec is less than 1 pixel/frame at 1000fps 1000Hz, which is frequently below human resolving capability. Also, the pixels per frame need to exceed human angular resolving resolution. Obviously, this is much easier with a 4K VR headset than with a 4K phone, due to the much wider FOV -- the pixels are much more spread apart and easier to resolve. This is the Vicious Cycle Effect in action: you need to test higher resolutions to push retina Hz upwards. This is why it is a huge researcher assumption-mistake to claim "Humans Can't See Over 200fps" when only testing yesterday's low-resolution VGA CRTs or blurry-slow LCDs)
  • Eye Behavior = must always be acknowledged in your paper
    (Stationary eyes and moving eyes behave very differently on displays. Just see the www.testufo.com/eyetracking and www.testufo.com/persistence animations, where displays look different for stationary eyes versus moving eyes. Hz differences for certain material may only become visible with one eye behavior, and your test subjects might behave differently: one person may eye-track and another may not. Designing your test to force eye-tracking or force stationary gaze can improve consistency of test results. Alternatively, this is a definite proven error margin that must be acknowledged in your research paper.)
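As a rough self-check, the variables above can be encoded as a pre-flight audit. This is only an illustrative sketch: the field names and thresholds (half-a-refresh-cycle GtG, a 6x mouse-oversampling factor mirroring guidance later in this thread, motion speed 2x the Hz) are assumptions drawn from this post, not a formal standard:

```python
# Sketch: a hypothetical pre-flight audit for a Hz-vs-Hz test setup, encoding
# the variables above. Field names and thresholds are illustrative assumptions.

REQUIREMENTS = {
    "framerate_equals_hz": lambda s: s["fps"] == s["hz"],
    "gtg_under_half_refresh": lambda s: s["gtg_ms"] <= (1000 / s["hz"]) / 2,
    "no_source_blur": lambda s: not s["camera_blur"],
    "no_compression_artifacts": lambda s: not s["compressed"],
    "mouse_oversamples_display": lambda s: s["mouse_hz"] >= 6 * s["hz"],
    "motion_faster_than_hz": lambda s: s["speed_px_per_sec"] > 2 * s["hz"],
}

def audit(setup: dict) -> list:
    """Return names of failed requirements -- error margins to acknowledge."""
    return [name for name, check in REQUIREMENTS.items() if not check(setup)]

# A typical "random gaming PC" setup: 240Hz LCD, 5ms GtG, 1000Hz mouse.
setup = {"fps": 240, "hz": 240, "gtg_ms": 5.0, "camera_blur": False,
         "compressed": False, "mouse_hz": 1000, "speed_px_per_sec": 960}
print(audit(setup))  # flags slow GtG and the 1000Hz mouse as error margins
```

Anything the audit flags belongs in the error-margins section of the paper rather than being silently assumed away.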
Excellent Example of Researcher Error Margin Surprise:
Hz-vs-Hz Can Look Different For Stationary Eyes Versus Eye Tracking On Displays


If you don't believe displays look different for stationary eyes versus moving eyes, just look at the animation below on a sample-and-hold LCD/OLED display. Look at the first UFO, then look at the second UFO. There. I've just proven a frequently unacknowledged error margin in Hz-vs-Hz human visibility tests.

1. Look only at the stationary UFO for 10 seconds.
2. Look only at the moving UFO for 10 seconds.
3. Observe how the background behaves differently at different speeds and different frame rates!



Some games force you to eye-track (e.g. Rocket League with the flying soccer ball). Other games force a stationary gaze (e.g. staring at CS:GO crosshairs). Remember, different games create different eye-movement behavior. You need to acknowledge this, as Hz differences are more visible in some games than others, or in certain activities than others (e.g. web page scrolling: some people habitually eye-track while scrolling, while others stationary-gaze, often out of habit to avoid being bothered by scrolling motion blur on LCDs). So remember: this is an error margin that can affect whether Hz-vs-Hz becomes visible for a certain person in a certain screen activity.

We have relevant Blur Busters textbook reading on this topic at The Stroboscopic Effect of Finite Frame Rate Displays, part of the Research Area on the main Blur Busters website.

Acknowledging That Displays & Other Related Equipment Are Imperfect

You will not always be able to maximize Hz differences, because no perfect GtG=0ms LCDs exist, for example. But you must, therefore, properly acknowledge your error margins in your Hz-vs-Hz research paper, along with your known, confirmed (non-assumed) efforts to control them.

Remember... do not assume! Even a 1000Hz gaming mouse can have microstutter, even on 144Hz gaming displays. You want an ultrahigh-Hz mouse like the Razer Viper 8KHz to avoid motion-aliasing between mouse Hz, framerate, and display Hz. If you're including mouse usage in your Hz detection threshold research, this is an important error margin -- mouse jitter diminishes and distracts from the human ability to see refresh rate differences.


And major mouse jitter, where the gap size is randomized rather than regular, will make it harder to see 60Hz-vs-120Hz-vs-240Hz. So, REMEMBER, the mouse is ALSO an error margin in Hz-vs-Hz tests (e.g. mouse turns in FPS games).

Suffice to say, this is a researcher wake-up call on how many researchers have made massively incorrect assumptions over the years about Hz-vs-Hz weak links. Not just the mouse, but dozens of weak links that diminish Hz-vs-Hz differences, and that are disappearing as better and better displays become available. Blur Busters is far ahead in its ability to point out Hz weak links. So if you want us to vet your research plan/outline, we'd be happy to do so (and we sometimes do). Feel free to contact me; happy to help make your research more peer-criticism-proof at no charge.

Human Tests in Comparing Hz Differences Must Acknowledge Limitations Diminishing Hz Differences

Many scientists, media article writers and YouTubers test for Hz detection ability. Even standards organizations! Even people at Dolby, VESA, and other organizations consistently underestimate humans' ability to detect Hz differences, especially by not maximizing the necessary variables above to consistently and properly test Hz differences.

When scientifically setting up blind-test experiments, please acknowledge the limitations of your test equipment that deviate from these ideal variables. Blindly loading up a random game with a random 1000Hz gaming mouse is a scientifically uninformed way to test Hz differences. If you want to run a Hz-differences trial at a university, in the media, on YouTube, or at another scientific institution, please allow Blur Busters to vet the flaws and weak links of your test apparatus -- some simple changes (often already done by the most professional esports athletes who know Blur Busters material properly) can make major differences. We'd be happy to look things over, usually for free.


Re: Many Reputable Researchers Incorrectly Test Human Hz Detection Ability

Post by Chief Blur Buster » 14 Feb 2022, 21:55

Related Crossposts:
Chief Blur Buster wrote:
02 Jan 2022, 15:51
Many people misunderstand the different sensitivity thresholds, citing claims such as "Humans can't see above 75Hz" -- but that is only a flicker threshold. The purpose of this post is to show that there are extremely different orders of magnitude of thresholds that refresh rate upgrades do address.

Even in a non-gaming context, one thing many people forget is that there are many thresholds of detectable frequencies.

These are approximate thresholds (they vary by human), rounded off to the nearest order of magnitude for reader simplicity, to show how display imperfections scale.

Threshold where slideshows become motion: 10
This is a really low threshold, around 10 frames per second. Several research papers indicate 7 to 13 frames per second. This doesn't mean stutter disappears (yet); it just means playback now feels like motion rather than a slideshow.
Example order of magnitude: 10

Threshold where things stop flickering: 100
A common threshold is 85 Hz (for CRTs), also known as the “flicker fusion threshold”. Variables such as duty cycle (pulse width) and whether there’s fade (e.g. phosphor fade) can shift this threshold. This also happens to be the rough threshold where stutter completely disappears on a perfect sample-and-hold display.
Example order of magnitude: 100

Threshold where things stop motion blurring: 1000
Flicker-free (sample-and-hold) displays always have a guaranteed minimum display motion blur, even with instant 0ms GtG, due to eye-tracking blur (animation demo). The higher the resolution and the larger the FOV, the easier it is to see display motion blur as a difference in sharpness between static imagery and moving imagery: blurry motion despite blur-free frames (e.g. rendered frames or fast-shutter frames).
Example order of magnitude: 1000

Threshold for detectable stroboscopic effects: 10,000
This is where the mouse pointer becomes continuous motion instead of gapped. Higher display Hz helps (reduces the distance between gaps) and higher mouse Hz helps (reduces variance in the gaps). Mouse Hz needs to massively oversample the display Hz to avoid mouse jitter (aliasing effects). If you move a mouse pointer at 4000 pixels per second, you need 4000Hz to turn the mouse pointer into a smooth blur (without adding an unwanted GPU blur effect).
Example order of magnitude: 10,000
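The gap arithmetic in this threshold is simple division. A sketch, with the speed chosen to mirror the 4000 pixels/sec example above:

```python
# Sketch: stroboscopic gap spacing of a moving mouse pointer (framerate = Hz,
# instant pixel response assumed). The gap between successive pointer images
# is simply motion speed divided by refresh rate.

def gap_px(hz: float, speed_px_per_sec: float) -> float:
    """Pixels between consecutive pointer images (the visible 'gapping')."""
    return speed_px_per_sec / hz

speed = 4000  # px/s pointer flick, matching the example above
for hz in (120, 240, 1000, 4000):
    print(f"{hz}Hz: {gap_px(hz, speed):.1f}px between pointer images")
```

Gaps only shrink to ~1 pixel when Hz reaches the same order as the pixels-per-second speed, consistent with the pointer fusing into continuous motion only near 4000Hz.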

An example: a chart from a lighting industry paper illustrates these detectability thresholds (the same has also been shown to be true for stroboscopic effects on large displays, including VR displays intended to mimic the real world).

More information can be found in the Research Section of Blur Busters.
Chief Blur Buster wrote:
17 May 2022, 19:22
Remember, we've been cited in more than 25 research papers, so don't skip this. We're already acknowledged textbook reading on Hz.

Introduction for newcomer researchers: Blur Busters is now famously an incubator of new display research ideas for aspiring display researchers. So listen up if you need A-grades on your thesis, accolades from your co-researcher peers, a promotion from your employer, or such. We are well known for spotting missed error margins in display science, and the flaws of past research papers that overlooked mentioning error margins. Worried about messing up peer review? Add us -- no charge for using us as part of your peer review team, even if it's merely to help improve your error-margin acknowledgements section. We only ask for a citation. We do peer review nowadays, as unaffiliated independent researchers with excellent eyes.

Credentials: remember, over 100 million people worldwide subscribe to or view more than 500 content creators that use one of our display testing inventions, including the 14M subscribers of the LinusTechTips YouTube channel and the 9M unique monthly viewers of RTINGS.com (and yes, they acknowledge us) -- to mention only 2 of the 500+. Even if you've never heard of Blur Busters in your non-esports community, we are a significant nuts-and-bolts influence in the refresh rate race. Many researchers now get asked "has this been vetted by Blur Busters?" Make your paper more bulletproof, and thus more prestigious. Contact Us if you want us to be a free peer reviewer of your Hz/motion-related display research paper.


Properly Designing a Refresh Rate Blind Test For Average Population

Goal: Trying To Determine The Vanishing Point of Diminishing Curve of Returns for Refresh Rates

Short Summary of Hz-Amplifying Test Variables

Short Summary of Mandatory Test Variables that amplify the maximum retina refresh rate detectable by the average human:

1. Perfect framerate=Hz (avoid jitter error margin)
2. VSYNC ON, not VSYNC OFF (avoid jitter error margin)
3. No control device jitter (e.g. don't use a mouse that's only 1000Hz. Read: Why?)
4. Test large 4x-8x geometric differences in refresh rates (e.g. 120Hz vs 480Hz, or 240Hz vs 1000Hz).
5. Ensure pixel response is not the limiting factor, or acknowledge it in your Error Margins section
6. Fast motion speeds that are still eye-trackable
7. Tests that force eye tracking (if testing via the motion blur weak link)

(A) ALL OF THE ABOVE MUST BE TRUE for a sample-and-hold display (no flicker technology like CRT, plasma, BFI, strobe)
(B) Refresh rates benefit more than just games. 240Hz browser scrolling on sample-and-hold has 1/4th the motion blur of 60Hz.
(C) Also, re-read Blur Busters Display Research Portal, before you begin to design the test.
(D) The retina refresh rate will only be true for that specific display. Acknowledge that in your error margin section.

An 8K or 16K non-strobed VR headset will have a higher retina refresh rate than a non-strobed 1080p 24" monitor. Some of the maximally-possible test equipment will be unobtainium (despite limits having been proven by non-display means), so you will have to acknowledge technological limitations, and make sure to design your paper so it cannot be misquoted by mainstream media.

All past Hz tests fail to simultaneously make 1/2/3/4/5/6/7 true, severely lowering the max Hz that the human eye could see.

As the advocacy leaders of the refresh rate race, remember to cite Blur Busters, TestUFO, or "Mark Rejhon" one way or another in your research paper's references. As of 2022, more than 25 research papers cite me, TestUFO, or Blur Busters. Many conveniently forget to cite us despite having obviously been inspired by us, despite having picked our brains. I can help you follow your institution's citation guidelines, with some generic permanent links such as www.testufo.com/mousearrow or www.testufo.com/map, which is a perfect test case for retina refresh rates too.

Long Summary of Test Variables & Considerations

Crossposted a reply of mine from a different forum:
ackmondual wrote:Besides video games, what other use cases are there for having a 480Hz display (for the sake of argument, let's assume that people can tell the difference and actually make use of the various refresh rates like 120, 240, 360, and 480)? I'm guessing for those that capture high frame rate video and want to show that off as movies, documentaries, what have you?
First, in the endeavour to determine the vanishing point of the diminishing curve of returns — you have to compare geometrically, e.g. 60Hz -> 144Hz -> 360Hz -> 1000Hz, or even larger 4x differences such as 60Hz -> 240Hz -> 1000Hz.

The thresholds where displays diverge from real life is in this heavily-upvoted earlier comment — it may be best to read that before reading the rest of this comment.

It’s important not to compare small Hz differences (e.g. only 240Hz vs 360Hz) — this has historically led to false assumptions of an early vanishing point in the diminishing curve of returns.

In the diminishing curve of returns, everyday users, especially non-gamers, need to compare much larger Hz differentials (e.g. 120Hz vs 360Hz, or even 120Hz vs 480Hz) for a wider percentage of the population to easily see the difference in certain use cases. Motion blur differences (e.g. 240Hz vs 360Hz is only a 1.5x blur trail size difference) are harder to see than flicker differences (which cease at ~70 Hz).

By using a 4x difference, the blur trail size is easier to tell apart (e.g. like the sports motion blur difference between a 1/120sec and a 1/480sec camera shutter photograph).

Finally, I’ll address use cases.

Some use cases are ‘nice to have’ (e.g. browser scrolling clarity), while other use cases are more important long-term (e.g. making VR perfectly match real life with five-sigma comfort, without the use of current flicker/impulsing methods of motion blur reduction).

Many people don’t eye-track while scrolling or panning like they used to in the CRT days, so user habit may cause them not to notice: scroll, pause, read, scroll, pause, read, etc. However, when someone is forced to eye-track (e.g. a continuously panning map readability test), the blur limitations of current contemporary two-digit and three-digit refresh rates become apparent on sample-and-hold displays.

A test case designed to force a person to eye-track fast-moving content, while comparing large Hz differences (~4x or more) moving at a pixels-per-second speed at least twice the Hz number, is a good blind-test demonstration that amplifies visibility to ~95% of the population in Hz diminishing-returns tests.

Everybody sees differently; some people are pickier than others. But a random FPS game is not always the best human-population test, even if games are an oft-quoted benefit of high Hz (which they are -- but for testing population visibility, there are far better blind tests that reveal Hz differences much more massively). A properly designed test can reveal the differences much more clearly.

I am cited in, referred to in, or a coauthor of more than 25 peer-reviewed research papers, and many researchers now recognize me as a good test-variables specifier.

The wow factor varies a lot. Some go “wow” for 60Hz-vs-120Hz. Others go “meh” for 60Hz-vs-120Hz (2x) yet go “wow” for 60Hz-vs-360Hz (6x differential). So, if one needed to design a blind test that amplifies human visibility of Hz differences, even in non-game use cases, the Blur Busters recommended testing variables are:


1. Perfectly framepaced motion at framerate=Hz
Where frame rates are perfectly sync’d to refresh rates. This is what VR games do, because any form of jitter is much more noticeable in reality-simulation use cases: simulator screens, VR screens, etc. So you need to remove jitter from framerate-vs-Hz being out of sync. Even high-frequency jitters/stutters can blend into extra motion blur, much like a fast-vibrating guitar string.


2. Use VSYNC ON or one of the new low-lag clones of VSYNC ON
Currently, VSYNC OFF adds microjitters that diminish Hz differences; the goal is a test that tells Hz apart better. Use VSYNC ON + NVIDIA Ultra Low Latency Mode, or the new “RTSS Scanline Sync”. Commonly, VSYNC OFF is used in esports for lower lag. However, VR never uses VSYNC OFF, because VSYNC OFF adds jitter and tearing that distract from the ability to tell Hz apart. Operating system compositors for scrolling/panning are usually VSYNC ON, so this condition is met.

3. No control device jitter weak links.
Example: if testing high-Hz displays via mouse panning, use a good mouse pad combined with a poll rate above 1000Hz. A peer-reviewed research paper confirmed that mouse poll rates need to be significantly higher than display Hz to avoid jittering/aliasing effects. Even high-frequency jitter blends into motion blur, interfering with the ability to tell apart motion-blur-related differences in Hz. When testing things like browser scrolling, holding the “DOWN” keyboard arrow will framepace the scrolling much better than dragging a scrollbar; a smooth flick-scroll (like on a 120Hz iPad) also works.

4. Blind test a large geometric difference in refresh rates.
Example: skip comparing small differences in Hz, such as 144-vs-165 or 240-vs-280. Testing average-population Hz noticeability for determining the vanishing point of the diminishing curve of returns requires large geometric jumps up the curve. Test 60Hz vs 240Hz. Or test 120Hz versus 480Hz, or 240Hz versus 1000Hz. These differentials are noticeable to the majority of the population, assuming blind-test variables #1-7 are all met.

5. Pixel response that is sufficiently fast
Many LCDs have GtG slow enough to obscure Hz differences. 60Hz vs 360Hz should be 6x less motion blur, but due to slow GtG, it can appear as only a 3x-4x difference to most average users. As pixel response approaches 0, motion blur becomes simpler and linear, following “1ms of frametime (refreshtime) translates to 1 pixel of motion blur per 1000 pixels/sec of motion”. (0ms GtG still has lots of motion blur, since 0ms GtG is not 0ms MPRT — they are two separate pixel response benchmarks.)
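The compression of blur ratios by slow GtG can be illustrated numerically. This is a deliberately simplified additive model -- an assumption for illustration only, not an official Blur Busters formula, since real perceived blur depends on the shape of the GtG curve:

```python
# Sketch: how slow GtG compresses Hz-vs-Hz blur ratios. Simplified additive
# model (illustrative assumption): total blur ~ persistence blur plus an
# extra smear on the order of the GtG time.

def blur_px(hz: float, speed_px_per_sec: float, gtg_ms: float = 0.0) -> float:
    persistence_ms = 1000.0 / hz  # MPRT of ideal sample-and-hold
    return (persistence_ms + gtg_ms) * speed_px_per_sec / 1000.0

speed = 1920  # px/s example motion
ideal = blur_px(60, speed) / blur_px(360, speed)                  # GtG = 0
lcd = blur_px(60, speed, gtg_ms=2.0) / blur_px(360, speed, gtg_ms=2.0)
print(f"60Hz-vs-360Hz: {ideal:.1f}x blur ratio ideal, {lcd:.1f}x with 2ms GtG")
```

Under this toy model, a nominal 6x difference degrades to roughly 3.9x with just 2ms of GtG, in the same ballpark as the 3x-4x figure above.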

6. Motion speed fast enough, but not too fast to eyetrack
A higher-resolution display can make Hz difference tests easier. For example, 120Hz-vs-480Hz is much easier for the average population to see on a 4K display than on a 1080p display. However, we have few technological choices: the retina refresh rate of a lower-resolution display is lower than that of a higher-resolution display, and many past Hz tests did not account for this factor. As a rule of thumb, use a motion speed at least 2x higher (in pixels per second) than the highest refresh rate being tested — e.g. 960 pixels/sec motion compared on displays up to 480Hz. Faster motion speeds like 1920 pixels/sec on 1080p can be hard for some people to track, unless the resolution is also doubled — then there is more time to eye-track (and see imperfections from refresh rate) before the motion disappears off the edge of the screen.
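The eye-tracking-time trade-off is straightforward arithmetic. A sketch, where the screen widths and motion speed are example values:

```python
# Sketch: trade-off between motion speed and available eye-tracking time.
# A horizontally moving object stays on screen for width / speed seconds.

def tracking_time_sec(screen_width_px: int, speed_px_per_sec: float) -> float:
    """Seconds a horizontally moving object stays on screen."""
    return screen_width_px / speed_px_per_sec

for width, label in [(1920, "1080p"), (3840, "4K")]:
    seconds = tracking_time_sec(width, 1920)
    print(f"{label}: 1920 px/s allows {seconds:.1f}s of eye tracking")
```

Doubling horizontal resolution at the same FOV doubles tracking time at the same pixels/sec -- one reason higher-resolution displays make Hz differences easier to test.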

7. A test that forces the person to eye-track
A great example is an infinite-scrolling test or an infinite-panning test. The motion never stops, so the user is forced to try to read/identify objects in the moving scenery/text/etc. This completely isolates the Hz-differential test to the persistence-based motion blur threshold, and essentially guarantees most of the population can tell apart 120Hz-vs-480Hz. Many gamers only stare at a fixed gaze such as crosshairs, and sometimes can’t tell the difference as easily. Past Hz tests did not always factor in differences in user eye-tracking habits — displays look different for stationary eyes versus moving eyes. By isolating a refresh rate weak link into the human vision subsystem, people are denied the chance of rote habit (e.g. scroll, pause, read) and forced to read while scrolling/panning.

Operating system compositors are VSYNC ON, so for non-game apps — by understanding #1-7, we can identify test cases that meet most or all of the above, assuming the GPU framerates can keep up:

- Holding the down arrow while trying to read the text in a very tall webpage.
- Trying to read street name labels of a continuously panning map (e.g. http://www.testufo.com/map), or dragging Google Maps with a high-pollrate mouse.
- Reading the contents of a continuously-dragging window.
- Rapidly looking for camouflaged details while continuously panning large gigapixel-size images (i.e. space imagery), especially with a high-pollrate mouse.

For game apps, Hz differentials amplify significantly in crosshairless games (games that force you to eye-track all over the place), especially with a low-latency VSYNC ON equivalent in certain games, if your GPU frame rate can keep up and you have also upgraded your mouse poll rate to at least 6x higher than the refresh rate. You can also:

- Read nametags above a player in an RTS while mid-pan (like the DOTA 2 animation simulation)
- Try to identify camouflaged enemies from the window of a fast-flying low-altitude helicopter (e.g. Battlefield 3)
- Try to identify faraway enemies in an open-arena game (e.g. Quake 3 Arena) that forces you to keep turning/moving/strafing/etc to avoid getting killed. More eye tracking happens in arena games than, say, CS:GO where esports players can stare stationary at crosshairs, using peripheral vision for the rest of the screen (stationary eyes, even while running, strafing, and turning about).
- Head-turning in a VR headset while trying to read scenery signage. (All modern VR headsets use a low-latency variant of VSYNC ON, because jitter/tearing is a difference from real life and adds nausea.)
- Any FPS game that reliably does framerate=Hz VSYNC ON (not a common esports use case, the above are much more common use cases), and you eye-track objects in turns rather than stare stationary at the crosshairs.

This is only a limited list.

Some are utterly unimportant (especially to those who don’t have motion blur nausea), while others are a matter of simulation criticality (VR).

Today, the only way to reduce the motion blur of current common frame rates is to use flicker methods (CRT, plasma, black frame insertion, impulsing, strobing, etc). The persistence motion blur of a display is connected to how long a pixel is visible: a flash of 1ms (at a low Hz), or a full thousand 1ms frames per second (for 1000fps 1000Hz) if you want to avoid flashing/flicker methods.
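The persistence arithmetic here can be sketched as follows; the 12% duty cycle is an illustrative assumption for a ~1ms strobe pulse:

```python
# Sketch: MPRT persistence sets blur, whether achieved by strobing or by
# brute framerate. duty_cycle = 1.0 means sample-and-hold (pixel lit for the
# whole refresh); smaller values model a strobe backlight pulse.

def mprt_ms(hz: float, duty_cycle: float = 1.0) -> float:
    """Pixel visibility time per refresh cycle, in milliseconds."""
    return (1000.0 / hz) * duty_cycle

strobed_120 = mprt_ms(120, duty_cycle=0.12)  # ~1ms flash per 120Hz refresh
hold_1000 = mprt_ms(1000)                    # flicker-free 1000fps 1000Hz
print(f"strobed 120Hz: {strobed_120:.2f}ms, "
      f"sample-and-hold 1000Hz: {hold_1000:.2f}ms")
```

Both routes land near 1ms of persistence (and thus similar motion blur), but only the 1000fps 1000Hz route gets there without flicker.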

This is irrelevant for many readers' use cases, but that doesn’t dismiss the existence of blind tests in which most of the population can tell apart high triple-digit Hz, as long as the test conditions match #1-7 to amplify the differences in the diminishing curve of returns — and everyday cases that exercise this already exist.

Like how many don’t care about 3:2 pulldown while others very much do — 4x+ Hz differences are actually (on average, for most) much more noticeable than that (e.g. comparing 120fps 120Hz vs 480fps 480Hz), especially if the test case maximizes resolution to near-retina as well (e.g. a 4K120 versus 4K480 display, or a 4K240 display versus a 4K1000 display).

Since 4K 1000Hz displays don’t exist outside the lab yet (e.g. monochrome 1920Hz DLP tests), it’s hard for researchers to design a test that maximizes Hz visibility via maximizing resolution. So test cases currently need to be designed around 1080p.

However, 4K does have a higher retina refresh rate than 1080p at the same FOV, assuming the 4K angular resolution is still a human-visible improvement. It’s all related — the vanishing point of the diminishing curve of returns is the combination of the human's maximum angular resolving resolution, the widest FOV (for the longest eye-tracking time over that many pixels), the human's fastest eye-tracking speed, and a framerate=Hz high enough for sample-and-hold motion blur to disappear. Even for average humans, the "retina" refresh rate can reach quintuple digits in the most extreme tests (e.g. if you wanted a non-strobed 16K-resolution 180-degree-FOV VR headset). In VR, 8K spread over a full 180-degree FOV does not always reach retina resolution when the pixels are enlarged that big, so 16K plus ultra-wide FOV pushes the required retina refresh rate that much higher — a vicious cycle effect where resolution and refresh rate amplify each other's limitations if only one is raised.

For 24"-27" 1080p monitors, the vanishing point of the diminishing curve of Hz returns is still in the 4-digit range for the average population, given a surgically optimized test that is at least nominally representative of certain real-world situations. So we can still design a blind test, using currently commercially available refresh rates, that reliably tests Hz differentials for the average human population. Existing blind tests have almost never ensured that #1,2,3,4,5,6,7 are simultaneously true.

Also, it's worth noting that training a person to see 3:2 judder is easier than training a person to see 240Hz-vs-360Hz. However, a 4x Hz differential such as 240Hz-vs-1000Hz (assuming the test variables simultaneously meet requirements #1,2,3,4,5,6,7) certainly becomes much easier to see than 3:2 pulldown judder.
Even a 360Hz monitor fails requirement #5, as pixel response for some color combinations is longer than 1/360sec. This can quite noticeably throttle the visible differences between refresh rates.

LCD GtG is slow enough that a prototype 240Hz OLED has less motion blur than a 360Hz LCD, and the 240Hz OLED more linearly follows Blur Busters Law (1ms of pixel visibility time translates to 1 pixel of motion blur per 1000 pixels/sec). Brute Hz can partially compensate, but a 1.5x refresh-rate difference is completely leapfrogged by the pixel-response difference between LCD and OLED. So in testing displays, GtG needs to be factored in, especially when using much larger Hz differentials (e.g. 120Hz vs 480Hz may only be a 3x difference in blur instead of a 4x difference, due to LCD pixel response). The ideal display for Hz diminishing-curve-of-returns testing should have as close to 0ms GtG as possible, commercial availability permitting, and GtG should be acknowledged in the error-margins section of any research.
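Blur Busters Law plus a GtG term can be sketched numerically. This is a simplified model that treats GtG as additive to frame persistence (real GtG curves overlap frames), and the 1ms GtG figure is a hypothetical example value, but it shows how a 4x Hz difference can shrink to roughly 3x in visible blur:

```python
def motion_blur_px(refresh_hz: float, gtg_ms: float, speed_px_per_s: float) -> float:
    """Approximate sample-and-hold motion blur in pixels.

    Blur Busters Law: 1ms of pixel visibility time translates to
    1 pixel of motion blur at 1000 pixels/sec of eye tracking.
    Simplification: GtG is modeled as additive to frame persistence.
    """
    persistence_ms = 1000.0 / refresh_hz + gtg_ms
    return persistence_ms * speed_px_per_s / 1000.0

# 120Hz vs 480Hz at 960 px/sec, with a hypothetical 1ms LCD GtG:
blur_120 = motion_blur_px(120, 1.0, 960)  # 8.96 px
blur_480 = motion_blur_px(480, 1.0, 960)  # 2.96 px
print(blur_120 / blur_480)  # ~3.03x, not the ideal 4x
```

With gtg_ms set to 0 the same function returns the clean 4x ratio, which is why near-0ms-GtG displays (e.g. OLED) are preferable for diminishing-curve-of-returns testing.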

There are many past tests (e.g. 240Hz-vs-360Hz with the jitter of VSYNC OFF, the jitter of a 1000Hz mouse, and slow pixel response) that fail to consider the important test-case variables when determining a vanishing point of the diminishing curve of returns for all the motion-related weak links a display can possibly have (versus real life).

Due to error margins, past recent-ish tests such as 240Hz-vs-360Hz FPS tests can actually become only a 1.1x effective difference instead of the proper 1.5x difference, because of all the error margins stacked on top of each other (game stutter, mouse jitter, VSYNC OFF jitter, and slow GtG, combined). By contrast, OS compositors can reliably run browser smooth-scrolling at 360fps, to the point where 240Hz-vs-360Hz is limited mainly by slow GtG pixel response rather than by other jitter causes.
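A toy model illustrates the stacking effect. Treat the combined jitter/GtG error sources as a fixed per-frame overhead added to each refresh rate's persistence (the 11ms overhead below is a hypothetical value chosen to reproduce the 1.1x figure, not a measurement):

```python
def effective_ratio(hz_a: float, hz_b: float, overhead_ms: float) -> float:
    """Toy model: ratio of effective motion persistence between two
    refresh rates when a fixed overhead (stacked game stutter, mouse
    jitter, VSYNC OFF jitter, GtG; hypothetical and additive) is
    piled onto each frame's persistence."""
    return (1000.0 / hz_a + overhead_ms) / (1000.0 / hz_b + overhead_ms)

print(round(effective_ratio(240, 360, 0.0), 2))   # 1.5 (clean test)
print(round(effective_ratio(240, 360, 11.0), 2))  # 1.1 (noisy test)
```

The larger the stacked overhead, the closer the ratio collapses toward 1.0, which is why uncontrolled test variables make two refresh rates look more similar than they really are.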

Esports athletes can see small Hz differences more easily, and average users would have a harder time doing so. But given a sufficient Hz differential (even between two refresh rates both beyond 240Hz), that doesn't mean more than 90% of average users can't tell apart 240Hz vs 1000Hz in a blind test designed around optimized variables #1,2,3,4,5,6,7.

P.S. If you haven't seen it yet, check the Display Research Portal for both the peer-reviewed content and the Coles Notes style explainers of the refresh rate race.

_________


Crossposted from an ArsTechnica comment, because this is relevant to 21st century researchers considering creating a new peer reviewed research paper.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
