Why Are 30fps Stutters More Visible Today Than 30 Years Ago?

Displays cause over 100 ergonomic issues, far more than just flicker and blue light. This forum covers the wide variety of display ergonomics issues.
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Why Are 30fps Stutters More Visible Today Than 30 Years Ago?

Post by Chief Blur Buster » 01 Apr 2020, 16:34

Crossposting here, because this is an obvious "Area 51: Display Science, Research & Engineering" Textbook study.
Aldagar wrote:
01 Apr 2020, 14:18
I wonder how consoles get around these problems. From what I've seen from Digital Foundry, many games fluctuate below 60 fps but don't suffer from either tearing or stuttering. I'm guessing they use some kind of triple buffering method.
They still stuttered with frame drops! I've seen every single framedrop stutter on them since the 1990s. Even with triple buffering. It helped, it just didn't eliminate them. Not everyone was as picky about less-annoying stutters.

Stutters less visible at lower resolutions
But stutter was much less visible at 320x200 resolution, given that our CRT tubes were small and many games ran at only 30fps.

On impulsed CRT tubes, single frame-drop stutters less visible at 30fps than 60fps
Also, 30fps-dropping-to-29.5fps is sometimes less visible than 60fps-dropping-to-59fps on a CRT tube. At 60fps, a framedrop is a sudden brief single-strobe-to-double-strobe transition (jarring), while at 30fps it is a sudden brief double-strobe-to-triple-strobe transition (less jarring) -- see the diagram of duplicate images on impulse displays.

30fps-vs-60fps is much more visible today on bigger, higher-resolution screens than on yesterday's 14-inch CRTs.

Normally, framedrops are less visible at higher Hz, but the multi-strobe effect of a fixed-Hz impulse-driven display can amplify the visibility of framedrops: fps=Hz creates perfectly zero motion blur (which amplifies the visibility of frame drops), while the double images of 30fps better hide the framedrops at 30fps.
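For the arithmetic-minded, here is a tiny sketch of the strobe counting above. It assumes a fixed 60 Hz impulse display (like a 60 Hz CRT) and that a framedrop simply repeats the previous frame for one extra refresh cycle -- a simplification, but it shows why the relative jump at 60fps (1x to 2x images) is more jarring than at 30fps (2x to 3x images).

```python
# A tiny sketch (hypothetical, assuming a fixed 60 Hz impulse display such as a
# 60 Hz CRT, and that a framedrop simply repeats the previous frame for one
# extra refresh cycle): how many strobes each unique frame receives, normally
# and during a single framedrop.

REFRESH_HZ = 60

def strobes_per_frame(fps: float) -> int:
    """Refresh cycles (strobes) each unique frame is flashed for at a given fps."""
    return round(REFRESH_HZ / fps)

def strobes_during_framedrop(fps: float) -> int:
    """A dropped frame means the previous frame stays on screen one extra refresh."""
    return strobes_per_frame(fps) + 1

for fps in (60, 30):
    normal = strobes_per_frame(fps)
    dropped = strobes_during_framedrop(fps)
    print(f"{fps} fps on {REFRESH_HZ} Hz: {normal}x images normally, "
          f"briefly {dropped}x images during a framedrop")

# Prints:
#   60 fps on 60 Hz: 1x images normally, briefly 2x images during a framedrop
#   30 fps on 60 Hz: 2x images normally, briefly 3x images during a framedrop
```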


<Advanced Science Talk>
One part of the problem is the Vicious Cycle Effect that is amplifying the visibility of stutters nowadays. Higher resolutions mean a single stutter jumps over more pixels. Bigger displays mean wider FOV, which amplifies the visibility of stutter-jumps. Brighter HDR amplifies stutter visibility. Increasingly clear motion (higher refresh rates or better strobing) amplifies stutter visibility even further.
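As a rough illustration of the resolution and FOV amplification (all numbers below are example assumptions, not measurements), a single repeated frame during a pan skips more pixels at higher resolutions, and that skip covers more degrees of your visual field at wider FOVs:

```python
# Hypothetical example: size of a single stutter-jump during a horizontal pan.
# Assumes a pan of half a screen-width per second and one repeated frame at 60 Hz.

def stutter_jump_pixels(pan_screenwidths_per_sec: float,
                        horizontal_resolution: int,
                        stutter_duration_sec: float) -> float:
    """Pixels skipped by one stutter while the eye keeps tracking the pan."""
    pixels_per_sec = pan_screenwidths_per_sec * horizontal_resolution
    return pixels_per_sec * stutter_duration_sec

PAN_SPEED = 0.5          # screen widths per second (moderate pan)
STUTTER_SEC = 1 / 60     # one repeated frame at 60 Hz

for label, width_px, fov_deg in (("1080p monitor, ~30 deg FOV", 1920, 30),
                                 ("4K big-screen,  ~50 deg FOV", 3840, 50),
                                 ("8K VR headset, ~110 deg FOV", 7680, 110)):
    jump_px = stutter_jump_pixels(PAN_SPEED, width_px, STUTTER_SEC)
    jump_deg = jump_px / width_px * fov_deg   # rough small-angle approximation
    print(f"{label}: stutter jump ~{jump_px:.0f} px (~{jump_deg:.2f} deg of visual field)")
```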

All of these factors amplify each other in a Vicious Cycle Effect. Retina-FOV (180+ degree) retina-resolution displays (16K+ if using 180+ degree FOV) can require quintuple digits (>10,000Hz) to reach retina refresh rates (Hz maxed out to human perception), which is why refresh rate limitations are hugely amplified in virtual reality (see Stroboscopic Effect of Finite Frame Rate Displays). Motion blur reduction modes (like LightBoost and ULMB strobing) help, but they amplify the visibility of stutters that are no longer hidden by motion blur. They also add unnatural flicker and stroboscopic effects. But fundamentally, strobe backlights are a humankind band-aid, because real life doesn't strobe. You have to emulate analog motion using ultrafine refresh cycles (ultra-Hz) to eliminate motion blur strobelessly, so that 100% of all motion blur is human-natural, and zero is induced by the display itself. Emulating a Star Trek Holodeck successfully enough to pass a blind test (can't tell real life apart from VR) is heavily hampered by the Vicious Cycle Effect, too.
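Here is a back-of-envelope sketch of that quintuple-digit figure. The assumptions are mine for illustration: ~60 pixels per degree as "retina" resolution, a 180-degree FOV, eye-tracked motion of one full FOV per second, and the rule of thumb that a strobeless sample-and-hold display adds roughly 1 pixel of motion blur per pixel of motion travelled during one refresh cycle of persistence:

```python
# Rough estimate of a "retina refresh rate" for a strobeless sample-and-hold display.
# All inputs are illustrative assumptions, not measured values.

RETINA_PPD = 60                  # assumed pixels-per-degree for "retina" sharpness
FOV_DEG = 180                    # retina-FOV headset
H_RES = RETINA_PPD * FOV_DEG     # ~10,800 pixels across the field of view

# Fast but realistic eye-tracked motion: the scene panning one full FOV per second.
motion_px_per_sec = H_RES * 1.0

# To keep display-added blur at ~1 pixel or less without strobing, persistence
# (one refresh cycle) must be short enough that motion travels only ~1 pixel per
# refresh, i.e. refresh rate >= motion speed in pixels per second.
required_hz = motion_px_per_sec / 1.0
print(f"Strobeless retina refresh rate needed: ~{required_hz:,.0f} Hz")
# -> ~10,800 Hz, i.e. quintuple digits, consistent with the >10,000 Hz figure above.
```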

Blur or stutters become fainter again. Then something changes (such as a FOV increase, a resolution increase, a refresh rate increase, or an HDR increase), and the artifacts or stutters become more visible again. You keep playing whac-a-mole by improving the other variables, but they still Vicious-Cycle into each other. See the conundrum?

Researchers keep writing very useful papers -- whether the great Color MPRT paper (which still sometimes neglects human visibility beyond the below-10%/above-90% GtG Measurement Cutoff Threshold, as we've noticed artifacts completely missed in GtG/MPRT tests, such as Red Phosphor Interferes With Strobing), or the commonly redditted 255Hz Fighter Pilot Test (apples vs oranges -- it was a limited-brightness single-frame identification test, not a continuous-blur test, nor a phantom-array test).

However, we back up and focus on the whole picture: what are ALL the weak links in a refresh rate? We think of refresh rates differently, like geometry versus physics. The great academic writers are focused on physics, but I think of refresh rates as a temporal geometry problem in my human brain (much like a photographic memory, but for successfully predicting a refresh rate artifact), and come up with concepts missed by many.

There are many situations where you tell me a refreshing algorithm, and I correctly predict the resulting artifacts long before a display prototype is built. This has happened repeatedly since 2012, all the way back to the "LCD can't get CRT motion clarity" days -- until I mic-dropped with all the LightBoost talk back in the day, and with our good understanding of strobe-backlight artifacts (Electronics Hacking: Creating a Strobe Backlight (2012), Strobe Crosstalk FAQ, High Speed Videos of LCD Refreshing, and Red Phosphor).

That's why we have Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays, which has converged into practically unanimous agreement among many researchers and has been acclaimed by reputable organizations (including NVIDIA) who enjoy using some of my articles as easy Coles Notes equivalents of complex technical scientific papers. Simple Blur Busters 101 stuff to my "Refresh Rate Einstein" brain matter -- but difficult concepts for a lot of people to grasp ("Why the hell do we need 1000 Hz?"). We even taught many new things to some display engineers who are 90% correct but miss the 10% easily explained by Blur Busters, and we have used our skills to properly complete their papers. ;)
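For anyone still asking "Why the hell do we need 1000 Hz?", here is a minimal sketch of the arithmetic behind Blur Busters Law, assuming a strobeless sample-and-hold display where frame rate matches Hz and persistence equals one refresh cycle (display-added motion blur in pixels is roughly motion speed times persistence). The panning speed is an example assumption:

```python
# Minimal sketch of Blur Busters Law on a strobeless sample-and-hold display:
# blur (px) ~= motion speed (px/s) x persistence (s), with persistence = 1/Hz
# when frame rate matches refresh rate. Example panning speed is an assumption.

MOTION_SPEED_PX_PER_SEC = 1000.0   # a typical TestUFO-style panning speed

for hz in (60, 120, 240, 480, 1000):
    persistence_ms = 1000.0 / hz
    blur_px = MOTION_SPEED_PX_PER_SEC * (persistence_ms / 1000.0)
    print(f"{hz:>4} Hz: {persistence_ms:5.2f} ms persistence -> ~{blur_px:4.1f} px of motion blur")

# At 60 Hz you get ~16.7 px of display-added blur at this speed; it takes ~1000 Hz
# to bring that down to ~1 px strobelessly.
```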

Not everyone is picky about stutter, but others are, and when stutters are amplified, that can be a big problem. There are people who get nausea playing any FPS unless I've custom-tweaked a gaming monitor to their specific vision sensitivity (one they often didn't know about -- some have motion blur eyestrain, others have stutter nausea, more have blue-light sensitivity, etc. Everybody sees differently). But there are also many who will never be able to play an FPS game or VR game nausea-free until we're further along in the refresh rate race. We won't reach five-sigma population comfort for a long time. We'll have a century (or more) of whac-a-mole.
</Advanced Science Talk>


Now you know why I'm known as the Refresh Rate Einstein!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

Re: Why Are 30fps Stutters More Visible Today Than 30 Years Ago?

Post by Aldagar » 02 Apr 2020, 17:44

The amount of things to take into account seems never-ending. I've already spent many hours reading your articles and I still have a long way to go to fully understand some concepts. But the positive side is that there's only room for improvement, and we'll be seeing better screens as technology advances further.

In my case, excessive/unadjusted brightness and flickering/strobing give me eye strain (I know because my phone has an OLED screen with PWM dimming and I regret not getting an IPS one), blue light gives me insomnia, stuttering and high input lag give me headaches (especially in fast-paced games), and tearing and motion blur are just distracting but can worsen the gaming experience.

From what I've been gathering, it seems FreeSync would be the most suitable solution for me, but I still haven't pulled the trigger. I don't play competitive games these days; I lean towards more graphically demanding single-player games, so for now I'm content with my 60Hz monitor, but it may be the next upgrade on my list.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Why Are 30fps Stutters More Visible Today Than 30 Years Ago?

Post by Chief Blur Buster » 03 Apr 2020, 00:45

Aldagar wrote:
02 Apr 2020, 17:44
From what I've been gathering, it seems FreeSync would be the most suitable solution for me, but I still haven't pulled the trigger.
Yes -- VRR would probably be your best solution (no strobing needed, eliminates stutter, still reduces motion blur at high frame rates).

A well-reviewed high-Hz IPS gaming monitor with VRR (FreeSync + G-SYNC Compatible Certified or better) would be right up your alley. They tend to have the fewest tradeoffs for jack-of-all-trades needs and are a major upgrade from 60 Hz, even for casual gaming.

If you have extra money to spend, you might want to consider HDR options (FALD monitors) for eye-candy solo gaming. Be warned, those get into the four figures.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

