Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
User avatar
Colonel_Gerdauf
Posts: 17
Joined: 24 May 2022, 12:24

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Colonel_Gerdauf » 28 May 2022, 10:25

My main point here is: if they are going to bring a remedy to an issue that lacks a proper solution at the time, then they need to be really careful about what kinds of compromises are being shoved onto the average user. Sure, people in the know might be able to fine-tune things, but it is not a solution at all for the average user, who can actually get physically uncomfortable with the experience.

The people who prance around about "OLED-BFI being the ultimate solution" and how "sample-and-hold should have been obsolete and forbidden from use 5 years ago" need to understand that their ability to withstand the effects of flickering without problems puts them in an extreme minority. Sadly, their attitude shows no sign that they will ever understand this much. This is de facto cultism, and something that proves to me that BFI needs to die quickly.

The alternate systems like checkerboarding and phase timings I have provided... sure, they may significantly reduce the benefits (though I am strongly inclined to disagree), but the reason I strongly prefer these solutions is that the drawbacks are a lot more tolerable for most people. Sure, artifacting does not look pleasant, depending on implementation, but with such high pixel densities these days it is not too much of a concern. In addition, the artifacts are far more pleasant to look at than flickering. When you see artifacts, again depending on pixel density, you get mildly disappointed. When you see flickering, your head will quickly be facing intense pain, and this is actually one of my main citation points and demonstration examples for where and why I am hesitant about the push to "retina refreshes".

So, at the end of the day, specialization like that should never be baked into the features themselves, but rather should come from customizing such features. It should come as no surprise that VRR like G-Sync gets a lot of praise and pushes for universal standardization, whereas tech like ULMB is quickly scorned and discarded as "pointless junk". You may disagree with this, but given the interesting situation with software caps and limitations, even with 20KHz displays being normalized, VRR will still be needed. I am very glad that NVIDIA has set the path forward with G-Sync, and likewise Apple with ProMotion.

User avatar
Colonel_Gerdauf
Posts: 17
Joined: 24 May 2022, 12:24

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Colonel_Gerdauf » 28 May 2022, 12:48

I should also note: I have given the TestUFO BFI test a try. On all settings that I could try, I found the BFI-based samples quite painful to watch, and in some cases it feels like the UFO is hopping from one part of the screen to the next. As a result, the BFI feels slower, one phase behind the regular motion.

User avatar
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Chief Blur Buster » 04 Jun 2022, 21:09

Colonel_Gerdauf wrote:
28 May 2022, 12:48
I should also note: I have given the TestUFO BFI test a try. On all settings that I could try, I found the BFI-based samples quite painful to watch, and in some cases it feels like the UFO is hopping from one part of the screen to the next. As a result, the BFI feels slower, one phase behind the regular motion.
One note -- the TestUFO BFI demo is educational only, not a comfort test.

If you try to use it on a 60Hz display, it necessarily flickers at 30Hz -- which is too low.

Keep in mind you want something that allows better-than-60Hz BFI to stop the painful flickering. This is because software-based BFI is not capable of sub-refresh BFI the way hardware strobe backlights are.

You need a 240Hz monitor to emulate 120 Hz BFI (1:1 cadence) or 80 Hz BFI (2:1 cadence), both numbers above a typical human's flicker fusion threshold.
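For anyone who wants to sanity-check that arithmetic, here is a quick illustrative sketch (not a TestUFO or Blur Busters tool) of the whole-refresh-cycle constraint that software BFI is stuck with:

Code: Select all

# Rough sketch (illustrative only): software BFI can only blank whole refresh
# cycles, so its flicker rate is the display rate divided by the cadence length.
def software_bfi_flicker_hz(display_hz: float, visible: int, black: int) -> float:
    """Flicker frequency of a cadence of `visible` lit + `black` blanked refreshes."""
    return display_hz / (visible + black)

for hz in (60, 120, 240):
    print(f"{hz}Hz display: 1:1 cadence -> {software_bfi_flicker_hz(hz, 1, 1):.0f}Hz flicker, "
          f"1:2 cadence -> {software_bfi_flicker_hz(hz, 1, 2):.0f}Hz flicker")
# 60Hz  -> 30Hz / 20Hz flicker (well below flicker fusion; uncomfortable)
# 240Hz -> 120Hz / 80Hz flicker (above a typical flicker fusion threshold)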

>120Hz hardware-based strobe backlights can be more comfortable than 50-60Hz CRTs (extra Hz headroom to minimize global flicker), which can be more comfortable than a 30Hz software BFI-simulator (too low Hz to be comfortable).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Post by Chief Blur Buster » 04 Jun 2022, 21:20

Colonel_Gerdauf wrote:
28 May 2022, 10:25
My main point here is: if they are going to bring a remedy to an issue that lacks a proper solution at the time, then they need to be really careful about what kinds of compromises are being shoved onto the average user. Sure, people in the know might be able to fine-tune things, but it is not a solution at all for the average user, who can actually get physically uncomfortable with the experience.

The people who prance around about "OLED-BFI being the ultimate solution" and how "sample-and-hold should have been obsolete and forbidden from use 5 years ago" need to understand that their ability to withstand the effects of flickering without problems puts them in an extreme minority. Sadly, their attitude shows no sign that they will ever understand this much. This is de facto cultism, and something that proves to me that BFI needs to die quickly.
Unfortunately, there's no solution that bypasses BFI for some very important use cases such as virtual reality.

Also, moving the goalposts to call BFI cultism is a foul around here -- it neglects some other essential use cases where BFI is far more comfortable than display motion blur.

VR headsets such as the Valve Index, Oculus Rift, Quest 2, Pimax, etc, all strobe at between 90Hz and 120Hz as a de facto BFI, because in a wide-FOV situation, motion blur created far more headaches than BFI flicker above 90Hz. This is 3x the flicker rate at which you probably comfort-tested TestUFO BFI on a 60Hz monitor (30Hz flicker = visible flicker).

We espouse "Right Tool For Right Job", with the correct scientific variables for the correct use cases.

60Hz BFI is very uncomfortable, but 120Hz hardware-based BFI is super-comfortable for >90% of the population. If you have never seen 120Hz hardware-based BFI (it requires a 240Hz-or-higher monitor to simulate 120Hz BFI in software like TestUFO), then you have drawn a scientific mis-conclusion, especially since you haven't stated the TestUFO BFI scientific variables you configured. If it said less than "80 fps" along the UFO BFI rows, then you haven't seen how BFI at 90Hz is much more comfortable than BFI at 60Hz.

The biggest controversy about BFI exists simply because of 60 years' worth of legacy 60fps 60Hz material (e.g. television, console, etc). But if you use 120fps game material, 120fps VR material, or 120fps HFR video material, then BFI on 120fps material is actually quite comfortable for motion blur reduction compared to 60Hz BFI.

The eyestrain of 120Hz flicker is actually less than the eyestrain of display motion blur for wide-FOV displays (60 degrees and wider of your field of vision, like VR headsets). If you have not tested BFI where the frequency is at least 90Hz AND the field of view exceeds 60 degrees, then you have not experienced the near-universal agreement (>90% of the population) on the comfort of BFI in those use cases.

The abuse of "BFI is uncomfortable" claims comes from people who tried BFI at 30Hz or 60Hz, which is definitely over 100 times more uncomfortable than BFI at 120Hz. Thus, for an outsider, it becomes necessary to actually test different BFI frequencies at framerate=Hz at frame rates far in excess of 60. For software-based BFI, this is not possible with a 60Hz or 120Hz display (since a 120Hz + software BFI only creates 60 Hz flicker), so you need 180Hz minimum for a 90Hz-software-based BFI. The only way to bypass software-based BFI limitations is to use hardware-based subrefresh manipulation of the light emitted by the refresh cycles.

Also, a variant of your algorithm was already tested and found to have insufficient blur reduction to fix VR discomfort -- you need more than 80% display motion blur reduction, and none of the algorithms you posted achieve more than a few percent of motion blur reduction (almost indistinguishable from non-BFI). The artifacts generated by your algorithms were far worse than the unnoticeable motion blur differences.

The fact that you left out scientific variables from your previous TestUFO BFI post leads me to assume you tested only a 60Hz display (for 30Hz software-emulated BFI) or a 120Hz display (for 60Hz software-emulated BFI). TestUFO BFI is simply intended to teach display motion blur physics in the absence of an appropriate display -- e.g. 30fps BFI has the same motion blur as 60fps non-BFI -- and not to serve as a comfort test. Now, keep in mind that hardware-based BFI can have more than 99% less motion blur than TestUFO BFI, because hardware-based BFI can reduce motion blur at subrefresh ratios.

The Oculus Quest 2 flashes each frame for only 0.3ms, which means doing this with software-based BFI would require a 3333fps 3333Hz display, with TestUFO BFI showing 1 visible frame followed by ~36-37 black frames, in order to software-emulate it on a sample-and-hold display. Hardware-based BFI can do it massively sub-refresh, orders of magnitude motion-clearer than TestUFO BFI emulation. The fact that the Quest 2 VR headset flashes these 0.3ms frames 90 to 120 times a second makes it flicker-comfortable to more than 90% of the population. It's not five-sigma, but it's definitely not a cult of BFI, because most users don't even know that all VR headsets ever manufactured use BFI out of necessity. Claiming this is a cult is a foul, full stop. There is a threshold above which the uncomfortable flicker starts to become far more comfortable.

Can you tell me how to reduce motion blur by 38x (e.g. 38 pixels of motion blur converted to 1 pixel of motion blur) without changing the refresh rate, and without adding a 1:37 ratio of visible:complete-dark? Scientifically impossible. Your BFI algorithm, assuming no complete, consistent black is added on top of your existing algorithm, is more like 1:1.01, which is several orders of magnitude too weak for VR -- over 3000x weaker than the Quest 2's (to most people) flickerless BFI. Remember that the Quest 2 outsold the latest Microsoft Xbox, with 14 million VR headsets sold (as of winter 2022), so it's not a flash in the pan -- and VR was uncomfortable until BFI was added; then VR became comfortable...
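As a back-of-envelope cross-check of those numbers, here is a quick sketch (illustrative arithmetic only, using the persistence-roughly-equals-pulse-width approximation, not an official calculator):

Code: Select all

# Blur reduction of a strobed display vs. sample-and-hold at the same Hz is
# approximately refresh_period / pulse_width (persistence ~= visible pulse).
def blur_reduction(refresh_hz: float, pulse_ms: float) -> float:
    return (1000.0 / refresh_hz) / pulse_ms

print(blur_reduction(90, 0.3))   # Quest 2 style: ~37x less blur than 90Hz sample-and-hold
print(blur_reduction(120, 1.0))  # 1ms strobe at 120Hz: ~8.3x less blur
print(1000.0 / 0.3)              # ~3333 -> the "3333fps 3333Hz" sample-and-hold equivalent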

While I do espouse giving all users the flexibility to BFI any refresh rate (including 60Hz), this is a personal user choice. While it's true that less than 10% might find 60Hz flicker comfortable, the fact remains that 120Hz jumps over the flicker-discomfort threshold to the point where >90% find 120Hz flicker comfortable. So simply doubling the Hz and doubling the frame rate solves a lot of that "BFI discomfort" problem. But you can't test that using software-based BFI on anything less than a 240Hz sample-and-hold monitor, because BFI in software can only work at whole-frame intervals, not the sub-frame intervals that hardware easily can do.

We are just pragmatic where:
1. BFI is uncomfortable to most people at some scientific variables (e.g. 60Hz)
2. BFI is comfortable to most people at different scientific variables (e.g. 120Hz)
3. BFI is an absolute necessity for comfort in some technologies (e.g. virtual reality)

There may be far more BFI fanatics when it comes to things like the LG OLED TVs, but you never hear anything from BFI fanatics when it comes to ordinary modern virtual reality headsets (it's not possible to turn off BFI in any modern VR headset, and you shouldn't want to -- motion blur above and beyond real life is nauseatingly far worse than the flicker of >90Hz BFI).

-- Keep priorities straight
-- Understand BFI properly
-- Understand where BFI is good and where BFI is evil
-- Forest for the trees
-- Right tool for right job

This is just mic-drop stuff. Put a fork in it, it's done. It's indisputable that BFI provides superior comfort in certain use cases (e.g. VR), as long as flicker frequencies are sufficiently high. Yes, it's only 90% of the population, but that's more of the population than RealD 3D cinema glasses manage.

Yes, yes -- when it comes to VR use cases and VR flicker frequencies (90-120Hz), 90% of the population being comfortable is not five-sigma. But more people today are comfortable with VR than with RealD 3D cinema glasses -- VR has improved so massively that it's the world's most comfortable digital 3D (if you're wearing a modern high-performance VR headset), far more comfortable to the eyes than any 3D ever seen at the movie theater (even flickerless polarized glasses). That's an achievement unto itself.

It's about understanding the threshold where BFI is uncomfortable, and the transition to where BFI becomes comfortable.

However, the ultimate is blurless sample-and-hold (1000fps+ 1000Hz+), which is simultaneously flickerless and blurless. This is how you eventually five-sigma the comfort from 99% all the way to >99.999% (five nines) -- real life has no flicker, real life has no frame rate -- so perfectly matching real life requires BFI to become obsolete.


If you don't understand this blur math / physics, you will be ill-equipped to reply to this post -- it is best to borrow a hardware-based BFI system that has access to framerate=Hz material at a flicker frequency of 90Hz or higher (so avoid 60fps television material -- it can't be blur-reduced by BFI without flicker). Try borrowing a VR headset and running some light non-rollercoaster app such as a virtual vacation beach chair, a kite flying app, a fishing app, nursing-home telepresence such as the Alcove app, or another VR app where you have 1:1 real-life 6dof vertigo sync (i.e. "Comfortable"-rated VR games). If you narrow the scope to such apps with 1:1 movement sync between real life and VR, then most people are not bothered by these VR apps: no vertigo issue, no dizziness issue, no BFI flicker issue.
It's impressive that, for a certain set of VR apps, they managed to make VR comfortable for roughly 90% of the population. Head turns never have forced display motion blur, thanks to the mandatory, permanently-enabled BFI that is part of all VR headsets made in the last 5 years. But do you hear about BFI snobs in VR? No.

That being said...

We believe BFI will someday need to become obsolete. But it would require roughly 3333fps 3333Hz to completely eliminate BFI. Your well-intentioned BFI algorithm would unfortunately reduce that requirement (for the same motion blur) only negligibly, perhaps down to ~3320fps ~3320Hz, because it violates the contiguousness of blackness needed to reduce display motion blur to the negligible levels required by use cases that try to simulate real life (VR), where you never want additional motion blur above and beyond natural human vision. The way the Quest 2 does it achieves the same motion blur as 3333fps 3333Hz sample-and-hold by using roughly a ~1:36 subrefresh pulse ratio (a 0.3ms flash within a 1/90sec = 11.1ms refresh interval) -- something not possible to emulate via TestUFO software-based BFI on anything less than a 3333fps 3333Hz display. A 90fps BFI at a 1 visible : 36 blank ratio would require 90fps x 37 = ~3330Hz to perfectly match the zero-blurness of the Quest 2 via completely software-based BFI methods. In other words, it is not possible to emulate 90Hz 90fps 0.3ms MPRT via software-based BFI on a sample-and-hold display unless the display supports well over 3000Hz for TestUFO. So you certainly haven't tested the TestUFO scientific variables necessary to match an Oculus Quest 2 that does massively-subrefresh BFI.

Be warned, you might be one of the 10% who hate the VR flicker, but you will immediately notice it has better-than-CRT motion clarity in some of the headsets (e.g. Quest 2).

Nonetheless, we someday want to get beyond BFI. That's why we're fans of brute frame rates at brute refresh rates, and will strive toward that over the long term. There is just no other way to do such massive amounts of blur reduction (sub-1ms MPRT).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Colonel_Gerdauf
Posts: 17
Joined: 24 May 2022, 12:24

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Colonel_Gerdauf » 04 Jun 2022, 22:50

BFI by itself is not cultism - that is not what I am saying; don't pull a strawman on me, please. What I was saying there is that the phrasing used by some of the BFI supporters suspiciously resembles phrasings that a cultist might use. In other words, I am accusing the people rather than the concept.

Sorry that I do not have fancy equipment like a 1KHz display or a latency monitor, but that misses the entire damn point I am trying to make here. My point is made exclusively from the perspective of a regular user. For example, the physical tolerance of backlight strobing is very low -- look at the reception of >120Hz ULMB for an example of this. A regular user does not care about fancy equipment or mathematics; in fact they value simplicity, and they have since the beginning of time itself. They want to turn on a display and watch/play to their heart's content without having to worry about getting migraines within 5 minutes. That kind of situation creates a large return rate, which is never good for any business.

My base arguments currently stand. About the tests: I did not try it on a 60Hz display; I tried it with overclocked displays, one at 68Hz, one at 75Hz, and one at 165Hz. Regardless of disagreements about whether 68 and 75 were good references in terms of raw measurements, they were valid as points of comparison against 165. My experience contradicts the claim that BFI at a "suitable" base frame rate would be more comfortable versus one at 60. In order for the BFI strobing to become comfortable, going by your logic here, the strobe would need to be at a full 500Hz (480, strictly speaking).

And on that note, 60Hz BFI is basically why BFI needs to exist in the first place. It is supposed to deal with visual content that cannot, for whatever reason, exceed 30FPS or god forbid 24FPS. Sure, you can say that 120Hz is more useful in those cases (1:4), including 60FPS-locked content, but isn't that by itself moving the goalposts?

And what about the issue of content being constantly one phase behind? Does this not count as a major penalty when it comes to visual latency?

And thus I need to repeat my main question: if the BFI is only a suitable solution for the "lucky few" who are magically able to withstand the effects of flickering (again, confined to 24-30-60 where content is typically locked), then BFI is not really a solution in the first place, is it? At that point, interpolation nowadays is electronically easier and solves the issue without any of the negative side effects, not even latency. The only drawback is that it cannot yet be used where deliberately low-PPI content is played.

Yes, your ultimate goal is retina refresh, and sure I can be on board with that. But you need to acknowledge that it is a very long term goal, and far outside the context of discussion here.

User avatar
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Chief Blur Buster » 06 Jun 2022, 16:10

More fair points, but some to address:
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
My base arguments currently stand. About the tests: I did not try it on a 60Hz display; I tried it with overclocked displays, one at 68Hz, one at 75Hz, and one at 165Hz. Regardless of disagreements about whether 68 and 75 were good references in terms of raw measurements, they were valid as points of comparison against 165. My experience contradicts the claim that BFI at a "suitable" base frame rate would be more comfortable versus one at 60. In order for the BFI strobing to become comfortable
Can you specify your experience with hardware-based BFI at different flicker frequencies?

Especially zero/low-crosstalk implementations like LG C9 OLED's 120Hz 1:1 cadence-equivalent BFI (1/240sec blur) as well as Oculus Quest 2 90Hz ~1:36 cadence-equivalent BFI (1/3333sec blur).

On average, hardware-based BFI is much more configurable (at least by engineers) than software-based BFI, so making comfort judgements based only on TestUFO software-based BFI is not ideal.

Everyone has different flicker sensitivities.

That being said, the 2-UFO version of 165Hz BFI would result in an 82.5fps frame rate. So testing only this specific link, and only on the 165Hz display with hardware strobing disabled (and not changing any TestUFO variable), you will observe that the 82.5fps UFO and the 144fps UFO have roughly the same motion blur. The last two rows out of the three should have roughly equal motion blur. The second row of UFOs is flickering at 82.5Hz.
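As a rough arithmetic aside (illustrative only, using the persistence-equals-visible-time approximation; the frame rates are taken from the example above):

Code: Select all

# A 1-visible:1-black BFI row on a 165Hz display shows each frame for exactly
# one refresh cycle, so its persistence lands near a full-framerate row's.
display_hz = 165
bfi_visible_ms = 1000.0 / display_hz   # ~6.06 ms lit per frame (82.5fps BFI row)
full_rate_ms = 1000.0 / 144            # ~6.94 ms per frame (144fps sample-and-hold row)
print(f"82.5fps BFI row persistence: {bfi_visible_ms:.2f} ms")
print(f"144fps sample-and-hold persistence: {full_rate_ms:.2f} ms")
# Comparable motion blur; the BFI row simply adds 82.5Hz flicker on top.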

If you can see the 82.5Hz soft flicker of a 50%:50% duty cycle (with LCD GtG softening the squarewave), then you're unusually sensitive to soft flicker, since the majority of the population has a flicker fusion threshold slightly below this softened 82.5Hz 50%:50% GtG-attenuated flicker.

There are already lots of scientific papers on flicker fusion thresholds and flicker comfort -- from the discomfort of the flicker fusion threshold (at low flicker Hz) to the discomfort of phantom array effects (at high flicker Hz). Many people are sensitive to only one or the other, not both.

Remember that no two people see perfectly identically -- slightly different color primaries, slightly different brightnesses, slightly different focus, slightly different flicker sensitivities. 12% of the population has enough deviation in their vision variables to be rated color-blind, for example, and more than 15% of the population wears glasses -- and this extends to flicker sensitivity too. That's why many people were not bothered by CRTs (of a specific Hz) while others were.
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
going by your logic here, the strobe would need to be at a full 500Hz (480, strictly speaking).
You're forgetting to specify variables again -- the tolerance/comfort curve starts to exceed 90% of the population very early (~120Hz flicker). The curve runs from severe discomfort, to tolerance, to comfortable detectability, to inability to detect.
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
And on that note, 60Hz BFI is basically why BFI needs to exist in the first place.
"why BFI needs to exist in the first place" --

Only from the "cultists" point of view. Let's consider the absolute scientific necessary of BFI in today's achievable equivalent of a Star Trek Holodeck (aka VR).

Let's focus on the market point of view (actual use of BFI), as well as the technologies that hit the market.

I must point out that in the last few years alone, enabling BFI has become more common in above-60Hz use cases. Consider the >14M Quest 2 headsets sold since inception (Quest 2 and Xbox sold a similar number of units in calendar year 2021, so VR is no longer as tiny a niche as it used to be). For 2020-2022, I would guess that fewer users in North America enabled BFI for 60Hz content than used BFI in VR, based on the numbers I have.

120Hz without BFI still has 1/120sec motion blur = 8.33 pixels of motion blur at 1000 pixels/sec

240Hz without BFI still has 1/240sec motion blur = 4.16 pixels of motion blur at 1000 pixels/sec

1000Hz without BFI still has 1/1000sec motion blur = 1 pixel of motion blur at 1000 pixels/sec
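Those three figures come from a one-line formula; here is a minimal sketch of that arithmetic (illustrative only):

Code: Select all

# Sample-and-hold blur in pixels ~= tracking speed (px/s) divided by refresh rate.
def hold_blur_px(refresh_hz: float, speed_px_per_s: float = 1000.0) -> float:
    return speed_px_per_s / refresh_hz

for hz in (120, 240, 1000):
    print(f"{hz}Hz sample-and-hold: {hold_blur_px(hz):.2f} px of blur at 1000 px/s")
# 120Hz -> 8.33 px, 240Hz -> 4.17 px, 1000Hz -> 1.00 px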

And you need really good BFI just to go from 120Hz-equivalent to 240Hz-equivalent motion blur -- that requires a complete, contiguous 50% blanking per pixel -- to say nothing of the motion blur reduction needed to go from 120Hz-equivalent to 1000Hz-equivalent (while staying at 120Hz). So 1ms flashes at 120Hz are roughly an 8x reduction of motion blur (motion blur becomes about 1/8th the original). But the Quest 2 does even better -- motion blur becomes 1/37th the original! That's the magic of hardware-based sub-refresh BFI techniques. TestUFO BFI does only 1/2 (for 82.5fps at 165Hz), rather than the ~1/37 that the Quest 2 can do (90Hz native backlight-pulsed LCD). BFI blur reductions are massive enough to compensate for the flicker disadvantage in some use cases, which is what I've explained in earlier posts.

The magic of hardware-based subrefresh BFI is that you don't need retina Hz to pull off pretty impressive BFI duty cycle ratios, since hardware BFI does not need to be integer-divisor based or refresh-cycle based -- it can be any tiny contiguous window of visibility (i.e. a brief pulse/flash of one refresh cycle). The ViewSonic XG2431 Strobe Utility is capable of going from 1%:99% through 40%:60% ON:OFF, in 1%-of-a-refresh-cycle steps. So the strongest setting is 1/100th the display motion blur at the same Hz, and the weakest setting is 40/100ths the display motion blur at the same Hz.
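To make the duty-cycle math concrete, here is a small illustrative sketch (the "pulse_pct" name is just shorthand for the ON percentage of a refresh cycle described above, not the utility's actual API, and 240Hz is used as an example refresh rate):

Code: Select all

# Persistence (MPRT) of a strobed backlight ~= the fraction of a refresh cycle it is lit.
def strobed_mprt_ms(refresh_hz: float, pulse_pct: float) -> float:
    return (pulse_pct / 100.0) * (1000.0 / refresh_hz)

for pct in (1, 10, 40):
    mprt = strobed_mprt_ms(240, pct)
    print(f"240Hz, {pct}% pulse: ~{mprt:.2f} ms MPRT "
          f"(~{100 / pct:.1f}x less blur than 240Hz sample-and-hold)")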

Now that being said...

Simulating a Star Trek Holodeck (aka VR) means the display must not have any imperfections (including motion blur) forced on your eyes. A VR head turn causes scenery to pan across the screen, creating display-forced motion blur above and beyond normal human vision.

A VR head turn on an 8K headset can be 8000 pixels/sec. So a 1000fps 1000Hz flickerless non-BFI display would have 1/1000th of 8000 pixels/sec = 8 pixels of motion blur. So 1000fps 1000Hz is not enough to eliminate motion blur completely. But it could be "good enough" (the original Oculus Rift had 2ms of motion blur, before the newer Quest reduced that to 0.3ms of motion blur).

The lower the resolution of the VR headset, the less critical blur reduction becomes, since you won't have as large a gap between static resolution and motion resolution (depending on how much you move your head).

Visual content in VR still makes BFI absolutely mandatory, even for a 240Hz VR headset. It is a use case that is more motion-blur demanding, because the display has to mimic real life more perfectly. The pick-your-poison tradeoff leans toward BFI being more comfortable than motion blur, at today's limitation of triple-digit refresh rates.
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
It is supposed to deal with visual content that cannot, for whatever reason, exceed 30FPS or god forbid 24FPS. Sure, you can say that 120Hz is more useful in those cases (1:4), including 60FPS-locked content, but isn't that by itself moving the goalposts?
Your post (and your reply) keeps neglecting to specify variables, so I still call it moving the goalposts. If your posts were clearer on the scientific variables, you would have a legitimate case. Unfortunately, the generalizations you apply to BFI (including the incorrect claim that 60Hz is the raison d'être of BFI) still make me call you out on moving the goalposts (yet again).

However, reading between your lines, your intent was to limit your previous context only to 60fps material. In which case, BFI is a really polarizing, imperfect, weak band-aid for display motion blur in the absence of more feasible options. I have no disagreement with that, but that is not how you framed your post, so I am calling out the full flaws and mis-assumptions of your post.

I already mentioned that higher-Hz displays will allow softer low-Hz software BFI simulations (e.g. phosphor fade simulators). But for now, the TestUFO BFI demo is squarewave BFI (the worst kind, but also the kind that is easiest to emulate at the lowest possible Hz -- without needing a 1000Hz display).
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
And what about the issue of content being constantly one phase behind? Does this not count as a major penalty when it comes to visual latency?
BFI does not necessarily add lag. It varies by BFI implementation -- a CRT is effectively BFI yet has zero lag. This is a BFI-implementation-specific matter for the software developers and firmware programmers (display tech), and separate from the BFI-visuals topic. We can split this BFI-latency topic into a separate subtopic if it interests you -- but for now let's focus on the flicker part of BFI.
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
And thus I need to repeat my main question: if the BFI is only a suitable solution for the "lucky few" who are magically able to withstand the effects of flickering (again, confined to 24-30-60 where content is typically locked), then BFI is not really a solution in the first place, is it?
Correct.

For those who hated CRTs or double-strobe film projectors, there is no solution.

The best we can do is perfectly emulate the exact discomfort of that particular CRT, or of that particular film projector. But that's how people consumed 24, 30, and 60fps material.

We can most certainly (with enough excess retina Hz) simulate a yesteryear display even more perfectly, but we can't necessarily exceed how comfortable that same individual was in 1975 with that exact same display.

(P.S. I have already said this several times, when I explained how imperfect BFI can be improved to match the exact comfort level that people used to have with a specific retro display. If you missed this point, you can scroll back and re-read it.)
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
At that point, interpolation nowadays is electronically easier and solves the issue without any of the negative side effects, not even latency.
Interpolation has much more latency than even the most flawed BFI implementation.

However, new extrapolation/reprojection technologies can be a much more lag-free method of adding frames (especially for GPU-rendered, controller-driven material, where the framerate amplification engine has non-black-box feedback from sources such as motion vectors, the Z-buffer, and high-Hz controller data).

Likewise, proper BFI algorithms exist that add no extra lag over non-BFI. The way BFI behaves depends on how it is implemented. BFI can be programmed in a digital rolling-scan manner similar to a CRT, allowing real-time sync with the signal streaming directly to the display, with an OFF pass trailing behind. This is the lagless form of BFI:

[diagram: rolling-scan OLED BFI -- rows illuminate as they are scanned in, with an OFF pass trailing a fixed distance behind]

Replace the Hz with 120 or whatever you wish. Or even add phosphor decay emulators (e.g. fading the trailing scanlines). With a realtime, subrefresh, per-scanline trailing phosphor fade simulator, this system would be exactly as comfortable as a retro CRT, as one example.

This system does not require retina Hz if the phosphor-fade emulators are implemented in logic or linear electronics (even as a simple capacitor-bleed circuit), to at least match the comfort/discomfort of the legacy display it is imitating.
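To make the rolling-scan idea concrete, here is a purely illustrative software sketch (pseudologic under the assumptions above, not any display's actual firmware):

Code: Select all

# Lagless rolling-scan BFI sketch: rows light up as the incoming signal scans
# them in, and an OFF pass trails a fixed number of rows behind, so no frame
# is buffered and no latency is added.
def rolling_scan_bfi(total_rows: int, lit_rows: int):
    """Yield (row_to_turn_on, row_to_turn_off) pairs as scanout progresses.
    Fewer lit rows = shorter effective pulse = less motion blur, more flicker."""
    for row in range(total_rows + lit_rows):
        turn_on = row if row < total_rows else None                 # row just scanned in
        off_row = row - lit_rows
        turn_off = off_row if 0 <= off_row < total_rows else None   # trailing OFF pass
        yield turn_on, turn_off

# Example: 1080 rows with a 108-row lit band (~10% duty, ~90% blur reduction)
for on_row, off_row in rolling_scan_bfi(1080, 108):
    pass  # a real implementation would drive the panel or backlight segments here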
Colonel_Gerdauf wrote:
04 Jun 2022, 22:50
Yes, your ultimate goal is retina refresh, and sure I can be on board with that. But you need to acknowledge that it is a very long term goal, and far outside the context of discussion here.
It is not outside the context when I am replying to someone who says BFI needs to be eliminated. As a public personality, I am accountable for my research, and while I get peer-reviewed, I also peer-review others.

My impression from your posts is that you're overlooking how BFI is a massive Swiss Army knife with far more attachments than you thought it had -- some superfluous, some bad, some good, some amazing.

Sometimes I am certainly imperfect and incomplete in my replies, so if I made assumptions, I'm happy to clarify them.

I will re-iterate my points, especially for other readers and researchers who are currently monitoring this thread. My goal is not only to reply to you, but to reply in a way that serves other geeks / researchers who read this post.

1. BFI is uncomfortable to most people at some scientific variables (e.g. 60Hz)
2. BFI is comfortable to most people at different scientific variables (e.g. 120Hz)
3. BFI is an absolute necessity for comfort in some technologies (e.g. virtual reality)

And secondly:

A. Everyone sees differently. Flicker, color, brightness, focus, etc.
B. There are definitely pros/cons
C. There are definitely use cases where BFI is essential (broad use case)
D. There are definitely people who want CRT motion clarity (increasingly niche use case)
E. Right Tool for Right Job

Which translates to:
I. Market use of BFI is evolving.
II. BFI for 60Hz content has recently become a niche use case
III. BFI for VR content has recently become a more semi-mainstream use case

Interpolation for sports is the mainstream use case for reducing sports motion blur (60fps content), while BFI is the more purist/videophile, "pure" (but flicker-disadvantaged) method. This may be what you refer to as the cult area of BFI.

Lest people throw out an excuse and say VR is niche -- let's compare it to another niche called "gaming consoles" for an apples-to-apples comparison, at least in my jurisdictions. Let's not forget the 8.8M Quest 2s versus 10.3M Xboxes (calendar year 2021, not since inception) -- practically the same ballpark of popularity/niche, whether a human wishes to define consoles as niche or as mainstream; the numbers are now similar in some countries.

From my research, intentionally-enabled BFI at 60Hz is a fairly niche / videophile feature. It deserves to exist, but not everyone likes it, while others absolutely adore it as the lowest-lag, lowest-artifact method of reducing display motion blur (compared to other middle-black-box methods such as television-based interpolation, which is almost always laggier than BFI, unlike GPU-based interactive-feedback framerate amplifiers like Oculus ASW or NVIDIA DLSS).

Additionally, Blur Busters plays a role in wanting to speed up the refresh rate race to retina refresh rates where possible. There are already multiple engineering designs at multiple companies for niche 8K 1000Hz displays buildable before the end of the decade.

Since these are active projects well under way, this is within the business purview of Blur Busters, and since Blur Busters puts food on the table, it is existing work for us, not distant-future stuff. Even if it's just niche systems (like 4K was in 2001), it is still an active area of research.

Let's consider the newly launched 500Hz panels that are about to hit the market as an example of near-retina refresh rates.
Also, it is worth noting that 360fps 360Hz as well as 500fps 500Hz start to look within a stone's throw of retina refresh rate, at least for 1080p content that does not have motion faster than about 500 pixels per second. That being said, LCD GtG is a bottleneck. So even though we need a few more doublings to truly retina it out, it's already reaching the ballpark, as long as you have material hitting such high frame rates (e.g. 500fps). Going from 60fps to 500fps reduces display motion blur from 16.7ms to 2ms, so any further refresh rate increase is splitting those 2ms of blur even further -- but at least it's only about 6-7x worse than a Quest 2 (0.3ms blur) without needing any form of BFI. Obviously, 500Hz is much further from retina refresh rate if you use a higher resolution, bigger screen, wider FOV, and faster motion. But at 1920x1080, for material that doesn't pan faster than 1/4 of a screen-width per second, and for such panning velocities and fast camera shutter speeds, it might as well be practically retina.

That being said, this is somewhat (but reasonably) constraining the scientific variables to bring 500Hz closer to retina. Such panels are also a good testing platform for interpolators (e.g. https://www.svp-team.com/ probably hasn't been tested with realtime 8:1 interpolation ratios on 480Hz yet, but it would be cool to try), as well as test displays for upcoming near-term CRT electron beam simulators (there's an upcoming software-based TestUFO CRT electron beam simulator that will automatically enable itself only on a 240Hz+ screen, while recommending 360Hz+ ... and an emulator author is interested). Even so, subrefresh electronics would be simpler from a display engineering perspective, like the OLED scanout diagram above, although brute Hz makes generic software-based approaches more feasible.

And my observation that more Hz enables more accurately configurable BFI (approaching the comfort of retro displays) is relevant. Thus, this is not out of scope. In addition, please note that I reply strategically to threads in Area51, because researchers view these forums too, and it is my job to rebut accurately where applicable -- especially when variables are omitted or assumptions are made.

If we were to restart the topic and narrow its scope only to 60fps content, then it would have better merit. However, even then, I still maintain that the user choice of 60Hz BFI should be provided, and I believe in all parallel technological efforts (hardware-based and software-based). But there were some scorched-earth comments about BFI in general, and I must mount a proportionally aggressive scientific defense -- even if it's slightly off-topic -- because this forum is held accountable by other researchers. After all, my forum posts have sometimes tipped dominoes toward real scientific research papers in the past.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Colonel_Gerdauf
Posts: 17
Joined: 24 May 2022, 12:24

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Colonel_Gerdauf » 09 Jun 2022, 17:13

I have already provided -- hence the original point of this thread -- solutions to several specific points of failure of the standard BFI that is still all around us (a whole-screen approach clocked post-frame). These are not solutions I have pulled out of my butt, either, as I tested them myself, especially the first 50% solution, and I found them quite satisfactory: they reduce motion blur while at the same time reducing the negative effects to something actually tolerable.

Do not forget that there is retina resolution alongside retina refresh, which is actually why I am very annoyed about the seeming death of 24 inch 1440p displays and the like. As it stands, the push for the best in HFR still comes at the cost of resolution and especially pixel density. Anyway, with respect to retina resolution, we are already at the point where pixel-level artifacting is very hard to notice. You would be hard pressed to notice the effects of pixel shifting on OLED and plasma, as an example.

On the note of hardware BFI, that is really only of concern, as far as I can tell, on LCD-type panels that have a separate backlight layer. But I find the benefits there quite questionable, especially considering how jarring flickering lights (in general) are to every human being living on planet Earth. From how I see things, keeping the backlight on at least maintains the illusion that not much is happening.

But you are telling me that the motion blur reductions of my solutions are neutered to the point that they do nothing at all. My personal observation still strongly disagrees with that, but the technical solutions that you have brought actually do very little to solve the original problems I pointed out. Which leaves us at an impasse with no real solution.

When the common approach from the bigger industry is to push forward with a solution that can cause physical and borderline medical problems, and no consensus can be reached on an alternate solution that limits the actual problem at hand, then we have another god damn "tech advancement" that I am forced to stomach. So many things have been pushed to the general public with zero care for feedback about what people actually want, and in all of them the original option was in time taken away as a choice from the actual majority. From the earlier disregard of actually good pixel densities, to linear key-switches being the "best" at some imaginary thing, to smartphones becoming behemoths because Samsung said so, to QLC NAND, to this recent nonsense about "ergo chairs"... and it goes on. If it actually reaches a point where BFI is practically enforced at the display AND operating-system level, and no actual flexibility is provided in timings and VRR compatibility, and it remains even on kilohertz displays, I am tempted to give up on tech at that point.

Business people and engineers alike are disgustingly inept when it comes to the human and social factors of anything, and neither has any willingness at all to actually study and learn the complexities of humanity. I have had enough personal experience to confirm this.

RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by RonsonPL » 11 Jun 2022, 05:02

May I chime in?
Colonel_Gerdauf wrote:
09 Jun 2022, 17:13
seeming death of 24 inch 1440p displays and the like. As it stands, the push for the best in HFR is still coming at the cost of resolution and especially pixel density.
I disagree. The monitor and LCD panel manufacturers don't give a toss about framerates. Increasing the panel speed from 60 to 120Hz doesn't add all that much to the cost. What prevents you from seeing 24" 1440p monitors is that monitor manufacturers cannot milk you with insane prices after slapping a "gaming" label on them. We have PS5/Xbox Series consoles now. People prefer 4K. If they want a gaming monitor, they either cannot afford anything the manufacturer could profit a lot on, or they tend to look for bigger screen sizes. That's also why you don't see TN panels among gaming monitors with 1440p and 4K 120+ panels. They just don't want to bother if they can keep selling insanely overpriced models by just adding some RGB lights, more horizontal screen size, a stronger backlight to slap "HDR" on it, etc. You can find 5K monitors. It's not like framerates are holding it all back. DisplayPort 2 is coming. HDMI 2.1 is out. There's no problem here.

On the note of hardware BFI, that is really only of concern, as far as I can tell, on LCD-type panels that have a separate backlight layer.
No, not really. Check this https://blurbusters.com/faq/oled-motion-blur/
But I find the benefits there quite questionable, especially considering how jarring flickering lights (in general) are to every human being living on planet Earth.
This is like saying everyone should keep improving headphones because you find dedicated speaker systems for home audio too heavy to carry. If you play chess or old-school point & click games, sure. But there are people out there, like me, who really value a clear image in fast (even in very, very fast) motion. I can understand your argument. 120Hz flicker doesn't bother me, but it is not healthy for my eyes and brain, and it will make me feel tired or give me a headache after 4 hours of continuous play time. I can also agree that 60Hz flicker is horrible and, depending on the content, strobe duration, visible FOV, brightness of surrounding items/walls etc., can range from annoying to useless. But even if I can easily tell you whether your CRT monitor runs at 85Hz instead of 120Hz, even if I can easily tell you 100Hz mode is enabled on an LCD -- which means I'm not that immune to flicker -- I still choose to play many games in 60Hz strobed mode and don't even want to bother with anything which would degrade the motion. Sadly, there are no alternatives. As Chief already explained -- it's physics. You can't go around it. Either you get the flicker or a clear image in fast motion. There's no way to get both at the same time without interpolation to ultra HFR, which is currently not available and doesn't seem like it will be anytime soon. Even then, it may be difficult to have it working in emulators and console games (and other games stupidly locked to 60fps).
For me, there are many games I love which simply lose 90% of their appeal without clear motion. For those locked to 60Hz, it means I cannot play for over 15 minutes without starting to feel some eye strain. I absolutely have to take a break after 20-25 minutes. But even if I can only play one track or level for 10 minutes between breaks, I still prefer to play it like this instead of not playing it at all; after trying multiple times, I confirmed that playing the same game with blurry motion makes me bored after 2 minutes and I don't enjoy the game at all. It is not because I'm so unique and motion bothers me more than it bothers other gamers. I've spent hundreds of hours analyzing the practical side of motion clarity in relation to my lifelong hobby, which is gaming, and I would argue that my point of view differs from the majority mostly because of what I know and what I am aware of. This topic touches on psychology, surprisingly, which means "learning" about it is not simple. There are many pitfalls and it's very easy to draw false conclusions.

But you are telling me that the motion blur reductions of my solutions are neutered to the point that they do nothing at all. My personal observation still strongly disagrees with that, but the technical solutions that you have brought actually do very little to solve the original problems I pointed out. Which leaves us at an impasse with no real solution.
Again: Chief tried to explain that you cannot bend physics. You either get the clearer motion or no flicker. What Chief included in his response is all you can do to mitigate the issue. With selected displays, we can adjust the compromise as we wish, between blur and motion clarity, by adjusting strobe length. That's the maximum compromise we (people who actually want clear motion) can accept. Anything which makes the image just less blurry, but still blurry or wrong, is simply pointless. Just like ideas like "let's strobe twice" introduced by some (too many) monitor manufacturers.

Please read his articles on Blurbusters.com to learn why physics creates the obstacle.

When the common approach from the bigger industry is to push forward with a solution that can cause physical and borderline medical problems, and no consensus can be reached on an alternate solution that limits the actual problem at hand, then we have another god damn "tech advancement" that I am forced to stomach.
If you mention a medically harmful thing that the industry pushes as an advancement, suddenly three letters come to my mind: HDR. People are damaging their eyes, while reviewers and youtubers praise the displays even as they say "oh, this is too much, this hurts my eyes". Human eyes need time to adjust to higher brightness, and in real life it's usually the whole FOV which is affected. Playing on a super bright HDR display in a dim room, with sudden changes in brightness covering just a small portion of the FOV, hurts the eyes. Look how much the industry cares.. :(
If they push this stupid HDR into VR without proper algorithms acting as a guardian for the eyes, I predict a huge wave of lawsuits 10-20 years from now.
Flicker compared to this is nothing.
So many things have been pushed to the general public with zero care for feedback about what people actually want, and in all of them the original option was in time taken away as a choice from the actual majority. From the earlier disregard of actually good pixel densities, to linear key-switches being the "best" at some imaginary thing, to smartphones becoming behemoths because Samsung said so, to QLC NAND, to this recent nonsense about "ergo chairs"... and it goes on. If it actually reaches a point where BFI is practically enforced at the display AND operating-system level, and no actual flexibility is provided in timings and VRR compatibility, and it remains even on kilohertz displays, I am tempted to give up on tech at that point.
I would agree the tech industry disappoints. I am forced to give up on the gaming industry, as it has gone so bad that there are close to no new games I'd like, and I'd rather play old games, as they have aspects I like and miss in modern gaming. Not just nostalgia, mind you, although surely that plays some role. So I can understand your frustration.
That said, I have a 180° different opinion. It's VRR which I see as the trojan horse: the tech pushed over BFI, the tech which slows down the popularization of BFI, and the tech which makes it harder to get displays with a BFI implementation that is actually usable. The tech which benefits the monitor/TV manufacturers more than it benefits users. It's used to drive prices upward. For example, you could get a 250€ 1080p 144Hz monitor. Then it was removed from the market and replaced with the same thing, but with VRR. But now it was 400€.
I see the benefits of VRR as not even 10% as important as motion clarity. The idea that BFI is overly pushed baffles me. Sorry, I just don't see it.

Business people and engineers alike are disgustingly inept when it comes to the human and social factors of anything, and neither has any willingness at all to actually study and learn the complexities of humanity. I have had enough personal experience to confirm this.
Well, yes. I wouldn't call them inept. It's just what the boss tells them to do. They're busy. They won't waste 5 minutes on a thing for which they are not paid.
Nintendo Switch doesn't have BFI despite the fact the flicker is not as visible on a small screen.
Faceboolus Quest 2 doesn't have a PC port.
New LG OLED panels for 2022 don't have the proper BFI customization menu while the older line (2021) had it.
PS5 doesn't allow the user to disable automatic low latency mode, which DISABLES the BFI option on most TVs (this is simply insanely stupid)

and so on, and so on..

I feel you. We disagree, but we're angry at similar things. At least it's something ;)

User avatar
Colonel_Gerdauf
Posts: 17
Joined: 24 May 2022, 12:24

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by Colonel_Gerdauf » 11 Jun 2022, 11:01

RonsonPL wrote:
11 Jun 2022, 05:02
I disagree. The monitor and LCD panel manufacturers don't give a toss about framerates. Increasing the panel speed from 60 to 120Hz doesn't add all that much to the cost. What prevents you from seeing 24" 1440p monitors is that monitor manufacturers cannot milk you with insane prices after slapping a "gaming" label on them. We have PS5/Xbox Series consoles now. People prefer 4K. If they want a gaming monitor, they either cannot afford anything the manufacturer could profit a lot on, or they tend to look for bigger screen sizes. That's also why you don't see TN panels among gaming monitors with 1440p and 4K 120+ panels. They just don't want to bother if they can keep selling insanely overpriced models by just adding some RGB lights, more horizontal screen size, a stronger backlight to slap "HDR" on it, etc. You can find 5K monitors. It's not like framerates are holding it all back. DisplayPort 2 is coming. HDMI 2.1 is out. There's no problem here.
You are describing all of this as a matter of marketing. In that respect, there are other issues that you yourself have acknowledged, and I will get to those later. But this issue is not specific to "gaming displays". Look around, and you will notice that 24" 1440p is very hard to find as a whole, even in the enterprise, budget/scam, and general markets, and those that exist are ancient panels on clearance. The taste for IPS/VA is understandable to a point, but we are getting to the point in refresh rates where you cannot actually push refresh and response times any further. The typical GtG for IPS is around 7ms. GtG is not a fantastic measurement of how fast pixels actually change states, but without any common ground it is the best we have. Regardless, refresh rates do not work by magic; they rely on the limits of how fast pixels can fully change, and the refresh interval of a 120Hz display is 8.33ms. 144Hz would be 6.9ms, 165Hz would be 6ms, 240Hz would be 4.2ms, etc. I think you can see where this becomes weird. The HFR displays using IPS ultimately rely on quite a bit of number-fudging and masking tricks to make it appear like IPS can do everything when it has clear limits.
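To make that arithmetic explicit, a small illustrative sketch (the 7ms GtG figure is my rough IPS estimate from above, not a measured spec):

Code: Select all

# Refresh interval per refresh rate, compared against an assumed ~7ms IPS GtG.
gtg_ms = 7.0  # assumed typical IPS grey-to-grey transition time (rough estimate)

for hz in (120, 144, 165, 240):
    refresh_interval_ms = 1000.0 / hz
    verdict = "fits within" if gtg_ms <= refresh_interval_ms else "exceeds"
    print(f"{hz}Hz: refresh interval {refresh_interval_ms:.2f} ms -- 7ms GtG {verdict} it")
# 120Hz -> 8.33 ms, 144Hz -> 6.94 ms, 165Hz -> 6.06 ms, 240Hz -> 4.17 ms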

This was part of my hesitation toward the push for retina refresh rates: in order to actually make kilohertz displays, you would need a brand new display tech that can surpass the pixel-transition limits of even TN.

And I have not even gotten to the specific limitations of VA or IPS, which can ruin the intended experience and turn things into a panel lottery. It does not need to be this way, but QA is notoriously inconsistent, with strange criteria.

Like I mentioned, there is more to a good display than having good refresh, or even good resolution, in isolation. If you are going to shove down my throat "super duper fast" displays with a comical 95 PPI, then you have wasted my time; it is as simple as that. Apple got the message here loud and clear with iPhones and Macs, and the seemingly absurd PPI on the Macs ends up highlighting how much work needs to be done in both Windows and Linux (the latter being a lost cause on the last point) in regards to proper element scaling. With smartphones, many originally tried to one-up Apple, only to face the reality of how little value their vendettas bring to the table.

Sure, push for higher resolutions and higher refreshes to your heart's content, but at the very least keep retina resolutions in mind!
This is like saying everyone should keep improving headphones because you find dedicated speaker systems for home audio too heavy to carry. If you play chess or old-school point & click games, sure. But there are people out there, like me, who really value a clear image in fast (even in very, very fast) motion. I can understand your argument. 120Hz flicker doesn't bother me, but it is not healthy for my eyes and brain, and it will make me feel tired or give me a headache after 4 hours of continuous play time. I can also agree that 60Hz flicker is horrible and, depending on the content, strobe duration, visible FOV, brightness of surrounding items/walls etc., can range from annoying to useless. But even if I can easily tell you whether your CRT monitor runs at 85Hz instead of 120Hz, even if I can easily tell you 100Hz mode is enabled on an LCD -- which means I'm not that immune to flicker -- I still choose to play many games in 60Hz strobed mode and don't even want to bother with anything which would degrade the motion. Sadly, there are no alternatives. As Chief already explained -- it's physics. You can't go around it. Either you get the flicker or a clear image in fast motion. There's no way to get both at the same time without interpolation to ultra HFR, which is currently not available and doesn't seem like it will be anytime soon. Even then, it may be difficult to have it working in emulators and console games (and other games stupidly locked to 60fps).
Once again, the scope of the source material needs to be taken into consideration. Where BFI would have helped is the mass of content capped to 24, 30, and 60 FPS, as everything else is either variable (thus VRR being the actual holy grail) or at the very least capped high enough that BFI would not be necessary.

Those would involve the balancing act between "tolerable" flicker and the noticeable reductions of motion blur. You can get around the original issue here by simply using higher clock ratios such as 1/4 or 1/6, but it gets to the point where the reduction gets "neutered" so that is a moot point. On that note, why exactly are the blanks clocked? That was never sufficiently explained to me. It would have been far easier to fine-tune, and electronically simpler, to have the blanks timed in relation to the active frame, and have the blank done at the start of the frame.

Again, Chief tried to explain that you cannot bend physics. You either get the clearer motion or no flicker. What Chief included in his response is all you can do to mitigate the issue. With selected displays, we can adjust the compromise as we wish, between motion blur and brightness, by adjusting the strobe length. That's the maximum compromise we (people who actually want clear motion) can accept. Anything which makes the image just less blurry, but still blurry or wrong, is simply pointless. Just like ideas like "let's strobe twice" introduced by some (too many) monitor manufacturers.

Please read his articles on Blurbusters.com to learn why physics creates the obstacle.
There is physics, and there is the user experience. Physics is static, meaning that there is always a workaround to a problem somewhere. User experience is a mess that requires, at the very least, proper social awareness to sift through. I will get to the latter point later. If it is truly a matter of "eye hurt, or motion blur", then that is about the most idiotic kind of compromise to make, and especially foolish when it is being shoved onto the general user.
If you mention a medically harmful thing that the industry pushes as an advancement, suddenly three letters come to my mind: HDR. People are damaging their eyes, while reviewers and youtubers praise the displays and say "oh, this is too much, this hurts my eyes". Human eyes need time to adjust to higher brightness, and in real life it's usually the whole FOV which is affected. Playing on a super bright HDR display in a dim room, with sudden changes in brightness covering just a small portion of the FOV, hurts the eyes. Look how much the industry cares.. :(
If they push this stupid HDR into VR without proper algorithms acting as a guardian for the eyes, I predict a huge wave of lawsuits 10-20 years from now.
Flicker compared to this is nothing.
On a technical level, I very strongly disagree. First, the thing about HDR is not about being super-bright. When it talks about nits, it is speaking of the relative vibrancy values from section to section. Think of it like the power rating of a computer PSU; it is only talking about the maximum that can be reached when and where it is needed. This is actually really close to real life, as when you walk outside, there are sharp differences in "brightness" everywhere that are not properly captured on a regular display. Looking at the real world hasn't bothered people, so why should HDR?
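For anyone who wants to see where those nit numbers come from on the signal side, here is a rough sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 uses; it only illustrates the shape of the curve, not how any particular display tone-maps it, and the sample values are arbitrary:

[code]
# Rough sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10: it maps a
# normalized signal value (0.0-1.0) to absolute luminance in nits.
# Constants are the ones defined in the standard; the sample signal values
# below are arbitrary and only meant to show the shape of the curve.
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ code value into cd/m^2 (nits)."""
    p = max(signal, 0.0) ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for s in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ signal {s:.2f} -> {pq_to_nits(s):8.1f} nits")

# Mid-range code values land at ordinary SDR-ish brightness (roughly 5 and
# 90 nits for 0.25 and 0.50); only the very top of the curve reaches
# thousands of nits, which is the "peak capability, not constant brightness"
# point being made above.
[/code]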


As for the searing brightness, that is down to faulty implementations in specific mismatches between the specs and the content on display, and that can be easily adjusted at the hardware as well as the OS level. In fact, the issue on Windows has been dealt with thanks to AutoHDR; HDR mode is no longer static when turned on, and it can dial itself down when showing SDR content, even on a pixel-to-pixel basis. In addition, the parameters can be tuned by the user to suit their personal needs and limits. BFI does not have this benefit, especially according to what Chief is telling me about the alternatives I have brought to the table, and according to my observations of BFI implementations out in the wild.
I would agree the tech industry disappoints. I am forced to give up on the gaming industry, as it has gone so bad that there are close to no new games I'd like, and I'd rather play old games as they have aspects I like and miss in modern gaming. Not just nostalgia, mind you, although surely that plays some role. So I can understand your frustration.
That said, I have a 180° different opinion. It's VRR which I see as a trojan horse: the tech pushed over BFI, the tech which slows down the popularization of BFI, and the tech which makes it harder to get displays with a BFI implementation that is actually usable. The tech which benefits the monitor/TV manufacturers more than it benefits users. It's used to drive prices upwards. For example, you could get a 250€ 1080p 144Hz monitor. Then it was removed from the market and replaced with the same thing, but with VRR. But now it was 400€.
I see the benefits of VRR as not even 10% as important as motion clarity. The idea that BFI is overly pushed baffles me. Sorry, I just don't see it.
Once again, strong disagreements, and this boils down to the kinds of content that is on screen. As I explained earlier, VRR is genuinely useful in situations where frame rates, and more importantly frame times, are not consistent. And this is in just about everything that you or I have used. In fact, inconsistent frame times are how the GPU gets any work done at all without a massive penalty in latency. GPUs, if you don't know, are dynamic electronics, and will dial in their workload based on power, thermals, and what exactly is asked of them by the millions of APIs that do the same thing in a million ways. Funny enough, a dev I know has been working on a software-sync tech that he had hoped would "kill VRR once and for all", but he kept running into issues of practical benefits vs VRR and limitations due to smooth-brained game devs blatantly defying tech standards.

And that is the point here; VRR takes care of the overall situation quite nicely regardless, while the "alternatives" require everything to function exactly as expected, which is comically unrealistic. BFI, as an example, is only "ideal" under the expectation that the content you watch/play has a completely perfect frame-time graph with absolutely zero variance. The moment a spike occurs, you are going to have a very bad if not painful time with the display. It was a part of why I had strongly suggested a change of blanking philosophy.
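As a toy illustration of what I mean by frame-time spikes (the frame times are made up and the model tracks nothing except when a finished frame can reach the screen):

[code]
# Toy model: when does each finished frame reach the screen?
# Fixed refresh (v-sync style): a finished frame waits for the next 1/120 s tick.
# Idealized VRR: the display refreshes the moment the frame is ready
# (its minimum/maximum refresh range is ignored here for simplicity).
import math

REFRESH_HZ = 120
TICK_MS = 1000.0 / REFRESH_HZ

frame_times_ms = [8.0, 8.0, 13.0, 8.0, 8.0]  # hypothetical, one spike in the middle

t = 0.0
print("frame  ready(ms)  fixed-refresh scanout(ms)  VRR scanout(ms)")
for i, ft in enumerate(frame_times_ms):
    t += ft                                   # moment the GPU finishes this frame
    fixed = math.ceil(t / TICK_MS) * TICK_MS  # next fixed refresh boundary
    print(f"{i:5d}  {t:9.2f}  {fixed:25.2f}  {t:15.2f}")
[/code]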

At the end of the day, there still hasn't been anything close to VRR when it comes to general-case improvements, and Apple has set the table in VRR standardization with the ProMotion in the iPhone 13. I want a situation where BFI is automatically compatible with VRR due to this thing called real world usage, and the current "BFI or VRR" binary choice is both irritating and nonsensical.
Well, yes. I wouldn't call them inept. It's just what the boss tells them to do. They're busy. They won't waste 5 minutes on a thing for which they are not paid.
Nintendo Switch doesn't have BFI despite the fact the flicker is not as visible on a small screen.
Faceboolus Quest 2 doesn't have a PC port.
New LG OLED panels for 2022 don't have the proper BFI customization menu while the older line (2021) had it.
PS5 doesn't allow the user to disable automatic low latency mode, which DISABLES the BFI option on most TVs (this is simply insanely stupid)

and so on, and so on..
It is not even that. The business people are the relative leaders, and the engineers constantly paint themselves as the leaders of information. Due to the nature of what they do, they are both secluded in their own bubbles of yes-men. The business people, notably CEOs and especially investors, have shown very little ability to care about what they have gotten into. It is about extracting the most money regardless and driving profits and constant "growth" without a care in the world about what that growth entails or what penalties it has. Capitalism, baby!

As for the engineers, they are not a lot better in the grand scheme of things. They have a holier-than-thou attitude about just about everything under the sun, even when the subject in question involves a lot of social context. I have seen for myself the effects of anti-social behaviours and how commonplace they are among engineers. They often portray themselves as the ultimate information gods, and they are rarely willing, let alone able, to accept critical feedback on the information they provide, especially where social context and understanding is involved. If you provide them the kind of information that is beyond their limited scope of understanding, it would be silly not to expect them to become an incredible nuisance to talk to. I will just leave it to you to guess and figure out what specific kind of toxic mindset they would unironically praise.

The fact of the matter is, you cannot confine yourself to the physics "as is" or the "raw science" of a situation. Social context is completely unavoidable, and that is a part of the point I wanted to deliver in this thread. Understanding personal limits and unbiased views of a tech is a million times more important than focusing on the dreamy desires of a few and being selective about the benefits and drawbacks. With the specific solutions I had brought up, such as interlacing and checkerboarding, the effects would, strictly speaking, have been a placebo, but in the scope of user reception and feedback that does not really matter. A nice-looking placebo will always be preferred over a "true" implementation that comes with drawbacks, and that is just how it is.

Low latency is very important in gaming, especially in the context of HFR, and even at 60Hz, so the need to disable such a mode is highly questionable in that scope. Sure, you may argue about it activating only in games, but that invites a hornet's nest of situations where low latency is still desirable and BFI is much less useful in relation. And as I said, BFI in the wild today has only one setting, on and off, and with the problems it has in relation to the real world and the absurd lack of fine-tuning, it raises the question of why anyone would want BFI in a gaming system.

RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: Better BFI Algorithms (Split Blanking / Timings / Progressive / etc)

Post by RonsonPL » 12 Jun 2022, 06:46

edit: sorry, got to run, I probably left like 30 typos in here, and can't fix it now, sorry in advance
Colonel_Gerdauf wrote:
11 Jun 2022, 11:01
You are describing all of this as a matter of marketing. In that respect, there are other issues that you yourself have acknowledged, and I will get to those later. But this issue is not specific to "gaming displays". Look around and you will notice that 24" 1440p is very hard to find as a whole, even in the enterprise, budget/scam, and general markets, and those that are there are ancient panels on clearance. The taste for IPS/VA is understandable to a point, but we are getting to the point in refresh where you cannot actually push refresh and response times any further. The typical G2G for IPS is around 7ms. G2G is not a fantastic measurement of how fast pixels actually change states, but without any common ground it is the best we have. Regardless, refresh rates do not work by magic; they rely on the limits of how fast pixels can fully change, and the frame time of a 120Hz display is 8.33ms. 144Hz would be 6.9ms, 165Hz would be 6ms, 240Hz would be 4.2ms, and so on. I think you can see where this becomes weird. The HFR displays using IPS ultimately use quite a bit of number-fudging and masking tricks to make it appear like IPS can do everything when it has clear limits.

This was a part of my hesitation towards the push for retina refresh rates: in order to actually make kilohertz displays, you would need a brand-new display tech that can surpass the pixel response limits of even TN.

And I have not even gotten to the specific limitations of VA or IPS, which can ruin the intended experience and turn things into a lottery. It does not need to be this way, but QA is notoriously inconsistent, with strange criteria.
But of course. I would go a step further. I hold the opinion that all the LCD progress is just BS. I can understand the difference between a 2000 monitor and a 2010 monitor, sure. I remember laptop displays in 1995. Surely there's no comparison. But since the introduction of LED backlights and strobing, there has been no progress. TNs from a decade ago are no worse than, or at least comparable to, the TNs produced now. As to the VAs - the improvements are still far from a "game changer". They were too slow 20 years ago. They were too slow 15 years ago. They are still too slow now. You cannot get motion clarity comparable to a CRT even at 60Hz. Monitors based on VA sold with "240Hz!" labels are pointless. It's just way too slow, and a higher refresh rate will only improve latency, although there are far more important factors than display latency after 120Hz which affect the total motion-to-photon lag.
So of course, LCDs are too slow for HFR. Chief tested a 540p 480Hz prototype years ago, and since then we more or less know what's possible. LCD, even with a TN, is way too slow for perfect 480Hz. But at least it gives a usable image in motion. It's at least something. Obviously VAs will never reach this level, and IPS is a bit better than TN, but for gaming it's still bad in terms of contrast. It can come close to TN, but it's impossible for an IPS to be as fast as a TN, so let's pull these out of the equation.
Like I mentioned, there is more to a good display than having good refresh, or even good resolution in isolation. If you are going to shove down my throat the "super duper fast" displays which have a comical 95 PPI, then you have wasted my time; it is as simple as that.
I would say exactly the same about motion clarity. If you gave me a MicroLED, absolutely perfect monitor/TV with a 150" screen size, with 16K resolution, perfect image quality in every aspect, but 120Hz without any strobe/BFI technology, I wouldn't spend even $100 on it if I were buying a gaming display.

Once again, the scope of source material needs to be taken into consideration. Where BFI would have helped is the lot of content which is capped to 24, 30, and 60 FPS, as everything else is variable (thus VRR being the actual holy grail) or at the very least capped high enough that BFI would not be necessary.
As a person who absolutely hates the idea of pushing VRR before perfect motion clarity is established and widely understood in the industry and among consumers, I must disagree once again. You wrote "everything else is variable" and that BFI benefits the 24 and 30fps content. That's not true. 24 or 30fps is not enough temporal resolution to reconstruct even medium-speed motion, even if you use AI from the year 3000. It simply won't catch objects moving within the frame, and accurate AI prediction based on pure guesswork is impossible. You can clearly observe this in racing games. Play a blur-free racing game and start accelerating. Once you cross a certain threshold of speed, you will see the illusion breaking apart. With wide-FOV racers at very high speeds, even 60fps as a source may be insufficient. Actually, I can clearly tell the limits of 60fps and 120fps on my CRT-like TN, low-persistence (BFI) monitor in futuristic racing games where you go over 600km/h. So, back to 30fps: it will start failing at very low speeds, unless you switch to a narrow-FOV view and pick the camera hovering above the car to slow the perceived speed down. As to "the others are variable": no. There is no well-established standard for above-60fps content, sadly, but I don't see where you got the "all is variable" from. There is no standardized variable-framerate content as far as I am aware. There is 120fps content. Some. Not much.
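To put rough numbers on why low frame rates cannot carry fast motion, here is a tiny sketch with made-up geometry (a 2560 px wide view and an object crossing it in one second):

[code]
# How far does a moving object jump between two consecutive frames?
# The numbers are made up: a 2560 px wide viewport and an object that crosses
# the whole screen in one second (a fast pan, nothing extreme for a racer).
SCREEN_WIDTH_PX = 2560
CROSSING_TIME_S = 1.0

def step_per_frame_px(fps: float) -> float:
    """Pixel displacement between two consecutive frames."""
    speed_px_per_s = SCREEN_WIDTH_PX / CROSSING_TIME_S
    return speed_px_per_s / fps

for fps in (24, 30, 60, 120, 240):
    print(f"{fps:>3} fps: the object jumps ~{step_per_frame_px(fps):5.1f} px per frame")

# ~107 px steps at 24 fps vs ~21 px at 120 fps: the lower the frame rate, the
# bigger the gaps that no display trick can fill in without interpolation.
[/code]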
Variable refresh rate is a trojan horse on so many levels. Even for the future. If you get locked content, let's say 90fps or 100fps, you can interpolate it well to 1000fps. But if you get variable content, there will be an issue interpolating it well without artifacts or jitter.
One day the sun will come out again; we'll eventually get something good for motion, maybe MicroLEDs in the 2030s. And seriously, I'd rather have a locked 90fps game than 95-120fps VRR. The latency improvement is small. The motion clarity issue is way more important.
Those would involve the balancing act between "tolerable" flicker and the noticeable reductions of motion blur. You can get around the original issue here by simply using higher clock ratios such as 1/4 or 1/6, but it gets to the point where the reduction gets "neutered" so that is a moot point.
That's the idea I mentioned earlier as stupid. Any deviation from the fps=Hz rule will absolutely ruin the picture in motion and render the whole BFI idea pointless.
But about the compromise: I think 120fps content should be the target, the "holy grail". For people who still complain about flicker at 120Hz (that would be a minority of people, mind you) there should be an easy option to interpolate from 120fps to 240fps BFI, or to HFR once the display technology allows it. The latency penalty would be reasonable, and for competitive games or simply fast shooters controlled with a mouse, people could just pick the desired mode. A match lasting 40 minutes will not cause headaches at 120fps BFI for at least 90% of gamers. I'd assume more towards 99%.
On that note, why exactly are the blanks clocked? That was never sufficiently explained to me. It would have been far easier to fine-tune, and electronically simpler, to have the blanks timed in relation to the active frame, and have the blank done at the start of the frame.
You want to display the frame as soon as you can to minimize latency. You display the frame, and the amount of time it remains on the screen determines the precision of the motion representation. The longer it stays on screen, the blurrier the perceived motion will look. On the other hand, the longer the frame remains visible, the brighter the image will be and the less flicker will occur. So you need the blank to be at the time it makes the most sense. If your question was about more technical aspects, then you'll need to ask the Chief, as I am not the guy who knows much about the engineering science behind displays.
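To put rough numbers on that trade-off, here is a small sketch based on the rule of thumb from the Blur Busters articles (about 1 pixel of motion blur per 1 ms of persistence at 1000 pixels/second of tracked motion); the refresh rate, tracking speed and duty cycles are only example values:

[code]
# Strobe-length trade-off, using the usual Blur Busters rule of thumb:
# perceived motion blur (px) ~= persistence (ms) * tracking speed (px/ms).
# Brightness is assumed to scale with the fraction of the frame the panel is
# lit (i.e. no voltage-boosted strobe), which is a simplification.
REFRESH_HZ = 120
FRAME_MS = 1000.0 / REFRESH_HZ
TRACKING_SPEED_PX_PER_S = 2000.0  # e.g. a fast pan

def blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    return persistence_ms * (speed_px_per_s / 1000.0)

for duty in (1.00, 0.50, 0.25, 0.10):   # fraction of each frame kept lit
    persistence = FRAME_MS * duty
    print(f"duty {duty:4.2f}: persistence {persistence:5.2f} ms, "
          f"~{blur_px(persistence, TRACKING_SPEED_PX_PER_S):4.1f} px of blur, "
          f"relative brightness ~{duty:.0%}")
[/code]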
There is physics, and there is the user experience. Physics is static, meaning that there is always a workaround to a problem somewhere. User experience is a mess that requires, at the very least, proper social awareness to sift through. I will get to the latter point later. If it is truly a matter of "eye hurt, or motion blur", then that is about the most idiotic kind of compromise to make, and especially foolish when it is being shoved onto the general user.
I am not the guy who brags about stuff whenever he can, and I have very little knowledge about anything else, but in terms of motion clarity I can say, without lying, that I'm a practical expert. I was discovering the issues with motion 20 years ago, finding the stroboscopic effect which negatively impacts motion fascinating. I was playing around with games at 170fps 170Hz. I was testing 50fps content at 50Hz. I was testing 85fps content at 170Hz. I started looking at motion interpolation as soon as it appeared, and I've spent hundreds of hours comparing CRT and LCD monitors side by side. Even three monitors at one point: CRT, strobed LCD and non-strobed LCD. I've spent tens of hours analyzing the psychology side of this.
So. What I meant by using "physics" is that there's no way to change how the human eye-brain system works. You follow the object, and the said object should remain within the motion vector, as Chief's article "Why do OLEDs have motion blur" explains. You simply cannot disrupt it and expect no negative effects. What could have been invented was already invented. The rolling-scan BFI has some upsides but it also has its downsides, about which you'll learn more if you watch some Michael Abrash and John Carmack keynotes from the Oculus Connect conferences from 2013-2016 (sorry, I don't remember which one). They explain the problem of the image "bending" with specific motion in VR. The idea of simulating a CRT's phosphor fade that the Chief talks about is a good counter-flicker remedy, but that's basically it. You cannot invent something rounder than a wheel.
There's a lot to be done with display tech and with implementations, but you cannot go against what the "Why do OLEDs have motion blur" article says. It simply wouldn't work unless you introduced some computer-brain interface to bypass the human eyes, which cause the issue due to their physical/biological nature.

On a technical level, I very strongly disagree. First, the thing about HDR is not about being super-bright. When it talks about nits, it is speaking of the relative vibrancy values from section to section. Think of it like the power rating of a computer PSU; it is only talking about the maximum that can be reached when and where it is needed. This is actually really close to real life, as when you walk outside, there are sharp differences in "brightness" everywhere that are not properly captured on a regular display. Looking at the real world hasn't bothered people, so why should HDR?
As for the searing brightness, that is down to faulty implementations in specific mismatches between the specs and the content on display, and that can be easily adjusted at the hardware as well as the OS level. In fact, the issue on Windows has been dealt with thanks to AutoHDR; HDR mode is no longer static when turned on, and it can dial itself down when showing SDR content, even on a pixel-to-pixel basis. In addition, the parameters can be tuned by the user to suit their personal needs and limits. BFI does not have this benefit, especially according to what Chief is telling me about the alternatives I have brought to the table, and according to my observations of BFI implementations out in the wild.

As I wrote earlier: because you have 220° of vision in real life and the TV covers like 20-50° of it. Because in real life you don't go from night to the middle of a sunny summer day in 1 second. Because it matters at what TIME of day you play. Because you are a biological machine, which is programmed to work in certain ways through thousands of years of evolution. I'm far from an expert on health, but I'll try to explain some things to you anyway. Feel free to dive into proper science sources to learn more about these things, if you are not scared of changing your opinion on HDR.
First of all, let's start with the invalid argument I saw multiple times in most of the pro-HDR reviews and articles: that real life can easily hit 10,000 nits, so TVs should get that too. That's true, but the fact that the sun gives a ton of light doesn't mean you can look at it without damaging your eyes. The specular reflection on the metallic car hood in your example is what damages your eyes in real life. You should minimize looking at such things as much as you can, because with age your eyes will get damaged. Caring for your eyes means you start having issues with night vision and color vibrancy at the age of 80 instead of the age of 50. Or at the age of 30 if you believe that the idiot in the hospital who calls herself a doctor deserves to be called one. I lost like 20 years of brightness degradation within one single day, because the "doctor" told me I could go home even when I asked "but to analyze my eyes, you applied the fluid which made my eyes turn into night mode, losing the ability to adjust to higher brightness. Won't that hurt my eyes?". She said "no, it won't, or if it will, you won't notice it, just don't stare at the sun". That was a lie. Trust me. You want your eyes to age later rather than sooner.
Why did she tell me that, and what does it have to do with HDR? Well, it's about the fact that we are all different. Different age, different health, different genes. What you can stomach with your HDR TV set without harmful effects may hurt someone else, permanently. Just as my eyes were more prone to damage than those of the other patients the idiot doctor dealt with, you can hurt someone by generalizing too much.

Then there is the fact that it's very hard to measure the damage. Even harder to notice it. It's not like you can see it from day to day (unless it was very severe damage, but that's unlikely from a single game session on an HDR TV). Just like you don't realize your hearing gets worse every year, but if you could hear as you did 20 years earlier, you'd immediately notice a big difference (for example, no 50-year-old can hear the high tones which are normal for young people to hear).
The fact that close to nobody complains about health issues yet doesn't mean there's no issue.

So, in real life, your whole FOV gets the higher brightness. It gets time to adjust. The body's regeneration system works better to minimize the damage at 2PM than when someone plays an HDR game in the middle of the night.
Then there's how prone to the damage you are. Sleep deficiency or a bad diet can affect it more than you think. We're talking about gamers here. All your "safety systems" are useless until you connect probes to the brain and analyze what happens in the eyes, which is obviously not going to happen. I didn't mean just the mismatch coming from the non-standardized HDR chaos - HDR10, Dolby HDR and all the others. The issue that various TVs have different panels. The fact that people set their TVs as they please. And so on. I understand this was a big problem, but I'm arguing here that even if you ignore this issue completely, HDR is still a threat to health. I can see it becoming less harmful for VR, as it's possible to know about the whole user's FOV. But even then, you'd need to adjust on a per-user basis for what the given user's body can safely handle. And this varies a whole lot.

Lastly, there are new studies suggesting we have underestimated the importance of NIR (near-infrared) light. In real life, you get this radiation in the morning, which "arms" your cells to be better prepared for the upcoming damage from the sunlight. After noon passes, before the sunset, you again get another portion of IR and NIR light without many of the harmful light wavelengths - again aiding your body's natural defense system in dealing with the damage. Some people don't go out at all. Have bad health. They are often gamers. Pushing HDR may turn out to be as harmful as cigarettes or "tons of sugar in everything". This is IMHO a big issue and is on a direct path towards hurting people. Lots of people.
Don't get me wrong. I understand what HDR's appeal is. I switched from a CRT monitor made in 1990 to one made in 1996. Then a few more times from a used CRT to a new CRT. Then the biggest difference - from a CRT monitor (around 90 nits) to a TV (brightness more towards 200 nits). The difference in contrast, color vibrancy and shadow detail was absolutely huge, and the fact that I have no choice but to use a crappy TN doesn't mean I'm an idiot who would say HDR doesn't help with certain content and in certain environments (especially when you watch something on a sunny day, in a sunny room - then having some parts of the screen at even 6000 nits is surely a good idea!).
But what happens with HDR now, the whole craze, is simply insanely stupid and dangerous. Especially when you realize the color vibrancy was taken away by post-process effects in games, to make the games look more natural/realistic, and now they push HDR displays so you can get some of that back. If you ever used a high-end CRT TV of smaller size (bigger size = less brightness per inch), you surely won't say it was missing anything in terms of brightness and general "punch". The lights were lit, the fire was "burning", all as it should be.
But not in a sunny room at 1PM. And maybe not for the 50-year-old people who write pro-HDR articles now. Well, yeah, maybe not for those.

Once again, strong disagreements, and this boils down to the kinds of content that is on screen. As I explained earlier, VRR is genuinely useful in situations where frame rates, and more importantly frame times, are not consistent. And this is in just about everything that you or I have used.
I disagree with your disagreement. If your content's frame rate is inconsistent, your content is broken. Simple as that. If you could make stable 60fps on the PlayStation 1 in 1995, you can surely do stable 120fps on the PS5. But the industry's level of professionalism fell by a lot, and now you get insanely stupid situations like the DIRT 5 game, which holds 90fps on consoles, most often holds 100, but has no stable 90fps mode on consoles, no 100Hz. And the 120Hz mode it has won't work with BFI, as it has to be 120fps at 120Hz to make it work. That's not a problem with BFI. That's a problem with idiots ignoring the issue and the benefits of BFI, so it's underutilized and basically ruined. Why would the console lack 90Hz output? There's no hardware reason not to offer this. OK, let's pick something more mainstream. 100Hz. Nope. Not even that. Why wouldn't the game dev set the optimization parameters to 120fps? Surely if you can get Switch ports of PS5 games, you can move some damn slider a bit to improve the minimum framerates. That often doesn't require any work, as the engines used for most of the games can do these things automatically. You'll just get more pop-in, less detailed and more jagged shadows, etc. Surely better to offer such a compromise than no mode capable of flickerless motion or reasonable quality in a racing game (racing! - so dependent on the immersion of speed!)
As to other content. Again, if your non-gaming content is VRR, it's broken too. I won't talk about this though, as I don't know anything about VRR non-gaming content and honestly, don't want to. I wouldn't watch such a thing anyway, unless it can be displayed with BFI properly (so maybe a movie at 100-120fps interpolated to 1000, then sure, why not ;) )

Well, you'll see the issue clearly, as even your "tribe" (VRR over BFI ;) ) will feel the problems of the industry's stupidity once we start getting stupid UE 5.0 games at 30fps. At 35-40fps, no VRR will be able to work well anyway. By this logic, VRR would need to change, as the 35fps content clearly shows it's not beneficial ;)



In fact, inconsistent frame times are how the GPU gets any work done at all without a massive penalty in latency. GPUs, if you don't know, are dynamic electronics, and will dial in their workload based on power, thermals, and what exactly is asked of them by the millions of APIs that do the same thing in a million ways. Funny enough, a dev I know has been working on a software-sync tech that he had hoped would "kill VRR once and for all", but he kept running into issues of practical benefits vs VRR and limitations due to smooth-brained game devs blatantly defying tech standards.
I'm no dev, but I've been reading stuff about GPUs since before the first one arrived on the shelves, and I know about frame pacing issues, and about double and triple buffering techniques.
But for a stable framerate, it matters more what your CPU single-threaded performance is, what your overall CPU performance is, and what your CPU-RAM (or CPU-cache in certain scenarios) performance is. If you have issues with frame delivery, it's usually either due to how your engine was designed (badly) or because you suffer from insufficient CPU performance.
And that is the point here; VRR takes care of the overall situation quite nicely regardless, while the "alternatives" require everything to function exactly as expected, which is comically unrealistic. BFI, as an example, is only "ideal" under the expectation that the content you watch/play has a completely perfect frame-time graph with absolutely zero variance. The moment a spike occurs, you are going to have a very bad if not painful time with the display. It was a part of why I had strongly suggested a change of blanking philosophy.
From my POV, it achieves nothing but some latency improvements, which are marginal compared to other latency-affecting factors. An inconsistent framerate without BFI will still always mean the image in motion is completely broken and unusable. The fact that the game will feel smoother and more responsive doesn't change that. And the fact that the devs are too stupid to control their framerate, or to prioritize it over fancy new graphics rendering technologies, is another story.


As for the engineers, they are not a lot better in the grand scheme of things. They have a holier-than-thou attitude about just about everything under the sun, even when the subject in question involves a lot of social context. I have seen for myself the effects of anti-social behaviours and how commonplace they are among engineers. They often portray themselves as the ultimate information gods, and they are rarely willing, let alone able, to accept critical feedback on the information they provide, especially where social context and understanding is involved. If you provide them the kind of information that is beyond their limited scope of understanding, it would be silly not to expect them to become an incredible nuisance to talk to. I will just leave it to you to guess and figure out what specific kind of toxic mindset they would unironically praise.
Oh the irony. You wouldn't believe how toxic people get when I argue about motion clarity importance. :D
Even if I present facts and explain.
The fact of the matter is, you cannot confine yourself to the physics "as is" or the "raw science" of a situation. Social context is completely unavoidable, and that is a part of the point I wanted to deliver in this thread. Understanding personal limits and unbiased views of a tech is a million times more important than focusing on the dreamy desires of a few and being selective about the benefits and drawbacks. With the specific solutions I had brought up, such as interlacing and checkerboarding, the effects would, strictly speaking, have been a placebo, but in the scope of user reception and feedback that does not really matter. A nice-looking placebo will always be preferred over a "true" implementation that comes with drawbacks, and that is just how it is.
But understanding the importance of motion clarity and the benefits of BFI (due to the technology for any other solution not being available in the near-term future) cannot reach the mainstream. It becomes a niche because it's hard to understand. As I mentioned, it has too many pitfalls, and people are not willing to become scientists to analyze what happens in their own brains while they are having fun (or not). I'll give you an example: I posted like 1000 forum replies and threads about it. Most people disagreed and became toxic after I refused to accept their narrative that I'm just a weird guy who focuses on motion too much. The average ratio of convinced people was like 1/20. But about 15 people bought the monitors I recommended and gave BFI a try. Out of those who had never played on a clear-motion display before, 80% sent me DMs later on, thanking me for being persistent enough to convince them. Out of the 5 people I made a "presentation" for, sitting them down in front of 2 monitors side by side, the number convinced came out to be 5. Not convinced: 0. Out of those 5, 2 had been really strongly stubborn and sure they wouldn't change their minds.
And.. out of those convinced people, some turned back around after some time, due to the psychological trap I'd have to write a whole wall of text about, but in short: the information in the brain evaporates and the gamer stops being aware of what he/she's missing.
Low latency is very important in gaming, especially in the context of HFR, and even at 60Hz, so the need to disable such a mode is highly questionable in that scope. Sure, you may argue about it activating only in games, but that invites a hornet's nest of situations where low latency is still desirable and BFI is much less useful in relation. And as I said, BFI in the wild today has only one setting, on and off, and with the problems it has in relation to the real world and the absurd lack of fine-tuning, it raises the question of why anyone would want BFI in a gaming system.
But latency is not dependent on VRR alone. It's not even the major part of it.
Once you move from 60fps v-sync to 120fps v-sync, the latency improves by a lot. And if your code is properly written and the environment configured (OS, API, driver, etc.), it's good enough even without VRR. Then, if you need more, you can simply disable v-sync. The motion quality will degrade by a lot, but it still won't be worse than VRR without BFI. This means you can get comparable or even better latency without VRR if you play at 120. And if you are a game dev creating a game in which latency matters even a bit, you should NOT even consider making your game for anything under 120fps.

Then we have higher refresh rates. At 240fps, the difference between VRR and v-sync will be microscopic.
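As a deliberately oversimplified illustration of why (it counts only the average wait for the next refresh tick plus one refresh of scanout, and ignores the game engine, input and OS entirely):

[code]
# Deliberately oversimplified display-side latency model for v-sync:
# a finished frame waits, on average, half a refresh period for the next tick,
# then takes one full refresh to scan out. Engine, input and OS latency are
# ignored on purpose - this is only the part that the refresh rate controls.
def vsync_display_latency_ms(refresh_hz: float) -> float:
    frame = 1000.0 / refresh_hz
    return frame / 2 + frame  # average wait for the tick + one scanout

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz v-sync: ~{vsync_display_latency_ms(hz):5.2f} ms display-side latency")

# ~25 ms at 60 Hz vs ~12.5 ms at 120 Hz vs ~6.3 ms at 240 Hz: the step from
# 60 to 120 removes more latency than any sync-mode choice can win back at 240 Hz.
[/code]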

Look around at what happens in the industry. We are about to enter the 30fps swamp again due to the hype for UE 5.0 and ray tracing. Look at the RDR2 game. It prioritized animation quality and buffering techniques to push the visuals so much that the latency reached 150-200ms. Even 250ms in some cases. The same happened with GTA IV and V on PS3 and X360.
60fps online shooter games can vary by TWICE as much in latency just because of how they are written. You can get only a fraction of this difference back thanks to VRR. Seriously, there's no need to push VRR everywhere. You just don't push 4K, triple buffering and super-complex shaders into fricking Quake Arena-like games. ;)
(and yeah, that happens in the industry nowadays)

So, instead of fixing their wrongdoing by improving the games and the approach to displays (TV and monitor manufacturers are also to blame), we got something which, by trying to fix one thing a bit (latency), is breaking another even more (VRR is the enemy of BFI and motion clarity nowadays, at least for now).
The same goes for variable rate shading, which tries to save some GPU power draw to get more performance in a static image, but at the cost of motion quality.
You shouldn't turn the problem upside down. Just like I wouldn't agree in 2030 that motion clarity is completely unimportant because 100% of the games use the stupid temporal AA methods and motion blur, and the image looks like Fallout 76 (a synonym ;) ) in motion due to variable rate shading anyway ;)
