Software Based BFI [Is it Better than Hardware Sometimes?]
- i_apocalypseon
- Posts: 25
- Joined: 25 Mar 2024, 12:59
Software Based BFI [Is it Better than Hardware Sometimes?]
I realized I had this up in the wrong section. I'm looking for some clarity on how this works. Is hardware BFI superior? What about displays that are high refresh rate (240 Hz, 360 Hz, etc.) but don't have HW BFI? What about SW BFI on LCDs vs OLEDs? Will the usually slower pixel refresh times of a VA panel, for instance, be problematic? Let's say the software (ReShade, Retroarch) inserts a black frame every other frame at 120 Hz or 180 Hz. Does that double the motion resolution? Or is motion resolution capped at the panel's max refresh rate?
- Chief Blur Buster
- Site Admin
- Posts: 11895
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
- Contact:
Software Based BFI [Is it Better than Hardware Sometimes?]
i_apocalypseon wrote: ↑28 Mar 2024, 12:57 "I realized I had this up in the wrong section. I'm looking for some clarity on how this works."
You came to the right place for BFI science!
More reading can be found in the Blur Busters Research Portal (Area 51)
Short Answer: "Depends. Yes & No."
Long Answer:
Usually, hardware-native BFI outperforms software BFI in most metrics, especially persistence. You can get less motion blur if the hardware supports sub-refresh BFI: basically, inserting black periods without needing extra refresh rate.
However, there are some cases where software BFI outperforms hardware BFI. For example, on LG OLED 4K TVs, a video processor called the Retrotink 4K does BFI with a brighter picture and lower latency than the television's built-in BFI.
Advanced Explanation of Brighter: Normally, BFI will dim the picture. But the Retrotink converts SDR content to HDR and brightens the picture to compensate for the BFI dimming. So you can do 50% BFI (120Hz BFI output for 60Hz input) with the same brightness as non-BFI, when using the Retrotink 4K on LG OLED TVs.
Advanced Explanation of Less Lag: Partial beam-racing techniques in the FPGA. The Retrotink 4K does beamraced buffering (raster scanout) of the input signal, which means it begins outputting the visible frame only half a refresh cycle after the input signal starts being buffered. So human eyes can see the top of the output frame even before the input frame has been fully buffered. Although the input signal is a slower 1/60sec scan and the output signal is a faster 1/120sec scan, the box can still begin outputting the visible frame a little in advance of finishing buffering the input frame. (Remember, signals transmit one pixel row at a time, top to bottom, both over the cable and on the panel, ala high speed videos at www.blurbusters.com/scanout ...)
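The half-refresh head start described above falls out of simple timing arithmetic. A toy sketch (Python; numbers taken from the 60 Hz-in / 120 Hz-out example above, ignoring porches and processing overhead):

```python
# Toy latency model for beamraced scale-conversion.
# Each output pixel row can only be emitted after the matching input row
# has been buffered; the binding constraint is the LAST row of the frame.

input_scanout = 1 / 60    # seconds to receive one full input frame
output_scanout = 1 / 120  # seconds to emit one full output frame

# Earliest moment the output frame can START so that its last row is
# emitted no earlier than the last input row arrives:
earliest_start = input_scanout - output_scanout

print(f"Output can begin {earliest_start * 1000:.2f} ms after the input "
      f"frame starts arriving (half of a 1/60 sec input refresh)")
```

This is why the top of the output frame is visible to human eyes before the bottom of the input frame has even finished arriving over the cable.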
In theory, a pull-request modification to Retroarch could do all of this (especially if using lagless vsync, which WinUAE supports but nobody has added to Retroarch yet).
i_apocalypseon wrote: ↑28 Mar 2024, 12:57 "What about displays that are high refresh rate 240 Hz, 360 Hz, etc. that don't have HW BFI? What about SW BFI on LCDs vs OLED?"
BFI on OLEDs massively outperforms BFI on LCDs, because there's less image degradation from various logic such as LCD voltage inversion, and you avoid the LCD image-retention issue from software black frame insertion. You can use an odd divisor (InputHz:OutputHz) or insert a phase-switch frame. The LCD Saver mode on the Retrotink 4K does this automatically, for example.
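To see why an odd divisor helps against LCD retention, here's a simplified polarity model (Python, my own illustrative sketch; real panels use row/column inversion patterns, but the polarity-balance principle is the same). The panel flips drive polarity every refresh; with an even divisor the visible frames always land on the same polarity (DC bias, retention risk), while an odd divisor alternates them:

```python
def visible_frame_polarities(divisor: int, frames: int = 6) -> list:
    """Drive polarity (0 or 1) seen by each visible frame in a BFI cadence.

    Simplified model: the panel alternates inversion polarity every refresh,
    and one refresh out of every `divisor` refreshes shows the visible frame.
    """
    return [(i * divisor) % 2 for i in range(frames)]

# Even divisor (e.g. 120Hz output for 60fps input): same polarity every time
print(visible_frame_polarities(2))  # [0, 0, 0, 0, 0, 0] -> DC bias, retention risk
# Odd divisor (e.g. 180Hz output for 60fps input): polarity balances out
print(visible_frame_polarities(3))  # [0, 1, 0, 1, 0, 1] -> balanced, safe
```

A phase-switch frame achieves the same balance by occasionally shifting which refresh in the cadence carries the visible frame.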
i_apocalypseon wrote: ↑28 Mar 2024, 12:57 "Will the usually slower pixel refresh times of a VA panel, for instance, be problematic?"
Yes, it definitely can be. Software-based black frame insertion is massively superior on OLEDs versus LCDs.
However, LCDs can still do it, but you should consider using a strobe backlight such as the ViewSonic XG2431, which can do a 60Hz hardware strobe; see www.blurbusters.com/xg2431
i_apocalypseon wrote: ↑28 Mar 2024, 12:57 "Let's say the software (ReShade, Retroarch) inserts a black frame every other frame at 120 Hz or 180 Hz. Does that double the motion resolution?"
Motion blur is proportional to pixel visibility time.
More refresh rate (output Hz) is better for reducing 60fps motion blur:
120Hz BFI for 60fps can reduce 1/2 = remove 50% of 60Hz blur
240Hz BFI for 60fps can reduce 3/4 = remove 75% of 60Hz blur.
360Hz BFI for 60fps can reduce 5/6 = remove 83% of 60Hz blur.
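The percentages above follow from persistence arithmetic: with one visible refresh per input frame, persistence drops from 1/60 sec to 1/outputHz sec. A quick check (Python, assuming ideal BFI with fully black inserted refreshes):

```python
def bfi_blur_reduction(input_fps: float, output_hz: float) -> float:
    """Fraction of sample-and-hold motion blur removed by ideal software BFI.

    Assumes one visible refresh per input frame and the rest of the cadence
    black, so persistence shrinks from 1/input_fps to 1/output_hz.
    """
    return 1.0 - input_fps / output_hz

for hz in (120, 240, 360):
    print(f"{hz} Hz BFI for 60 fps removes "
          f"{bfi_blur_reduction(60, hz):.0%} of 60 Hz blur")
```

So the answer is yes: doubling the refresh rate of the BFI cadence halves the remaining persistence, but with diminishing returns as a percentage of the original 60 Hz blur.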
Please study this animation: TestUFO Variable-Persistence BFI Animation For Viewing On 240Hz Monitors. If you don't have a 240Hz display, reduce the number of UFOs accordingly, to get the highest BFI framerates given your limitations in maximum refresh rate.
In general, mathematically:
Motion blur of BFI/strobed/impulsed displays ∝ pulse time
Motion blur of pure sample-and-hold displays ∝ frame time
So you can understand that display motion blur (eye-tracking-based motion blur, ala www.testufo.com/eyetracking or www.testufo.com/framerates-versus ...) is dictated by the length of time a unique frame (its pixels) is contiguously visible.
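Since blur is dictated by visibility time, the perceived blur width in pixels is simply tracking speed times persistence. A small illustration (Python; the 960 px/sec panning speed is just an example number):

```python
def motion_blur_px(speed_px_per_sec: float, persistence_sec: float) -> float:
    """Perceived eye-tracking blur width: tracking speed x pixel visibility time."""
    return speed_px_per_sec * persistence_sec

speed = 960.0  # example panning speed in pixels/sec
print(f"60 Hz sample-and-hold: {motion_blur_px(speed, 1 / 60):.1f} px of blur")
print(f"1 ms strobe pulse:     {motion_blur_px(speed, 1 / 1000):.2f} px of blur")
```

This is why a short strobe pulse can beat even very high sample-and-hold refresh rates: a 1 ms pulse has the persistence of a hypothetical 1000 Hz sample-and-hold display.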
But you also want to avoid multiple-impulsing an image, or you get duplicate image artifacts (like CRT 30fps at 60Hz or strobed 60fps at 120Hz).
Even software BFI can generate this: TestUFO BFI Double Image Artifacts, so make sure your flicker-rate is the same as frame-rate, if you don't like duplicate images.
i_apocalypseon wrote: ↑28 Mar 2024, 12:57 "Or is motion resolution capped at the panel's max refresh rate?"
For software-based BFI without any help from the hardware -- YES.
Motion blur cannot be reduced below the refresh time of one max-Hz refresh cycle (e.g. 1/240sec on a 240Hz panel).
The only way to do better is hardware-based methods that can do sub-refresh black frame insertion (temporally partitioning a refresh cycle into multiple "refreshes", like a visible frame followed by a black frame, timed within one refresh cycle), or a strobe backlight, etc.
However, for desktop OLED, even most hardware-based BFI is limited by MaxHz, because some (not all) OLED panels are unable to do sub-refresh BFI. So you can only do BFI at less than max Hz.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on: BlueSky | Twitter | Facebook
Forum Rules wrote: 1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
2. Please report rule violations If you see a post that violates forum rules, then report the post.
3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!
- i_apocalypseon
- Posts: 25
- Joined: 25 Mar 2024, 12:59
Re: Software Based BFI [Is it Better than Hardware Sometimes?]
Man, I really flunked this time. I had no idea your reply was sitting here for well over a month. Thanks so much, Chief! That's really informative. I didn't quite grok everything, but this still gets us most of the way.
- i_apocalypseon
- Posts: 25
- Joined: 25 Mar 2024, 12:59
Re: Software Based BFI [Is it Better than Hardware Sometimes?]
Chief Blur Buster wrote: ↑02 Apr 2024, 14:47 "However, for desktop OLED, even most hardware-based BFI is limited by MaxHz, because some (not all) OLED panels are unable to do sub-refresh BFI. So you can only do BFI at less than max Hz."
Does this mean hardware BFI on a VA LED can achieve motion clarity higher than the max Hz? Like, say it's capped at 165 Hz.
What about using software BFI, like desktop BFI or Retroarch BFI, with a 240 Hz panel? Is that a good compromise where HW BFI is not available? Also, on a 60 Hz VA, Retroarch BFI flickers a lot.
- Chief Blur Buster
- Site Admin
- Posts: 11895
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
- Contact:
Re: Software Based BFI [Is it Better than Hardware Sometimes?]
i_apocalypseon wrote: ↑12 Oct 2024, 14:17 "Does this mean hardware BFI on a VA LED can achieve motion clarity higher than the max Hz? Like, say it's capped at 165 Hz."
LCD motion blur reduction can be extremely good and vastly exceed OLED motion clarity, with a bunch of compromises (quality best at framerate=Hz, less brightness, added flicker, typically poorer colors, more stroboscopic effects, etc).
There's a technical catch too: To avoid strobe crosstalk, the LCD pixel transitions (GtG) must be hidden in the dark cycle between strobe flashes. Some examples of high-speed video of strobe backlights:
High speed videos of LightBoost
This is because not all pixels refresh at the same time:
High speed videos of LCDs and OLEDs
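The crosstalk constraint above can be stated as a simple timing check (my own simplified model; real panels have per-row scanout skew and non-uniform GtG, so treat this as a back-of-envelope sketch):

```python
def crosstalk_free(refresh_hz: float, pulse_ms: float, gtg_ms: float) -> bool:
    """True if pixel transitions (GtG) can complete entirely inside the
    dark period between strobe flashes (ignores scanout skew)."""
    dark_ms = 1000.0 / refresh_hz - pulse_ms
    return gtg_ms <= dark_ms

# Hypothetical panel with 5 ms worst-case GtG and a 2 ms strobe flash:
print(crosstalk_free(120, 2.0, 5.0))  # 120 Hz: ~6.3 ms dark period -> True
print(crosstalk_free(240, 2.0, 5.0))  # 240 Hz: ~2.2 ms dark period -> False
```

This is why slow VA transitions are problematic at high strobe rates: the dark period shrinks with refresh rate, but GtG does not.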
i_apocalypseon wrote: ↑12 Oct 2024, 14:17 "What about using software BFI, like desktop BFI or Retroarch BFI, with a 240 Hz panel? Is that a good compromise where HW BFI is not available?"
Software BFI is great for getting access to low refresh rates like 60 Hz. Sometimes you want a longer duty cycle (only 50-75% blur reduction). But software BFI performs much better on OLEDs, since the BFI acts at the panel level rather than the backlight level, and it requires fast GtG to really look good. Also, be aware of inversion artifacts.
There are some things you can do better with LCD, and other things you can do better with OLED. It's largely a personal preference, but one can usually tolerate slightly more motion blur if it's "clean" motion blur (e.g. no crosstalk, no ghost-behind effect, etc).
- i_apocalypseon
- Posts: 25
- Joined: 25 Mar 2024, 12:59
Re: Software Based BFI [Is it Better than Hardware Sometimes?]
Right. Thanks, Mark! At present I am pleasantly surprised at how well my Sony TV implements hardware BFI. I don't observe anything unusual that sticks out, and I can compensate for the lack of brightness. With so much of retro gaming locked at 60 fps, the LED motion mode running at 60 is a treat. Best of all, I only perceive flicker when I'm browsing or reading text, never while gaming. Of course, I haven't gamed a hell of a lot.
Say, on a slightly tangential topic, I tried the TV's BFI mode for a movie too. And nothing really stood out, as bad or good.
- Chief Blur Buster
- Site Admin
- Posts: 11895
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
- Contact:
Re: Software Based BFI [Is it Better than Hardware Sometimes?]
i_apocalypseon wrote: ↑21 Oct 2024, 13:47 "Say, on a slightly tangential topic, I tried the TV's BFI mode for a movie too. And nothing really stood out, as bad or good."
To look like an improvement (to 35mm film purists), the TV BFI mode needs to be double-strobe or triple-strobe, rather than preserving the ugly 3:2 pulldown during strobed mode.
I don't know if your TV's BFI supports this during movie modes. Plasma displays sometimes did the double- or triple-strobe mode during movies (24p with each frame strobed an equal number of times). You do get a double- or triple-image effect, but without the ugly 3:2 pulldown judder, and old 35mm projectors had this effect too.