RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Talk to software developers and aspiring geeks. Programming tips. Improve motion fluidity. Reduce input lag. Come Present() yourself!
User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 16 Sep 2020, 15:49

<Future Screen Technology Discussion>
ophidon wrote:
16 Sep 2020, 13:45
As for 1000hz screens in the future, as long as they don't remove the ability to run at 'low' Hz like 180 or 240, the issue should remain resolved if it can be with current technology. While I would certainly like to see a 1000 Hz display that could be bright enough at a 1/1000, or even 10/1000 frame pulses for amazing clarity, I remain doubtful that will -ever- exist. :D
Just as Blur Busters loves to mic-drop a lot of outdated 30fps-vs-60fps myths, that is the wrong perspective (like Newtonian thinking versus Einsteinian thinking), for multiple reasons.

(A) We don't necessarily assume single-Hz granularity.
1000Hz can also mean configurable phosphor-fade persistence, like configuring to 1ms, 1.5ms, 2ms, or 2.3ms phosphor fade, simply by having an alpha-blend-to-black fade chasing behind the rolling scan. The more Hz we have, the more accurately we can emulate the progress of phosphor decay in a CRT electron gun emulator (which will also make things brighter).
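
To make that concrete, here's a minimal sketch (illustrative only, not RetroArch code; the function name and numbers are made up) of how finer refresh granularity gives more sub-refresh brightness steps for approximating an exponential phosphor fade:

```c
/* Illustrative sketch only -- not RetroArch code. Computes per-subframe
   brightness weights approximating an exponential phosphor decay, assuming
   the output refresh rate is an integer multiple of the emulated 60Hz. */
#include <math.h>
#include <stdio.h>

static void phosphor_weights(double out_hz, double decay_ms, double *w, int n)
{
    double subframe_ms = 1000.0 / out_hz;
    for (int i = 0; i < n; i++)
        w[i] = exp(-(i * subframe_ms) / decay_ms); /* fade since the "beam" hit this line */
}

int main(void)
{
    double w[4];
    phosphor_weights(240.0, 1.5, w, 4);            /* 240Hz output, 1.5ms decay constant */
    for (int i = 0; i < 4; i++)
        printf("subframe %d: %.3f\n", i, w[i]);    /* ~1.000, 0.062, 0.004, 0.000 */
    return 0;
}
```

At 240Hz you only get four decay steps per 60Hz frame; at 1000Hz you would get sixteen, so the emulated fade curve becomes smoother and can be tuned to whatever decay constant you want.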

(B) HDR support brings the necessary brightness needed.
I have seen the Sony 10,000-nit prototype display at CES. It was an LCD. It was amazing; the overall image wasn't too bright, but the headroom was used to make highlights brighter (streetlamps at night, neon lights at night, sun glints reflecting off a chromed car). Those looked whiter than white. The headroom of 10,000 nits still leaves 625 nits for 60Hz with a 1ms phosphor decay. Since HDR is limited to only a few pixels per refresh cycle anyway, the fact that we can use a rolling-scan bar means we can satisfy the "HDR budget" simply by brightly illuminating only a few pixels at a time. CRT phosphor is similarly bright (5-figure nits), so HDR is the knight in shining armour that rescues the CRT emulation problem by 2030+. An LED capable of 10,000 nits is pretty cheap; the problem is turning that into a FALD backlight in a three-figure-priced gaming monitor. But FALD is no problem if you are willing to pay four figures for a display. Let's not exclude four-figure displays from the quote of "I remain doubtful that will ever exist".
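
For the arithmetic behind that 625-nit figure, the average brightness of a pulsed display is just peak luminance times duty cycle (back-of-envelope, approximating a 60Hz frame as 16ms):

```latex
L_{\text{avg}} = L_{\text{peak}} \times \frac{t_{\text{pulse}}}{T_{\text{frame}}}
             = 10000\,\text{nits} \times \frac{1\,\text{ms}}{16\,\text{ms}}
             \approx 625\,\text{nits}
```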

(C) FALD innovations and MicroLED innovations
I've seen FALD LCDs with contrast and colors superior to OLED; they actually exist. The magic is a sufficiently high-resolution FALD (full array local dimming) backlight with enough LED elements. Plus, MicroLED displays are easily capable of simultaneous "super bright HDR" + "super high Hz" with the right driving electronics.

As a person who has seen multiple 1000Hz-through-2880Hz displays in prototype laboratories, I also know that LED ribbons (300-LED) have become cheap on Alibaba for under $10. Example: a 1500-nit 32x32 screen for only EIGHT DOLLARS -- it's a jumbotron module designed to be built into big stadium screens, but hobbyists use them as miniature low-resolution screens too. Some of them cheaply reach 10,000+ nits. There is a time coming when low-resolution MicroLED sheets (e.g. 1000-pixel to 10,000-pixel screens) will fall below $100. (Even JumboTron 32x32 modules are now only a few dollars off Alibaba.) And did you know that the jumbotron module internally refreshes at 1920 hertz? For only 8 dollars! If we swapped the 60Hz frame buffer chip with a custom FPGA, the eight-dollar JumboTron module becomes a low-resolution 1920 Hertz screen! Eventually we will have JumboTron-superbright monochrome LED sheets available, which can be commandeered into an ultrabright FALD backlight. Just a few modifications, slap them behind a commodity LCD, and you've got a relatively kick-ass FALD. Right now FALD is the territory of 4-figure and 5-figure priced screens that match or beat OLED quality, so most people don't believe LCD can become as bright and as contrasty -- which is a shame, because there's an amazing amount of headroom left in LCD.

Now, did you know direct-view MicroLED is simply a miniaturized JumboTron shrunk to retina resolution for desktop-sized/wall-sized use? And that it can blow past 10,000 nits?

All the tech exists; it just has not been glued together yet into an integrated solution, and it will require a beefy GPU with frame rate amplification technology. But ultimately, ultrahigh-Hz combined with ultra-HDR is not unobtainiumly expensive. 4K was a $10,000 lab curiosity in 2000; now it's a $299 Walmart special. Ultrahigh-Hz and HDR don't necessarily need to be impossible stuff, especially with the slow mainstreamification of 120Hz (new smartphones, tablets, etc). DLP chips are already refreshing at 1440Hz-2880Hz today, using 1-bit temporal dithering to generate low-Hz full color, and are relatively cheap, but aren't suitable for desktop monitors. If you haven't seen thousands of prototype/experimental displays on convention and corporate trips like I have, then judgements of "impossible" are likely premature. ;)

The tech filters down really slowly, but so did 4K (over 20 years). Now you see Apple/Samsung working to mainstream 120Hz this decade, and all the 120Hz initiatives going on (NHK 8K 120Hz broadcasts of the Tokyo Olympics). 240Hz might even mainstream by the 2040s, while 1000Hz remains an enthusiast curiosity -- as long as there are humankind benefits. If 240Hz can be done for only $3 to $5 extra with no impact on battery life, for 4x clearer browser scrolling, Apple/Samsung will do it, for example. It's a matter of the economics of the falling cost of high Hz. 2030? 2040? 2050? Who knows when mainstream 240Hz arrives, but it isn't "never". What this does is push the bleeding edge up further (like 1000Hz ultra-HDR displays) through the pressure of technology progress, which solves our CRT-electron-beam-emulator problem.

I'll leave that as a mic drop.

I see the canary in the coal mine of the emerging technology of the future -- so never say never.

It may be a while before HDR converges with ultra-Hz, but it will happen before 2100. Perhaps by 2025. Or 2030. I'm guessing 2030-ish. But it is not equivalent to "never". There are quite a lot of engineering paths towards a cheap 1000Hz display that also includes HDR -- but it will probably take approximately a decade (give or take; could be 2025 or 2040).

The "never will happen" stuff tends to be Newtonian thinking around Blur Busters -- winky wink. ;)

This is the type of stuff Blur Busters mythbusts -- the Blur Busters mission also includes stopping the 1000Hz laughter relatively quickly, with articles such as Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays. Even the diminished 240Hz-vs-360Hz difference (from LinusTechTips) is partially from GtG limitations combined with LinusTechTips' testing methods -- GtG limitations lagging behind Hz innovations, much like how the first 240Hz monitor from 2017 was more ghostly and motion-blurry than today's more-refined 240Hz monitors.

Most of the mainstream masses are spoiled by the mostly unchanged look of desktop LCDs from 2005-2020. From the first Samsung 2ms TN 1080p/1200p displays to today's 1ms 1080p 240Hz displays, they all have similar contrast ratios of about 1000:1, similar brightnesses of 200-to-400 nits, and similar color gamut (~72% NTSC), even through the transition from CCFL to the first LED backlights. If you're one of the masses spoiled by this, you ain't seen anything yet -- have you ever seen a desktop monitor driven by a 10,000-LED backlight? Totally different ballpark, sometimes better than OLED. It looks like an OLED, sometimes better, and the brightest whites exceed 1000 nits. With tens of thousands of LEDs in a locally dimmed backlight, the blooming is gone -- smaller than the bloom around a CRT phosphor dot.

Eventually, monochrome FALD MicroLED sheets will be mass-manufactured -- since 10,000-LED monochrome MicroLED backlight sheets are much cheaper than 1080p direct-view MicroLED displays / JumboTrons -- so ultra-high-quality FALD LCDs in the sub-$1000 price range will arrive before direct-view full-color RGB MicroLED displays (ala a miniaturized JumboTron, or miniaturized desktop-sized Samsung "THE WALL" display panels). So expect nit-rocket FALD desktop monitors before then. Just set up a factory to manufacture FALD sheets as cheaply as cheap JumboTron modules, and you can commoditize FALD just fine, with no Hz limitations. It's only a matter of time -- other tech may arrive sooner (OLED? Direct-view MicroLED? Blue-phase microsecond-GtG LCDs? Etc), but the fact remains that there are multiple cheap tech paths to nits+Hz.

We never needed high Hz on CRT because Hz didn't help CRT as much -- CRT was already zero-blur (due to the flicker strobe effect, i.e. low persistence). But the only way to achieve low persistence flickerlessly is ultra framerate + ultra Hz, and VR's push to pass the reality test (real life equalling VR) is also putting upward pressure on refresh rates, now that strobeless low persistence has finally become technologically possible (at least in the laboratory). Since the humankind benefits are there, it's only a matter of time before it becomes cheap enough to include such refresh rates in screen technology (much like how 4K and retina resolutions no longer cost much above low resolution).

Such a display can in theory (but increasingly experimentally proven, bit by bit) temporally emulate the look and feel of the refreshing pattern of any legacy display (within human vision integration timescales). Want the display to emulate the look of a plasma screen (including christmas-tree effects and contouring)? Want the display to emulate the look of a CRT tube (including zero blur + phosphor feel + shadowmask)? A retina-Hz, retina-rez, retina-HDR display becomes a venn diagram big enough to emulate past displays. 1000Hz may not be enough for accurate emulation of every legacy display, but it will begin passing a lot of A/B blind tests for a lot of people (flat-tube CRT versus electron-gun-emulated ultra-Hz+rez+HDR digital display).

We've got multiple humankind technology-benefit progressions that will eventually force ultrahigh-Hz, ultrahigh-HDR, and ultrahigh-rez to converge simultaneously -- the recipe necessary for an accurate CRT electron gun emulator.

I talk to many researchers who are the refresh rate equivalent of 1990s 4K researchers or 1980s Japan HDTV researchers. Sure, the 1950s and 1960s predicted wall-hangable TVs by the 1970s, and it took a lot longer, but it wasn't never. We see the same situation here. The Blur Busters prime directive is to pull the needle forward as early as possible, as cheaply as possible.

For those unaware, Blur Busters is two parts --
...Blur Busters Media writes simplified stuff in Popular Science style to mythbust the refresh rate race (descendants of the 30fps-vs-60fps myths)
...Blur Busters Laboratory does the more advanced stuff like contracts with manufacturers, and I have to visit a lot of conventions (pre-COVID anyway), so I get to see the future of displays much earlier than many. Our Area 51 Display Research Articles on BlurBusters.com contain a lot of content that is now textbook reading at NVIDIA and other places, since they are easier Popular Science-style treatments of boring advanced scientific papers, helping to push the needle of the refresh rate race.

Have I thrown down the microphone hard enough yet? I have more... :D

</Future Screen Technology Discussion>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by thatoneguy » 17 Sep 2020, 07:40

MicroLED is the golden child of display technology.
Speaking of it, I wonder why these displays like Crystal LED and Samsung's Wall are limited to the 1000-2000 nit range?
If you know anything I would like to be enlightened about it.

Also a bit off-topic but speaking about theoretical displays...I wonder what's the progress on the ever so elusive Holographic technology?
Always thought that would be CRT perfected in the far future...but I doubt I'll live to see it.
With Holographic technology we could display pixels of any shape. It would be backwards compatible with so much old antiquated content that used non-standard pixel aspect ratios. Of course because of that it would also be able to display any resolution natively too.
But the biggest thing would be it being completely agnostic when it comes to display aspect ratios and allowing so much more freedom because of that.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 17 Sep 2020, 12:36

<Future Screen Technology Discussion>
thatoneguy wrote:
17 Sep 2020, 07:40
Speaking of it, I wonder why these displays like Crystal LED and Samsung's Wall are limited to the 1000-2000 nit range? If you know anything I would like to be enlightened about it.
I think they're being conservative to minimize wear and tear. If you have seen a very aged, very old JumboTron, you'll notice that they have uneven wear (dimmer modules, tinted squares, dead LEDs, etc). Overdriving LEDs for sunlight visibility requires humongous nits, and running such a screen 24/7 creates such wear patterns.

They want THE WALL to last a long time, so 2000 nits is actually quite reasonable for now. But ultimately, there's a wide envelope of nit headroom from dim to superbright -- LEDs are now used to light up stadiums and projectors, so there's really no real limit to how bright they can be, but it's a cost/longevity/electronics tradeoff.

Theoretically, even though LEDs are efficient, a 10,000-nit "wall" may use more power at peak brightness than a common electrical circuit can provide, so it's possibly also a power-supply-side issue -- they keep within a wattage budget. A desktop-sized display wouldn't have this problem.
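
As a rough sanity check (illustrative assumptions only: approximately Lambertian emission, ~100 lm/W LED efficacy, a 10 m² wall at full white), the power draw at 10,000 nits could indeed exceed a household circuit:

```latex
\Phi \approx \pi L A = \pi \times 10000\,\tfrac{\text{cd}}{\text{m}^2} \times 10\,\text{m}^2 \approx 3.1\times10^{5}\,\text{lm},
\qquad
P \approx \frac{\Phi}{100\,\text{lm/W}} \approx 3.1\,\text{kW}
```

That is well past a typical 1.8kW (15A/120V) outlet, whereas a desktop-sized panel is only a small fraction of that area.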

Samsung's THE WALL is now a production display currently being installed in high-end offices/buildings/museums/mansions (it's custom built to order at whatever size you want, since it's modular like a conventional JumboTron, but with LEDs miniaturized to "retina" resolution at ~10 foot viewing distance), while the 10,000-nit technologies are still prototypes.

Technically they can make THE WALL brighter already, but power/cost/wear/etc are concerns for extra nits, and this is just the first commercialized "MicroLED" display. The lab-prototype MicroLEDs are even brighter; there's even a 2-million-nit MicroLED designed to be illuminated in a more continuous manner as long as it is heatsinked/water-cooled -- those would be great for projectors. (The projector bulb is the screen, and the screen is the projector bulb!)

Two million nits for MicroLED! (At least for the tiny ones.) So MicroLED can already go practically sun-bright, given sufficient cooling. However, heat/power/cooling/consistency/etc are clearly engineering challenges at really high power levels on really small screens, though that will improve over time, I'd imagine.

2000 nits is plenty bright already; 10,000 nits is mainly useful for peak HDR brightness capability, but it would be great to commandeer that for a software-based CRT electron gun emulator at sub-refresh-cycle granularity.
thatoneguy wrote:
17 Sep 2020, 07:40
Also a bit off-topic but speaking about theoretical displays...I wonder what's the progress on the ever so elusive Holographic technology?
Always thought that would be CRT perfected in the far future...but I doubt I'll live to see it.
With Holographic technology we could display pixels of any shape. It would be backwards compatible with so much old antiquated content that used non-standard pixel aspect ratios. Of course because of that it would also be able to display any resolution natively too.
But the biggest thing would be it being completely agnostic when it comes to display aspect ratios and allowing so much more freedom because of that.
There isn't much on the lab radar when it comes to holographic displays for commodity uses (phones, tablets, monitors, laptops, televisions, etc) but there is certainly development for speciality uses.

There are cons that go with the pros, including how high-resolution the holographic display can become, how to refresh the holographic display quickly, and which technique of holography or "holography" (like Looking Glass, which isn't real holography in laser parlance) is used to generate the holographic screen, whether for 2D virtues (resolutionless flexibility, etc) or 3D virtues, etc. All of them have some cons, such as distortion off-axis, low resolution that betrays the screen technology, laser speckle that's hard to despeckle, plain lenticular-lens parallax artifacts (if done that way), or other limitations. They're great for various purposes but aren't easily able to handle high-Hz retina 2D planar tasks nearly as well as OLED and even LCD currently can.

I consider holographic screens a niche-purpose / specialized tech for the foreseeable future. This could change later, but nothing appears on the 10-year radar (unlike 1000Hz LCDs and 1000Hz OLEDs / MicroLEDs, for which engineering paths now exist).

</Future Screen Technology Discussion>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


ophidon
Posts: 14
Joined: 18 Mar 2014, 07:57

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by ophidon » 17 Sep 2020, 22:33

Good news! I have reliably and repeatedly tracked down a major culprit of the intermittent flashing on my system, and now can run for an hour without a single flash, and it wasn't Retroarch itself causing it.

MSI Afterburner was causing it. Not RTSS, which I would have understood, with its serious overlay/hooks. But no, just Afterburner proper with no enabled overlay, and with power monitoring off (power monitoring has been known for years to cause microstuttering). I have verified that this isn't placebo on my system with multiple reboots, turning Afterburner back on and off, and watching the issue correctly reappear and disappear.

Outside of this scenario, whatever performance impact Afterburner is having wouldn't be noticeable. At an actual 60fps, instead of what is essentially still 180fps, the odds of causing a frame skip would be significantly lower. And for modern high-Hz games, where people tend to vastly prefer G-SYNC over BFI, it would only cause an imperceptible ~0.5ms delay in the next frame every minute or so (and no flash).

I would never say this is a 100% fix, as any stuttering that -is- caused by Retroarch will still cause the same issue... but with a moderately strong modern system, I can now assure you it is solvable for less strenuous cores. Most of my testing has been with NES and SNES cores, and it seems plausible that PSX or other strenuous cores would be more likely to cause actual Retroarch stuttering, but at the same time, a majority of games for those systems actually run at 30fps, where 60Hz BFI is nearly useless anyway.

Anyway, I am continuing to clean up the code and rebase for a push to Retroarch proper; hopefully it will be done this weekend. As of now I consider this an almost flat-out superior solution for retro 60Hz gaming over hardware strobing (as it exists on current screens that can be forced to strobe at 60Hz).

thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by thatoneguy » 18 Sep 2020, 08:56

I knew about that 2 million nit thing. I keep up with MLED info overall.
It's mostly being considered for AR and Mobile Screens for obvious reasons but I do hope it happens in consumer screens in the future because so much content out there is low framerate so having that head room would be helpful for eliminating blur via strobing or bfi(especially for HDR Films and stuff).

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 18 Sep 2020, 12:33

ophidon wrote:
17 Sep 2020, 22:33
Good news! I have reliably and repeatedly tracked down a major culprit of the intermittent flashing on my system, and now can run for an hour without a single flash, and it wasn't Retroarch itself causing it.
Congratulations! Solving the weak links is the most important move.

As for G-SYNC, 0.5ms stutters are generally imperceptible at current commodity refresh rates (e.g. 6.94ms persistence for a 144Hz sample-and-hold display), but I've heard that some emulator modules stutter WAY more than others, and I have heard of G-SYNC stutter complaints in RetroArch...

<Aside>
Theoretically, a 0.5ms stutter can become visible on 1000Hz+ (preferably 2000Hz+) retina-resolution displays (4000 pixels/sec with a 0.5ms Present()-to-Photons error = a 2 pixel stutter-jump). Once refresh duration falls near or below the stutter error margin, the stutter rises above the human visibility noisefloor. So shorten the refresh duration, as a whack-a-mole manoeuvre in this refresh rate race to retina refresh rates. At high refresh rates, G-SYNC becomes unnecessary since the tiny refresh granularity provides a nice per-pixel VRR effect. You can play 24fps, 25fps, 50fps, 59.94fps, and 60fps pretty much stutter-free in simultaneous windows on an ultra-Hz (1000Hz+) display, since blur from frame persistence is vastly bigger than the stutter/judder amplitudes at such framerate:refreshrate ratios. More useful reading in the Milliseconds Matters thread, where some milliseconds matter and others do not.
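
The arithmetic in that parenthetical is just tracking speed multiplied by timing error:

```latex
\text{stutter amplitude} = v \times \Delta t = 4000\,\tfrac{\text{px}}{\text{s}} \times 0.0005\,\text{s} = 2\,\text{px}
```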
</Aside>
ophidon wrote:
17 Sep 2020, 22:33
As of now I consider this almost a flat out superior solution for retro 60hz gaming over hardware strobing (as it exists on current screens that can be forced to strobe at 60hz).
This is true for some 240Hz screens but not all of them. There are some ultra-high-quality 60Hz single-strobe modes that produce pretty much the same appearance as high-quality software BFI (no color degradation, no brightness-loss difference, no inversion artifacts), and sometimes the hardware strobing is vastly superior. On others it's identical (OLED BFI looked identical for software BFI and hardware BFI on the LG 4K CX HDTV). If you haven't seen many displays, don't necessarily generalize. ;)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


ophidon
Posts: 14
Joined: 18 Mar 2014, 07:57

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by ophidon » 18 Sep 2020, 13:00

The most important difference for me between this software BFI and hardware BFI at 60Hz (of which I have 2 screens that can do it, one BenQ and one with the ULMB hack) is that the 60Hz flicker is -much- less straining to my eyes with software BFI. With the hardware 60Hz strobe I have to use it in a pitch-black room or I'll get a headache in short order; with this software BFI I can play in a fully lit room during the day with no issues. I presume, but I'm sure you can correct me if I'm wrong, that this is because the total luminance difference between on and off frames is smaller when the backlight stays constant.

This wouldn't be true for a self-emissive screen like OLED of course, but until MicroLED or whatever else comes to market, I would highly recommend against anyone playing significant hours' worth of pillarboxed 4:3 retro games on an OLED anyway. More than just the static UI elements, the burn-in from the pillars would become atrocious.

The lack of any strobe crosstalk at all, and being able to function on screens that don't have any hardware BFI support, are the other big advantages of course. And whatever small clarity is lost versus a hardware strobe of the same length as the equivalent amount of software BFI is, I'm sure, detectable by a pursuit camera, but it isn't to my eyes while in game.

Also, you may want to be aware that the Blur Busters Approved ViewSonic XG270 has practically no strobe crosstalk for low-Hz hardware strobing, since a 240Hz screen can be calibrated to near-zero crosstalk at low-Hz strobe rates.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 18 Sep 2020, 13:15

ophidon wrote:
18 Sep 2020, 13:00
The most important difference for me between this software bfi and hardware bfi at 60hz (of which I have 2 screens that can do, one benq and one with the ulmb hack), is that the 60hz flicker is -much- less straining to my eyes with software bfi.
In terms of physics, it is mainly a matter of global BFI versus rolling BFI. Not all pixels refresh at the same time, as seen in high speed videos of screen refreshing.

Motion blur reduction on commodity LCDs uses a global impulse (the entire backlight flashes at once between refresh cycles).

OLED does hardware-based rolling BFI, so the strain is identical there, since there's no GtG difference or rolling-scan difference.
ophidon wrote:
18 Sep 2020, 13:00
This wouldn't be true for a self-emissive screen like oled of course, but until microled or whatever else comes to market, I would highly recommend against anyone playing significant hours worth of pillarboxed 4:3 retro games on an oled anyway. More than just the static UI elements, the burn-in from the pillars would become atrocious.
It is true that rolling-impulse BFI is easier to achieve in hardware on a self-emissive display.

However, one can use a FALD backlight as a hardware scanning backlight to generate a rolling-BFI effect, if the monitor was designed as such. And hardware BFI on many OLEDs (such as the LG CX OLED) is already a rolling-impulse BFI.

But you are right: global BFI (like traditional ULMB) causes more eyestrain than rolling BFI (which software-based BFI ends up producing, because of the LCD scanout behavior).
ophidon wrote:
18 Sep 2020, 13:00
The lack of any strobe crosstalk at all, and being able to function on screens that dont have any hardware bfi support, are the other big advantages of course. And whatever small clarity is lost vs a hardware strobe of the same length as this equivalent amount of software bfi, I'm sure is detectable by a pursuit camera, but it isn't to my eyes while in game.
That's simply because of the low resolution of emulator games. It takes really high resolutions to see low-MPRT differences. For example, try the TestUFO Panning Map Test. Turn on ULMB with that, then adjust "ULMB Pulse Width" to 30. Now you can see a human-visible difference between 0.5ms MPRT and 1.0ms MPRT. This is visible in some FPS games when you use VSYNC ON at framerate=Hz (to eliminate microstutters), if you're using a 3200dpi mouse (at ultra-low in-game sensitivity settings) with turn speeds roughly equalling a screenwidth per second, in a game at least 2000 pixels wide. (0.5ms at 2000 pixels/sec is 1 pixel of motion blur, and 1ms means 2 pixels of motion blur.)

But emulator games are low resolution, like 320x200 rather than 2000 pixels wide, so ultra-low MPRT is unimportant in most emulator games. The point is that low MPRT matters more at higher resolutions than at lower resolutions.

It is true that most low-resolution arcade games are perfectly fine at 4ms MPRT (via 240Hz software BFI, 25%:75% ON:OFF duty cycle) or 5.5ms MPRT (via 180Hz software BFI, 33%:66% ON:OFF duty cycle). It's a massive upgrade, while preserving a partial rolling-scan effect.
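
For reference, those MPRT numbers are simply the visible fraction of the 60Hz frame period, and blur width is MPRT times tracking speed. A quick sketch (illustrative numbers only, not RetroArch code):

```c
/* Back-of-envelope helper: MPRT of software BFI is roughly the ON fraction
   of the emulated 60Hz frame, and blur width is MPRT times motion speed. */
#include <stdio.h>

int main(void)
{
    double frame_ms   = 1000.0 / 60.0;   /* emulated content frame: ~16.7ms        */
    double speed_px_s = 2000.0;          /* tracked motion speed in pixels/second  */

    double mprt_240 = frame_ms * (1.0 / 4.0);  /* 240Hz: 1 visible of 4 subframes -> ~4.2ms */
    double mprt_180 = frame_ms * (1.0 / 3.0);  /* 180Hz: 1 visible of 3 subframes -> ~5.6ms */

    printf("240Hz BFI: %.1f ms MPRT, %.1f px blur at %.0f px/s\n",
           mprt_240, mprt_240 / 1000.0 * speed_px_s, speed_px_s);
    printf("180Hz BFI: %.1f ms MPRT, %.1f px blur at %.0f px/s\n",
           mprt_180, mprt_180 / 1000.0 * speed_px_s, speed_px_s);
    return 0;
}
```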

For a 33%:66% duty cycle (1 visible frame + 2 black frames), 1/180sec scanouts mean that for 2/180sec something is illuminated and for 1/180sec there is complete darkness. But add some 1ms-2ms of GtG fuzz to those boundaries, and it's probably more like 2.5/180sec with something illuminated and 0.5/180sec with nothing illuminated (approx). This is because not all pixels refresh at the same time, as seen in high speed videos: www.blurbusters.com/scanout

Now, going to 240Hz, 300Hz, 360Hz, the scanout effect will become too fast/too brief in some ways to help flicker eyestrain (if GtG becomes too fast), but then the BFI sequence (for example "1,0.5,0,0,0,0" for a 360Hz monitor) can help mitigate some of that, while brightening the image again too.
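
The brightening from a softened sequence is easy to quantify: average brightness is just the mean of the per-subframe factors. A tiny sketch (hypothetical helper, not the mod's actual code):

```c
/* Illustrative only: average brightness fraction of a BFI subframe sequence. */
#include <stdio.h>

static float bfi_avg_brightness(const float *seq, int len)
{
    float sum = 0.0f;
    for (int i = 0; i < len; i++)
        sum += seq[i];
    return sum / (float)len;
}

int main(void)
{
    const float hard[6] = { 1, 0, 0, 0, 0, 0 };    /* plain 1-of-6 BFI at 360Hz */
    const float soft[6] = { 1, 0.5f, 0, 0, 0, 0 }; /* softened "1,0.5,0,0,0,0"  */
    printf("hard: %.0f%%  soft: %.0f%%\n",
           100.0f * bfi_avg_brightness(hard, 6),   /* ~17% of full brightness */
           100.0f * bfi_avg_brightness(soft, 6));  /* 25% of full brightness  */
    return 0;
}
```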

TL;DR: Software BFI does indeed have eye-friendliness advantages over global-impulse backlights in terms of flicker, due to the rolling-scan of LCD refresh, combined with flicker softening via GtG fade-in and fade-out.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 18 Sep 2020, 13:38

P.S. Hopefully you can consider the BFI-string technique at least optionally? And make it Hz adaptive.

One could also just pad with 0's for refresh rates requiring longer strings, and truncate the string for refresh rates requiring shorter strings. So one could specify "1, 0.5, 0.25" in a configuration file and automatically get "1, 0.5" for 120Hz, "1, 0.5, 0.25" for 180Hz, and "1, 0.5, 0.25, 0" for 240Hz. Then you wouldn't have to have separate strings for different refresh rates; the user would just specify ONE preferred universal string.
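
A minimal sketch of that pad/truncate idea (illustrative only -- not the mod's actual code, and the function/variable names are made up):

```c
/* Derive a per-subframe BFI brightness table from one universal string by
   truncating or zero-padding it to fit the subframes-per-frame count. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void build_bfi_table(const char *str, float *table, int n_sub)
{
    char buf[256];
    int i = 0;

    strncpy(buf, str, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    /* Take at most n_sub comma-separated values (extra entries are truncated). */
    for (char *tok = strtok(buf, ","); tok && i < n_sub; tok = strtok(NULL, ","))
        table[i++] = (float)atof(tok);

    /* Pad shorter strings with black (0.0) subframes. */
    while (i < n_sub)
        table[i++] = 0.0f;
}

int main(void)
{
    float table[8];
    int n_sub = 240 / 60;                           /* e.g. 240Hz output, 60fps content */

    build_bfi_table("1, 0.5, 0.25", table, n_sub);
    for (int i = 0; i < n_sub; i++)
        printf("subframe %d: %.2f\n", i, table[i]); /* 1.00, 0.50, 0.25, 0.00 */
    return 0;
}
```

The same single string then adapts automatically: at 120Hz it truncates to "1, 0.5", at 180Hz it is used as-is, and at 240Hz it pads out to "1, 0.5, 0.25, 0".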
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


ophidon
Posts: 14
Joined: 18 Mar 2014, 07:57

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by ophidon » 18 Sep 2020, 13:51

Yes, I am not against the idea at all. I never have an issue with more options existing than I would personally take advantage of. The comma-separated list doesn't exactly match how Retroarch seems to have designed its menu options to be programmed, but you could still easily do a predetermined set of patterns without a major rewrite. It's just that I wanted to get this version out first, before considering additions, in case life gets too busy soon (which is possible, as we have a new puppy arriving soonish to train).

Post Reply