RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

User avatar
Chief Blur Buster
Site Admin
Posts: 7857
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » 16 Sep 2020, 15:49

<Future Screen Technology Discussion>
ophidon wrote:
16 Sep 2020, 13:45
As for 1000 Hz screens in the future, as long as they don't remove the ability to run at 'low' Hz like 180 or 240, the issue should remain resolved if it can be with current technology. While I would certainly like to see a 1000 Hz display that could be bright enough at 1/1000, or even 10/1000, frame pulses for amazing clarity, I remain doubtful that will -ever- exist. :D
Much as Blur Busters loves to mic-drop a lot of outdated 30fps-vs-60fps myths -- that is the wrong perspective (like Newtonian thinking versus Einsteinian thinking), for multiple reasons.

(A) We don't necessarily assume single-Hz granularity.
1000Hz can also mean configurable phosphor-fade persistence -- configuring 1ms, 1.5ms, 2ms, or 2.3ms phosphor fade -- simply by having a faded alpha-blend-to-black region chasing behind a rolling scan. The more Hz we have, the more accurately we can emulate the progress of phosphor decay in a CRT electron gun emulator (which will also make things brighter).
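A minimal sketch of that idea, assuming a hypothetical 1000Hz panel displaying 60fps content (`subframe_brightness` and its parameters are illustrative names, not any real API):

```python
import math

def subframe_brightness(subframe_hz=1000, content_hz=60, decay_ms=1.0):
    """For each high-Hz subframe within one 60Hz content frame, return the
    relative brightness of a pixel, emulating an exponential phosphor decay
    that begins the moment the rolling scan passes that pixel."""
    subframes = subframe_hz // content_hz      # e.g. 16 subframes per content frame
    step_ms = 1000.0 / subframe_hz             # e.g. 1 ms between subframes
    levels = []
    for n in range(subframes):
        t = n * step_ms                        # time since the scan hit this pixel
        levels.append(math.exp(-t / decay_ms)) # simple fade-to-black curve
    return levels

levels = subframe_brightness()
# Changing decay_ms (1.0, 1.5, 2.0, 2.3, ...) reconfigures the emulated
# phosphor persistence without changing the panel's refresh rate.
```

The point of the sketch: the higher the subframe rate, the more sample points you get on the decay curve, so the emulated phosphor fade becomes smoother and more accurate.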

(B) HDR support brings the necessary brightness needed.
I have seen the Sony 10,000-nit prototype display at CES. It was an LCD, and it was amazing: it wasn't uncomfortably bright, because the headroom was used to make highlights brighter (streetlamps at night, neon lights at night, sun glinting off a chromed car). Those looked whiter than white. The headroom of 10,000 nits still leaves roughly 625 nits for 60Hz with a 1ms phosphor decay. Since peak HDR brightness is only needed on a few pixels per refresh cycle, and we can use a rolling-scan bar, we can stay within the "HDR budget" simply by only brightly illuminating a few pixels at a time. CRT phosphor is similarly bright (5-figure nits), so HDR is the knight in shining armour that rescues the CRT emulation problem by 2030+. LEDs capable of 10,000 nits are pretty cheap; the problem is turning that into a FALD backlight in a three-figure-priced gaming monitor. But FALD is no problem if you are willing to pay four figures for a display. Let's not exclude four-figure displays from the claim "I remain doubtful that will ever exist".
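That ~625-nit figure follows from simple duty-cycle arithmetic -- a minimal sketch, approximating the 1ms phosphor decay as a full-brightness 1ms pulse once per 60Hz refresh (`effective_nits` is an illustrative name; the exact result depends on how the ~16.7ms frame time is rounded):

```python
def effective_nits(peak_nits, pulse_ms, refresh_hz):
    """Average perceived brightness of a short full-brightness pulse
    repeated once per refresh cycle: peak brightness times duty cycle."""
    frame_ms = 1000.0 / refresh_hz        # 60 Hz -> ~16.7 ms per refresh
    return peak_nits * (pulse_ms / frame_ms)

print(effective_nits(10_000, 1.0, 60))    # -> 600.0 nits average
```

Rounding the frame time down to 16 ms gives 10,000 / 16 = 625 nits, the ballpark quoted above; either way, a 10,000-nit panel leaves a comfortably bright image even at 1ms persistence.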

(C) FALD innovations and MicroLED innovations
I've seen FALD LCDs with contrast and colors superior to OLED; they actually exist. The magic is a sufficiently high-resolution FALD (full-array local dimming) backlight with a great many LED elements. Plus, MicroLED displays are easily capable of simultaneous "super bright HDR" + "super high Hz", with the right driving electronics.

As a person who has seen multiple 1000Hz-through-2880Hz displays in prototype laboratories, I also know how cheap superbright LEDs have become: 300-LED ribbons sell on Alibaba for under $10, and a 1500-nit 32x32 JumboTron module -- designed to be built into big stadium screens, though hobbyists use them as miniature low-resolution screens too -- costs only EIGHT DOLLARS. Some of them cheaply reach 10,000+ nits. A time is coming when low-resolution MicroLED sheets (e.g. 1000-pixel to 10,000-pixel screens) will fall below $100. (Even JumboTron 32x32 modules are now only a few dollars off Alibaba.) And did you know that jumbotron module internally refreshes at 1920 hertz? For only 8 dollars! If we swapped the 60Hz frame buffer chip for a custom FPGA, the eight-dollar JumboTron module becomes a low-resolution 1920 Hertz screen! Eventually we will have JumboTron-superbright monochrome FALD sheets available, which can be commandeered into an ultrabright FALD. Just a few modifications, slap them behind a commodity LCD, and you've got a relatively kick-ass FALD. Right now FALD is the territory of 4-figure and 5-figure priced screens that match or beat OLED quality, so most people don't believe LCD can become as bright and as contrasty -- which is a shame, because there's an amazing amount of headroom left in LCD.

Now, did you know direct-view MicroLED is simply a miniaturized JumboTron, shrunk to retina resolution for desktop-sized/wall-sized use? And that it can blow past 10,000 nits?

All the tech exists; it just has not been glued together yet into an integrated solution, and it will require a beefy GPU with frame rate amplification technology. But ultimately, ultrahigh-Hz combined with ultra-HDR is not unobtainiumly expensive. 4K was a $10,000 lab curiosity in 2000; now it's a $299 Walmart special. Ultrahigh-Hz and HDR don't need to be impossible stuff, especially with the slow mainstreamification of 120Hz (new smartphones, tablets, etc.). DLP chips already refresh at 1440Hz-2880Hz today, using 1-bit temporal dithering to generate low-Hz full color, and are relatively cheap, but aren't suitable for desktop monitors. If you haven't seen thousands of prototype/experimental displays on convention and corporate trips like I have, then judgements of "impossible" are likely premature. ;)

The tech filters down really slowly, but so did 4K (over 20 years). Now you see Apple and Samsung working to mainstream 120Hz this decade, and all the 120Hz initiatives going on (NHK's 8K 120Hz broadcasts of the Tokyo Olympics). 240Hz might even mainstream by the 2040s, while 1000Hz remains an enthusiast curiosity. As long as there are humankind benefits -- if 240Hz can be done for only $3 to $5 extra, with no impact on battery life, for 4x clearer browser scrolling -- Apple and Samsung will do it. It's a matter of the economics of the falling cost of high Hz. 2030? 2040? 2050? Who knows -- but for mainstream 240Hz, it isn't "never". What this does is push the bleeding edge up further (like 1000Hz ultra-HDR displays) through the pressures of technology progress, which solves our CRT-electron-beam-emulator problem.

I'll leave that as a mic drop.

I see the canary in the coal mine of the emerging technology of the future -- so never say never.

It may be a while before HDR converges with ultra-Hz, but it will happen before 2100 -- perhaps by 2025, or 2030; I'm guessing 2030-ish. It is not equivalent to "never". There are quite a lot of engineering paths toward a cheap 1000Hz display that also includes HDR, but it will probably take a decade or so (give or take -- could be 2025, could be 2040).

The "never will happen" stuff tends to be Newtonian thinking around Blur Busters -- winky wink. ;)

This is the type of stuff Blur Busters mythbusts -- part of the Blur Busters mission is to stop the laughing at 1000Hz relatively quickly, with articles such as Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays. Even the diminished 240Hz-vs-360Hz difference (from LinusTechTips) is partially from GtG limitations, combined with LinusTechTips' testing methods -- GtG lagging behind Hz innovation is much like how the first 240Hz monitor from 2017 was more ghostly and motion-blurry than today's more refined 240Hz monitors.

Most of the mainstream masses are spoiled by the mostly unchanged look of desktop LCDs from 2005-2020. From the first Samsung 2ms TN 1080p/1200p displays to today's 1ms 1080p 240Hz displays, they all have similar contrast ratios of about 1000:1, similar brightnesses of 200-400 nits, and similar color gamuts (~72% NTSC), even through the transition from CCFL to the first LED backlights. If you're one of the masses spoiled by this, you ain't seen anything yet -- have you ever seen a desktop monitor driven by a 10,000-LED backlight? Totally different ballpark: it looks like an OLED, sometimes better, and the brightest whites exceed 1000 nits. With tens of thousands of LEDs in a locally dimmed backlight, the blooming is gone -- smaller than the bloom around a CRT phosphor dot.

Eventually, monochrome FALD MicroLED sheets will be mass-manufactured commodity stuff -- since 10,000-LED monochrome MicroLED displays are much cheaper than 1080p direct-view MicroLED displays / JumboTrons -- so ultra-high-quality FALD LCDs in the sub-$1000 price range will arrive before direct-view full-color RGB MicroLED displays (a la miniaturized JumboTrons, or miniaturized desktop-sized Samsung "THE WALL" panels). So expect nit-rocket FALD desktop monitors before then. Just set up a factory to manufacture FALD sheets as cheaply as cheap JumboTron modules, and you can commoditize FALD just fine, with no Hz limitations. It's only a matter of time -- other tech may arrive first (OLED? Direct-view MicroLED? Blue-phase microsecond-GtG LCDs? Etc.), but the fact remains that there are multiple cheap tech paths to nits+Hz.

We never needed high Hz for CRT, because extra Hz didn't help much -- CRT was already zero-blur (due to its strobe-like flicker, i.e. low persistence). But the only way to achieve low persistence flickerlessly is ultra-framerate plus ultra-Hz, and VR's push to pass the reality test (VR equalling real life) is also putting upward pressure on refresh rates, now that strobeless low persistence has finally become technologically possible (at least in the laboratory). Since the humankind benefits are there, it's only a matter of time before such refresh rates become cheap enough to include in screen technology (much like how 4K and retina resolutions no longer cost much more than low resolutions).

Such a display can in theory (but increasingly experimentally proven, bit by bit) temporally emulate the look and feel of the refreshing pattern of any legacy display (within human vision integration timescales). Want the display to emulate the look of a plasma screen (including Christmas-tree effects and contouring)? Want it to emulate the look of a CRT tube (including zero blur + phosphor-feel + shadow mask)? A retina-Hz, retina-rez, retina-HDR display becomes a Venn diagram big enough to emulate past displays. 1000Hz may not be enough for accurate emulation of every legacy display, but it will begin passing a lot of A/B blind tests for a lot of people (flat-tube CRT versus an electron-gun-emulated ultra-Hz+rez+HDR digital display).

We've got multiple humankind technology benefits whose progress will eventually force ultrahigh-Hz, ultrahigh-HDR, and ultrahigh-rez to converge simultaneously -- the recipe necessary for an accurate CRT electron gun emulator.

I talk to many researchers doing the refresh-rate equivalent of 1990s 4K research or 1980s Japanese HDTV research. Sure, the 1950s and 1960s predicted wall-hangable TVs by the 1970s, and it took a lot longer -- but it wasn't never. We see the same situation here. The Blur Busters prime directive is to pull that needle forward as early, and as cheaply, as possible.

For those unaware, Blur Busters is two parts --
...Blur Busters Media writes simplified stuff in Popular Science style to mythbust the refresh rate race (descendants of the 30fps-vs-60fps myths)
...Blur Busters Laboratory does the more advanced stuff, like contracts with manufacturers, and I visit a lot of conventions (pre-COVID, anyway), so I get to see the future of displays much earlier than most. Our Area 51 Display Research articles on BlurBusters.com contain a lot of content that is now textbook reading at NVIDIA and other places, since they are easier Popular Science formats of dry advanced scientific papers, helping to push the needle of the refresh rate race.

Have I thrown down the microphone hard enough yet? I have more... :D

</Future Screen Technology Discussion>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

thatoneguy
Posts: 48
Joined: 06 Aug 2015, 17:16

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by thatoneguy » Yesterday, 07:40

MicroLED is the golden child of display technology.
Speaking of it, I wonder why these displays like Crystal LED and Samsung's Wall are limited to the 1000-2000 nit range?
If you know anything, I would like to be enlightened.

Also a bit off-topic but speaking about theoretical displays...I wonder what's the progress on the ever so elusive Holographic technology?
Always thought that would be CRT perfected in the far future...but I doubt I'll live to see it.
With Holographic technology we could display pixels of any shape. It would be backwards compatible with so much old antiquated content that used non-standard pixel aspect ratios. Of course because of that it would also be able to display any resolution natively too.
But the biggest thing would be it being completely agnostic when it comes to display aspect ratios and allowing so much more freedom because of that.

User avatar
Chief Blur Buster
Site Admin
Posts: 7857
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by Chief Blur Buster » Yesterday, 12:36

<Future Screen Technology Discussion>
thatoneguy wrote:
Yesterday, 07:40
Speaking of it, I wonder why these displays like Crystal LED and Samsung's Wall are limited to the 1000-2000 nit range? If you know anything I would like to be enlightened about it.
I think they're being conservative to minimize wear and tear. If you have seen a very old, very aged JumboTron, you'll notice uneven wear (dimmer modules, tinted squares, dead LEDs, etc.). Overdriving LEDs for sunlight visibility requires humongous nits, and running such a screen 24/7 creates those wear patterns.

They want THE WALL to last a long time, so 2000 nits is actually quite reasonable for now. But ultimately there's a wide envelope of nit headroom from dim to superbright -- LEDs now light up stadiums and projectors, so there's no real limit to how bright they can be; it's a cost/longevity/electronics tradeoff.

Theoretically, even though LEDs are efficient, a 10,000-nit "wall" at peak brightness might draw more power than a common electrical circuit can provide, so it may also be a power-supply-side issue: they keep within a wattage budget. A desktop-sized display wouldn't have this problem.

Currently, Samsung's THE WALL is a production display being installed in high-end offices/buildings/museums/mansions. It's custom built to order at whatever size you want, since it's modular like a conventional JumboTron, but with LEDs miniaturized to "retina" resolution at ~10-foot viewing distance -- while the 10,000-nit technologies are still prototypes.

Technically they can already make THE WALL brighter, but power/cost/wear/etc. are concerns for extra nits, and this is just the first commercialized "MicroLED" display. The lab-prototype MicroLEDs are even brighter -- there's even a 2-million-nit MicroLED designed to be illuminated more continuously, as long as it is heatsinked/water-cooled. Those would be great for projectors. (The projector bulb is the screen, and the screen is the projector bulb!)

Two million nits for MicroLED! (At least for the tiny ones.) So MicroLED can already go practically sun-bright, given sufficient cooling. However, heat/power/cooling/consistency/etc are clearly engineering challenges at really high power levels on really small screens, though that will improve over time, I'd imagine.

2000 nits is plenty bright already; 10,000 nits is mainly useful for peak HDR brightness capability. But it would be great to commandeer that for a software-based CRT electron gun emulator at sub-refresh-cycle granularity.
thatoneguy wrote:
Yesterday, 07:40
Also a bit off-topic but speaking about theoretical displays...I wonder what's the progress on the ever so elusive Holographic technology?
Always thought that would be CRT perfected in the far future...but I doubt I'll live to see it.
With Holographic technology we could display pixels of any shape. It would be backwards compatible with so much old antiquated content that used non-standard pixel aspect ratios. Of course because of that it would also be able to display any resolution natively too.
But the biggest thing would be it being completely agnostic when it comes to display aspect ratios and allowing so much more freedom because of that.
There isn't much on the lab radar when it comes to holographic displays for commodity uses (phones, tablets, monitors, laptops, televisions, etc) but there is certainly development for speciality uses.

There are cons that go with the pros, including how high-resolution the holographic display can become, how to refresh it quickly, and which technique of holography or "holography" (like Looking Glass, which isn't real holography in the laser parlance) is used to generate the holographic screen, whether for 2D virtues (resolutionless flexibility, etc.) or 3D virtues. All of them have cons, such as off-axis distortions, low resolution that betrays the screen technology, laser speckle that's hard to despeckle, plain lenticular-lens parallax artifacts (if done that way), or other limitations. They're great for various purposes, but they can't handle high-Hz retina 2D planar tasks nearly as well as OLED or even LCD currently can.

I consider holographic screens niche, specialized tech for the foreseeable future. This could change later, but nothing appears on the 10-year radar (unlike 1000Hz LCDs and 1000Hz OLEDs/MicroLEDs, for which engineering paths now exist).

</Future Screen Technology Discussion>

ophidon
Posts: 8
Joined: 18 Mar 2014, 07:57

Re: RetroArch - Variable BFI Mod for 180Hz/240Hz/Etc

Post by ophidon » Yesterday, 22:33

Good news! I have reliably and repeatedly tracked down a major culprit of the intermittent flashing on my system, and now can run for an hour without a single flash, and it wasn't Retroarch itself causing it.

MSI Afterburner was causing it. Not RTSS, which I would have understood, with its serious overlay/hooks -- but Afterburner proper, with no overlay enabled, and with power monitoring off (power monitoring has been known for years to cause microstuttering). I have verified this isn't placebo on my system, with multiple reboots and turning Afterburner on and off while watching the issue reappear and disappear accordingly.

Outside of this scenario, whatever performance impact Afterburner is having probably wouldn't be noticeable. At an actual 60fps, instead of what is essentially still 180fps, the odds of causing a frame skip would be significantly lower. And in modern high-Hz games, where people tend to vastly prefer G-SYNC over BFI, it would only cause an imperceptible ~0.5ms delay in the next frame every minute or so (and no flash).
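A toy model of why a single hiccup is so visible with 60fps content on 180Hz BFI (assuming a pattern of one lit subframe followed by two black ones per content frame; `brightness_over_time` is an illustrative name, and a stall is modeled as the panel repeating the previous subframe for one refresh):

```python
def brightness_over_time(n_subframes, stall_at=None):
    """Simulate 180Hz scanout of 60fps BFI content (pattern: lit, black, black).
    If a stall happens at subframe index `stall_at` (>= 1), the previous
    subframe stays on screen for an extra refresh, shifting the pattern."""
    out = []
    i = 0  # position in the intended lit/black/black pattern
    for t in range(n_subframes):
        if t == stall_at:
            out.append(out[-1])              # missed flip: last subframe repeats
        else:
            out.append(1 if i % 3 == 0 else 0)
            i += 1
    return out

smooth = brightness_over_time(12)               # [1,0,0, 1,0,0, 1,0,0, 1,0,0]
glitch = brightness_over_time(12, stall_at=1)   # starts [1,1,0,...]
```

In the glitched run, the lit subframe is displayed twice in a row, so that frame's perceived brightness momentarily doubles -- a visible flash -- whereas at plain 60Hz without BFI the same stall would just extend one frame imperceptibly.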

I would never say this is a 100% fix, as any stuttering that -is- caused by RetroArch will still cause the same issue... but with a moderately strong modern system, I can now assure you it is solvable for less strenuous cores. Most of my testing has been with NES and SNES cores; it seems plausible that PSX or other strenuous cores would be more likely to cause actual RetroArch stuttering, but at the same time, a majority of games for those systems run at 30fps, where 60Hz BFI is nearly useless anyway.

Anyway, I am continuing to clean up the code and rebase for a push to RetroArch proper; hopefully it will be done this weekend. As of now I consider this an almost flat-out superior solution for retro 60Hz gaming over hardware strobing (as it exists on current screens that can be forced to strobe at 60Hz).
