Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Will future strobing tech be suitable for retro gaming?

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, Turbo240, ToastyX Strobelight, etc.

Will future strobing tech be suitable for retro gaming?

Postby Patrick » 08 May 2019, 12:01

I'm having a conversation with Umwelt on the Libretro Retroarch forums, and he had this to say:

While strobing tech can be expected to improve, I am less optimistic about its future suitability for retro gaming.

To emulate the CRT scan, which helps clear retained images from our eyes, what we want is strobing (distinct from BFI, which is an inferior solution) at 60Hz, to match the 60fps of most retro games. But it looks like manufacturers will instead focus on faster strobing, aimed at modern gaming, sports, movies, etc. Only a few older models strobed at 60Hz (often only accessible through service menus or other unorthodox means), and this may look too flickery for some people.

What we really want is a rolling scan strobe at 60Hz. I only know of a couple of earlier professional OLED monitors that have this feature, and I see no reason why manufacturers would have an incentive to develop and introduce such a feature on future consumer products. The “purist” (and by playing on non-CRT displays you would already not be much of a purist) retro gamer market is too small.


Now I'm worried. We need a way to display retro games correctly so that they can be properly preserved. What are some answers to this? I'm assuming 120Hz + BFI will still be a viable solution going forward, but is that correct? Since CRTs won't last forever and no new ones will ever be manufactured, will we one day find ourselves in a situation where there is no suitable display device for retro gaming? Or are my fears unfounded? Could a niche market develop for custom panels with the right specs? I want retro gaming to live forever!

EDIT: after doing some more reading, I'm now feeling pretty good about the future of display tech and its suitability for retro gaming. My current understanding (correct me if I'm wrong) is that strobing and BFI are merely stopgap technologies on the road to 1000Hz displays, and that 480Hz without strobing or BFI is sufficient for blur-free 60fps. So in 5-10 years we'll have 480Hz displays, and in 10+ years we'll have 1000Hz displays. In the meantime, we'll make do with strobing and BFI at 120+ Hz.
Patrick
 
Posts: 10
Joined: 13 Mar 2019, 21:38

Re: Will future strobing tech be suitable for retro gaming?

Postby Blural » 11 May 2019, 12:53

Higher refresh rate displays are largely irrelevant to retro gaming, as the game's framerate needs to match the display refresh rate to take advantage of the motion clarity benefit of the higher refresh rate. There are already LCD displays that achieve "perfect" motion clarity by single-strobing 60Hz for 60fps retro games, but there should be more monitors that allow strobing at this range. Unless I'm missing the point here, I don't see what else you could be talking about.

There are other reasons to use CRT displays for retro games that have nothing to do with display persistence, like scanlines, but I'm not an expert on that subject. So basically all you need is a monitor that properly strobes at 60Hz (CRT, BenQ XL2411Z, etc.) and you have "perfect" motion clarity. So yes, your fears seem to be mostly unfounded.

There's another issue: games that are locked at lower framerates like 30fps or 25fps. Strobing at low framerates such as these creates tons of flicker, but that's true of CRT too, and true of modern games. Motion interpolation is probably a better solution for extremely low framerates.
Blural
 
Posts: 26
Joined: 20 May 2014, 17:08

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 13:23

An ultra-high-Hz display (Retina refresh rates) can preserve the temporal look of a CRT more accurately.

For example, a future 1000Hz display could play back a 1000fps high-speed video (full dynamic range) of a CRT in realtime, with all the temporal effects (flicker, rolling scan, CRT phosphor decay) preserved practically to human-vision limitations.

Retina refresh rate sample-and-hold can emulate a broad range of impulse-driven display technologies:
- Successive refresh cycles can move a rolling scan bar downwards (blended-edge too)
- Successive refresh cycles can fade sequentially (phosphor fade emulation)

So even if there are no more impulsed 60Hz displays, "retina refresh rate" sample-and-hold can still temporally emulate all kinds of impulsing methods (rolling scan emulation, phosphor decay emulation, etc.).

Even the CRT skew effect (while rolling your eyes around) can be preserved with sufficiently high temporal resolution (quad-digit sample-and-hold refresh rates).

Today, low-Hz software BFI (such as 60Hz BFI at 120Hz) has poor color quality, especially on TN panels, because of how it interferes with LCD inversion algorithms. OLED does not have that particular problem (although different issues can pop up), so emulator software BFI looks amazing on 120Hz OLEDs, unlike on 120Hz LCDs. Now add more refresh cycles to more accurately emulate nuances such as phosphor fade, and to more accurately emulate the rolling-scan behaviours of various different CRTs... so much untapped potential! Hardware-based strobing is superior, but in some cases the quality is identical between software and hardware. As long as each individual refresh cycle has full bit depth, brute-Hz sample-and-hold enables many more ways to temporally emulate all kinds of impulse-driven displays.

Today, we are using retina resolution to accurately spatially emulate the textures of a CRT surface. (e.g. MAME HLSL)

Tomorrow, we will use retina refresh rates (1000Hz+ sample-and-hold displays) to accurately temporally emulate various kinds of impulse-driven displays.

This may take years (e.g. 2030+) but a great article is Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays.

P.S. Can you link to that libretro thread? Linking to other forums is permitted in useful contexts like these.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors
Chief Blur Buster
Site Admin
 
Posts: 6265
Joined: 05 Dec 2013, 15:44

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 13:25

Blural wrote:Higher refresh rate displays are largely irrelevant to retro gaming as the game's framerate needs to match the display refresh rate to take advantage of the motion clarity benefit of the higher refresh rate.

Actually..... read above.

For the future --

Extra Hz can produce much more flexibility for BFI.
- Different BFI ratios such as ON:OFF:OFF:OFF for 60fps@240Hz software BFI (4 refresh cycles per emulator Hz)
- Phosphor fade emulation via 100%:25%:10%:2%:0%:0%:0%:0% for 60fps@480Hz (8 refresh cycles per emulator Hz)
- Blended fuzzy-edge rolling scan slices. Rolling scan emulation via software.
i.e. For 60fps at 480Hz, you can do refresh cycle #1 doing top 1/8th of emulator frame (rest of screen black), refresh cycle #2 has second 1/8th slice (rest of screen black), and so on.
- Etc, etc, etc, etc, etc, etc, etc, etc, etc, etc, etc
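The phosphor-fade cadence idea above can be sketched as a small generator. This is a rough sketch; the function name and decay constant are hypothetical tuning values, not a measured phosphor curve:

```python
def phosphor_fade_cadence(cycles, decay=0.22):
    """Generate per-refresh brightness factors emulating CRT phosphor fade.

    cycles -- refresh cycles per emulated 60Hz frame (e.g. 8 for 480Hz)
    decay  -- fraction of brightness remaining each refresh cycle
              (a hypothetical tuning constant, not a measured phosphor curve)
    """
    weights = []
    w = 1.0
    for _ in range(cycles):
        weights.append(round(w, 3))
        w *= decay
        if w < 0.01:     # treat anything under 1% as fully dark
            w = 0.0
    return weights

print(phosphor_fade_cadence(8))  # [1.0, 0.22, 0.048, 0.011, 0.0, 0.0, 0.0, 0.0]
```

With `cycles=8` this lands close to the 100%:25%:10%:2%:0%... example for 60fps@480Hz.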

Once we reach roughly 10 to 50 refresh cycles per emulator Hz, we've effectively reached retina refresh rates (with the ability to even preserve the parallelogram-skew effect of rolling your eyes around in front of a CRT display). Every temporal feel of a CRT is preservable with sheer brute Hz; retina refresh rates can very accurately (to human-vision margins) preserve a shocking number of temporal effects.

Sheer refresh rate provides some magical flexibility for emulating various kinds of impulse-driven displays. Ideally you want the retina refresh rate to be divisible by 60, but that sheer number of extra refresh cycles per emulator refresh can provide some amazing CRT-emulation magic in a temporal-preserving way.

Early experiments show amazing promise to my own eyes. Retina refresh rates will be amazing!

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 13:39

Oh -- and yes, indeed some people may prefer lagless interpolation or other Frame Rate Amplification Technology. Basically blurless without strobing. However, that's a user preference.

For faithful temporal preservation of CRT behaviour, retina refresh rates are actually superior to strobing technology, especially if you can combine retina HDR + retina resolution + retina refresh rates (plus a correct emulator software implementation, spatially AND temporally)...

Picture a "wow, I didn't know that display was not a CRT" reaction to a fast-motion platformer, with a cardboard cutout placed in front of the monitor for an A/B blind test between that 1000Hz+ digital display and a flat tube.

For now, strobing is easier, and we'll have to live with strobing initially. While the first retina refresh rate desktop gaming monitors may possibly arrive before the end of the next decade, it may take longer than that before we're able to do retina resolution AND retina refresh rates simultaneously.

Re: Will future strobing tech be suitable for retro gaming?

Postby Patrick » 11 May 2019, 17:29

Chief Blur Buster wrote:...

Today, we are using retina resolution to accurately spatially emulate the textures of a CRT surface. (e.g. MAME HLSL)

Tomorrow, we will use retina refresh rate (>1000Hz+ sample-hold display) to accurately temporally emulate various kinds of impulse-driven displays.

This may take years (e.g. 2030+) but a great article is Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays.

P.S. Can you link to that libretro thread? Linking to other forums is permitted in useful contexts like these.


Awesome reply! It appears that we already have "good enough" solutions for reducing motion blur, and that better solutions are coming in the future.

Honestly, I'm quite satisfied with the motion clarity provided by simple black frame insertion with the display's refresh rate set to 2x the game's framerate.

The thing that kills it for me is the massive reduction in brightness you get when you add scanlines on top of black frame insertion. BFI is a 50% reduction in brightness, and scanlines (blacking out every other line) are an additional 50% reduction, leaving you with a massive 75% reduction. If you want to do any kind of accurate RGB mask emulation on top of this, that's close to another 50% reduction in brightness, and now you're looking at close to a 90% total reduction in brightness to do CRT emulation (scanlines, mask, BFI). If you want the resulting image to be as bright as a CRT, it looks like you need over 1000 Nits max, or at least 500 Nits with BFI.
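These stacked reductions multiply rather than add, which a tiny sketch makes clear (the helper name is just for illustration):

```python
def remaining_brightness(*reductions):
    """Cumulative light output after stacking independent brightness cuts.

    Each argument is the fraction of light *lost* at that stage (0.0-1.0).
    """
    out = 1.0
    for r in reductions:
        out *= 1.0 - r
    return out

# BFI (50%) + scanlines (50%) + rough RGB-mask estimate (50%)
print(remaining_brightness(0.5, 0.5, 0.5))  # 0.125, i.e. an 87.5% total reduction
```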

On the ASUS VG248QE, I can only get an acceptably bright image with BFI+scanlines by maxing out the backlight, contrast, and color channels through the monitor's OSD, then increasing saturation and luminance by 10% through software. Obviously this results in a huge amount of clipping, but it's not a problem in 99% of retro games, which rarely used the full range. I found that I can get a slightly brighter image (around 10%) by disabling LightBoost altogether and just relying on BFI. I know this is an older display, and not representative of what current displays can do.

At this point, I'm more worried about the next 5-10 years than I am about 10+ years from now. As of right now, are there any displays that can get bright enough to do scanlines+BFI and/or strobing (75% reduction in brightness) while being as bright as a CRT and maintaining a decent contrast ratio? I think you'd need at least 500 Nits, and if you want to also emulate the RGB mask, the threshold is probably around 1000 Nits. Looking around, it looks like most gaming displays available these days are maxing out at around 400 Nits, which isn't a huge improvement from 5 years ago...

@Blural
@Chief Blur Buster

Here's the link to the libretro post. It's a huge and meandering thread that jumps from topic to topic.

https://forums.libretro.com/t/please-sh ... /19193/565

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 19:36

Patrick wrote:The thing that kills it for me is the massive reduction in brightness you get when you add scanlines on top of black frame insertion. BFI is a 50% reduction in brightness, and scanlines (blacking out every other line) are an additional 50% reduction, leaving you with a massive 75% reduction.

Newer strobed monitors can reach >300 nits strobed, which is about 3 times brighter than ASUS VG248QE LightBoost. Such panels use voltage-boosted strobing. I've seen some strobed LCDs reach closer to 500 nits, though most are 300 nits. TFTCentral publishes strobed brightness measurements, if you want to look it up.

For homebrew hardware modders: theoretically, one can open up an LCD and add some extra voltage boost to the strobing, at the potential cost of wearing out the LEDs faster. It would be clever if someone came up with a water-cooled edgelight hack for an existing LCD panel to push 5000 nits into an LCD for 500-nit strobing at 90% persistence reduction, though it would be a cumbersome modification. LEDs can handle being pulsed approximately 3x-4x brighter with a slight overvoltage without meaningful loss of LED lifetime, thanks to the cooling the LEDs get between flashes. That's why some manufacturers have put voltage-boosted strobing in the 25-inch 240Hz LCDs, which generally reach approximately 300 nits strobed.

Another trick is the BenQ XL2546 with the 180Hz custom mode, using ON:OFF:OFF software BFI to convert 180Hz strobing to 60Hz strobing. Since the 6-bit FRC temporal dithering algorithm alternates every other refresh cycle, converting 180Hz->60Hz strobing yields better color quality on TN panels than converting 120Hz->60Hz strobing: it regains what looks essentially like full 8-bit color depth, and eliminates the fine-checkerboard inversion artifacts ( http://www.testufo.com/inversion ) that are amplified during software BFI.

Now, software-based BFI is often a workaround to convert 120Hz hardware strobing to 60Hz hardware strobing, by using the software BFI to black-out every other strobe. It's annoying that many monitors don't strobe at anything less than 120Hz, because of arbitrary manufacturer fear of unlocking low-Hz strobing due to epileptic flicker.

Also, on some monitors, software BFI at an even-numbered cadence (60fps@120Hz BFI) can cause temporary image-retention effects, particularly on certain models of TN monitors, because of how software BFI bypasses the voltage-balancing inversion (positive/negative voltages in alternating refresh cycles, as in http://www.testufo.com/inversion ). We've published workarounds for this, and the TestUFO tests that flicker have automatic compensation for it. 60fps->180Hz BFI uses an odd-numbered cadence, which avoids this image-retention behaviour of software-based BFI on certain models of monitors.

Alternatively, the BenQ XL2411P is one of the few monitors that natively single-strobes at 60Hz and does not require software BFI as a workaround to convert 120Hz hardware strobing to 60Hz hardware strobing.

Patrick wrote:At this point, I'm more worried about the next 5-10 years than I am about 10+ years from now. As of right now, are there any displays that can get bright enough to do scanlines+BFI and/or strobing (75% reduction in brightness) while being as bright as a CRT and maintaining a decent contrast ratio? I think you'd need at least 500 Nits, and if you want to also emulate the RGB mask, the threshold is probably around 1000 Nits. Looking around, it looks like most gaming displays available these days are maxing out at around 400 Nits, which isn't a huge improvement from 5 years ago...

Keep in mind that some of these monitors use voltage-boosted strobing, where a 400-nit non-strobed monitor can still reach 300 nits strobed because of the overvoltage boosting that is automatically enabled during strobing.

Use TFTCentral and RTings reviews to determine how bright the strobing is at 120Hz, then divide that number by half for 60Hz.

So 300nit strobed 120Hz = 150nit strobed 60Hz when using software BFI to block every other hardware strobe (using software BFI to convert double-strobe to single-strobe).

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 19:46

Now....

Emulation's Best Kept Secret:
Using 240Hz To Improve 60Hz Emulation


The VG248QE is a famous reference for software BFI.

But there's a new king. Blur Busters sometimes innovates in little-known emulator research, such as beam-raced synchronization of the real raster with the emulator raster.

Now we have a new, easier-to-implement tip. One little best-kept secret for emulation is a 240Hz esports monitor such as the XL2546. Very few emulator users realize what a gem 240Hz can be for emulator use.

Some 240Hz monitors have voltage-boosted strobing that is much brighter than on many 144Hz monitors, so the brightness loss is smaller (even at lower Hz).

One big problem of software BFI on a 6-bit TN panel is that it can amplify the visibility of inversion artifacts:

[Image: fine-checkerboard inversion-artifact pattern]

So, there's a way to eliminate this with software BFI if you use a 240Hz monitor:

- 240Hz reduces input lag. 240Hz has a natural input-lag-lowering effect (as long as you stay at 240Hz to do 60Hz emulation; don't use a lower refresh rate).
- 240Hz can enable a software BFI cadence without inversion artifacts.
- 240Hz can enable a software BFI cadence with full 8-bit color quality, avoiding the bitdepth-reduction side effect that often happens on 6-bit FRC TN panels.

A 240Hz monitor hides a lot of little-known goodies for emulator developers and emulator fans, such as (A) brighter strobing and (B) better color quality without the checkerboard-pixel-texture problem (assuming you use the 60fps@180Hz ON:OFF:OFF software BFI cadence with strobe mode, or the 60fps@240Hz ON:ON:OFF:OFF software BFI cadence with non-strobed mode).

Not all emulators support the custom BFI cadences necessary for improved-quality blur reduction. When done properly, 240Hz-monitor blur-reduction emulator quality can look much better than VG248QE LightBoost + 60Hz software BFI.

Recommendations For 240Hz BFI For Emulator Use

For BFI on 240Hz emulator use (full 8-bit quality, no fine-checkerboard inversion artifact), Chief Blur Buster recommends:

  1. NTSC: Native 180Hz refresh rate hardware-strobed at 180Hz, combined with software BFI using ON:OFF:OFF cadence for 60Hz emulators.
    Recommended monitor: BenQ Zowie XL2546, because it supports 180Hz *and* has voltage-boosted strobe brightness that the XL2540 does not have.
    .
  2. NTSC: Native 240Hz refresh non-strobed, combined with software BFI using ON:ON:OFF:OFF cadence for 60Hz emulator.
    Recommended monitor: Any bright 240Hz monitor of any brand.
    .
  3. PAL: Native 150Hz refresh rate hardware-strobed at 150Hz, combined with software BFI using ON:OFF:OFF cadence for 50Hz emulators.
    Recommended monitor: BenQ Zowie XL2546, because it supports 150Hz *and* has voltage-boosted strobe brightness that the XL2540 does not have.
    .
  4. PAL: Native 200Hz refresh non-strobed, combined with software BFI using ON:ON:OFF:OFF cadence for 50Hz emulator.
    Recommended monitor: BenQ Zowie XL2540, XL2546, XL2740 -- because those support 200Hz custom refresh rate
For custom 150Hz or 180Hz or 200Hz refresh rates, create the Custom Resolution manually first (e.g. NVIDIA Control Panel, or ToastyX CRU, etc).
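The four recommendations above boil down to a small lookup table. This sketch is purely illustrative (the function name and the fallback rule are not from any shipping emulator):

```python
# Cadences from the recommendations above, keyed by (content Hz, panel Hz).
BFI_CADENCES = {
    (60, 180): (1, 0, 0),     # NTSC, hardware-strobed 180Hz
    (60, 240): (1, 1, 0, 0),  # NTSC, non-strobed 240Hz
    (50, 150): (1, 0, 0),     # PAL, hardware-strobed 150Hz
    (50, 200): (1, 1, 0, 0),  # PAL, non-strobed 200Hz
}

def bfi_cadence(content_hz, panel_hz):
    """Return an ON/OFF tuple (one entry per refresh cycle) for the given
    content rate and panel refresh rate."""
    if panel_hz % content_hz:
        raise ValueError("panel refresh must be an integer multiple of content rate")
    cadence = BFI_CADENCES.get((content_hz, panel_hz))
    if cadence is None:
        # Fallback (illustrative): ON for the first half of the cycles.
        cycles = panel_hz // content_hz
        on = max(1, cycles // 2)
        cadence = (1,) * on + (0,) * (cycles - on)
    return cadence

print(bfi_cadence(60, 180))  # (1, 0, 0)
```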

Recommendations For Emulator Developers Adding BFI Support For 150Hz, 180Hz, 200Hz and 240Hz

I strongly recommend that emulator developers add support for custom BFI cadences, since some cadences can eliminate software BFI problems (full 8-bit color quality, no inversion artifact, no image retention).

-- Like "1,0,0" and "1,1,0" (for 3-refresh-per-refresh, e.g. 50fps@150Hz or 60fps@180Hz)
-- Like "1,1,0,0" and "1,0,0,0" and "1,1,1,0" (for 4-refresh-per-refresh, e.g. 50fps@200Hz or 60fps@240Hz).
-- Or percentages to do phosphor-fade-emulated BFI sequences, such as "100%,25%,5%,0%" for 60fps-on-240Hz.

These could be accepted as a comma-separated command-line BFI argument. Don't forget to apply gamma correction (2.4) to the BFI fades so that the displayed fades correspond to the intended luminance. For ease of use, this could also be exposed via a user-friendly slider in a Settings screen.

Emulator users would thus be able to choose BFI cadences that do the following:

(A) Avoid inversion artifacts and improve color by choosing a cadence that pairs up 6-bit TN refresh cycles (even/odd); or
(B) Adjust brightness-vs-blur-reduction tradeoff, via adjusting the BFI ON:OFF ratio; or
(C) Simulate phosphor fade at a coarse granularity, via consecutively-faded refresh cycles
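As an illustration of the suggested argument format, here is a sketch of a parser that applies the gamma-2.4 correction mentioned above. `parse_bfi_cadence` is a hypothetical helper name, not an existing emulator API:

```python
def parse_bfi_cadence(spec, gamma=2.4):
    """Parse a comma-separated BFI cadence like "100%,25%,5%,0%" or "1,0,0".

    Returns per-refresh pixel multipliers, gamma-corrected so each entry
    produces the intended fraction of *linear* light on a gamma-2.4 display.
    """
    factors = []
    for token in spec.split(","):
        token = token.strip()
        if token.endswith("%"):
            linear = float(token[:-1]) / 100.0
        else:
            linear = float(token)
        # A pixel value v emits roughly v**gamma in linear light, so invert that.
        factors.append(linear ** (1.0 / gamma) if linear > 0.0 else 0.0)
    return factors

print(parse_bfi_cadence("1,0,0"))  # [1.0, 0.0, 0.0]
```

Note that a 25% linear fade needs a pixel multiplier well above 0.25 (about 0.56) once gamma is accounted for, which is exactly why skipping the correction makes fades look too dark.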

Re: Will future strobing tech be suitable for retro gaming?

Postby Chief Blur Buster » 11 May 2019, 20:39

Long Term Future: Retina Refresh Rate makes Rolling Scan Emulation Easier

(Don't worry about this complexity just yet; just be prepared to program this in the future.)

We've got retina resolution spatially, which makes HLSL-style filters look good. In the future, we will also need to emulate a CRT temporally. This becomes possible to human-eye error margins with a retina refresh rate. Retina refresh rates allow you to emulate temporal behaviour very accurately, such as using successive fine-granularity refresh cycles (of the retina refresh rate) to emulate CRT phosphor fade. You just need sufficient temporal resolution to pull it off.

A sheer number of refresh cycles per emulator refresh can help emulate the temporal behaviour of a CRT more accurately
Long term, retina refresh rates (1000Hz+) can be used to improve emulation by doing software-based rolling scans (even with blended edges) to accurately emulate CRT scanning at the millisecond level. Read more at Blur Busters Law: The Amazing Journey To Future 1000Hz Displays as 1000Hz gaming monitors are expected to arrive by ~2030. Software-based rolling scan actually works in experiments on prototype displays!

This can already be (coarsely) done at 240Hz
In the shorter term, it's possible to begin practicing software-based rolling scan on a 240Hz monitor. Software-based rolling scan emulation can already be very coarsely practiced at 240Hz using quarter-screen-height blended-edge slices.

To design a coarse rolling-scan algorithm for 60fps@240Hz, you display the top quarter of the emulator frame in refresh cycle #1 (rest of screen black), the 2nd quarter in refresh cycle #2, the 3rd quarter in refresh cycle #3, and the 4th quarter in refresh cycle #4. But you cannot use sharp edges at this granularity; they show up as a tearing artifact.

To mostly fix that tearing artifact, you need alpha-blended rolling-scan edges, so that the blended edges of adjacent refresh cycles overlap each other: blend a blurred edge (gradient to black) of 1/8th screen height at the top/bottom edges of each 1/4th-screen-height slice for 60fps@240Hz.

CRITICAL TIP: gamma correction in the blended edges is critical, to make sure the average luminance of every pixel stays properly balanced across the refresh cycles that overlap the blended edges.
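A minimal sketch of these blended-edge weights, assuming a simple linear ramp for the blend (NumPy for brevity; the normalization step is what keeps each scanline's total linear light balanced, per the tip above):

```python
import numpy as np

def rolling_scan_weights(height=240, cycles=4, feather=0.125, gamma=2.4):
    """Per-scanline pixel multipliers for each refresh cycle of a software
    rolling scan (e.g. 60fps at 240Hz -> 4 cycles per emulator frame).

    feather is the blend height as a fraction of screen height (1/8th here).
    Weights are computed in linear light, normalized so every scanline's
    total light adds up to a full frame, then gamma-encoded for display.
    """
    y = (np.arange(height) + 0.5) / height        # scanline centers in [0, 1]
    slice_h = 1.0 / cycles
    linear = []
    for k in range(cycles):
        center = (k + 0.5) * slice_h
        # Linear ramp: 1 inside the slice, fading to 0 over `feather`.
        dist = np.abs(y - center) - slice_h / 2.0
        linear.append(np.clip(1.0 - np.maximum(dist, 0.0) / feather, 0.0, 1.0))
    total = sum(linear)                           # linear light per scanline
    return [np.power(w / total, 1.0 / gamma) for w in linear]
```

Decoding each weight back through gamma 2.4 and summing across the four cycles gives 100% of the frame's light on every scanline, which is the property the tip is about.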

That said, 240Hz is not yet a retina refresh rate
However, this is not yet truly retina-quality temporal emulation of CRT scan. You need at least 8 refresh cycles per emulator refresh or more, so we will have to wait until approximately 480Hz before software-based rolling scan begins to look good for 60Hz emulation. Retina refresh rates (1000Hz+) will look even better (see earlier post) for temporally emulating CRT scanning behaviour via a software-emulated rolling-scan window.

And if you haven't seen it yet, I strongly recommend reading Blur Busters Law: The Amazing Journey To Future 1000Hz Displays to understand how 240Hz falls woefully short of a retina refresh rate. View it on a desktop monitor so you can properly see the animations; it will help you much better understand the temporal behaviour of displays.

Re: Will future strobing tech be suitable for retro gaming?

Postby Patrick » 12 May 2019, 12:34

Chief Blur Buster wrote:a lot of awesome info


Thanks for the info! I'm learning a lot.

Am I correct that hardware-based strobing will always add some amount of input lag compared to BFI?

This creates a bit of a conundrum. With hardware-based strobing you get a much brighter image with better colors (using one of the newer strobing technologies), but you get additional input lag. With software-based BFI, you get no additional input lag, but you also get much worse colors and brightness. Many retro games (platformers, fighting games, shooters) are particularly sensitive to input lag, so next-frame response (less than 16ms input lag) is a requirement for many retro gamers.

The solution (for emulators running 60fps content) seems to be to crank out as much brightness as possible on the brightest possible display while using BFI @ 120 Hz.

If we go higher than 120 Hz and use a custom BFI cadence, then we need even more brightness to compensate. For example, BFI @ 240 Hz with ON:OFF:OFF:OFF to get 60fps results in a 75% reduction in brightness, compared to a 50% reduction for BFI @ 120 Hz.

Hopefully, 120 Hz will continue to be a thing for a long time, because as higher refresh rates are pushed by manufacturers, we'll need ever greater brightness levels to compensate when doing software BFI to get to 60fps.

It looks to me like the problem of motion blur has essentially been solved by software BFI and high refresh rates, but max brightness is still a significant hurdle when it comes to doing proper CRT emulation (scanlines + phosphor mask + BFI). Doing this results in an almost 90% reduction in brightness, so you need 1000-2000 nits to do CRT emulation while matching the brightness of a CRT. That's doing BFI @ 120Hz; if you want to do BFI @ 240Hz for the additional benefits (less input lag, no inversion artifacts, full 8-bit color), then you need more like 2000-3000 nits! :D

One of the new 4k HDR local-dimmed displays would be bright enough to do the sort of CRT emulation I'm talking about, but at $2000 for a desktop monitor... :| Also, is that brightness available on demand, or does an application have to support HDR to utilize that extra brightness? Getting HDR support in emulators might be yet another hurdle...

I wish I had the knowledge to attempt some kind of LCD hack like the one you described. Learning about all this has got me interested in learning even more about LCD hardware and electronics in general. Any suggested resources for a total n00b? I think I may ultimately have to rig up a custom LCD, or pay someone to do it.
