
A question for the Chief about the CRT Beam Sim

Posted: 17 Sep 2025, 18:39
by blurfreeCRTGimp
As you know, I have been enjoying using the ShaderGlass alpha to do a phosphor-fade BFI on my BenQ XL2720 LCD overclocked to 180 Hz. I was using it today and thinking about something John Carmack said during the first Oculus Connect conference years ago.

He mentioned using sliced timewarp for ultra-low-persistence displays with a kilohertz update rate, but more pertinent here, he also mentioned using variable persistence on individual scanlines to get simulated HDR out of a standard SDR panel: a scanline containing something like the sun is shown at the full persistence of the panel, while the rest of the scene is tone mapped down.
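Something like the sketch below is what I imagine he meant. I'm guessing at everything here (the per-line luminances, the panel numbers, the clamping), so treat it as a non-engineer's illustration of the idea, not anything from the actual talk:

Code: Select all

/* Rough illustration of per-scanline variable persistence for simulated HDR.
 * All numbers are made up; "duty" is the fraction of the refresh a line
 * stays lit. Perceived brightness ~ panel peak * duty (Talbot-Plateau). */
#include <stdio.h>

#define HEIGHT 4  /* tiny example "framebuffer" of scanline peak luminances */

int main(void) {
    /* Desired scene luminance per scanline, in nits. Line 2 is the "sun". */
    double line_nits[HEIGHT] = { 80.0, 120.0, 5000.0, 60.0 };

    const double sdr_peak  = 250.0;  /* assumed panel peak at full persistence */
    const double base_duty = 0.25;   /* normal low-persistence duty cycle */

    for (int y = 0; y < HEIGHT; y++) {
        /* Bright lines get longer persistence (up to 1.0 = full refresh);
         * everything else is tone mapped into the short-persistence range. */
        double duty = line_nits[y] / sdr_peak;
        if (duty > 1.0)       duty = 1.0;
        if (duty < base_duty) duty = base_duty;
        double shown_nits = sdr_peak * duty;  /* time-averaged brightness */
        printf("line %d: %.0f nits wanted -> duty %.2f -> %.0f nits shown\n",
               y, line_nits[y], duty, shown_nits);
    }
    return 0;
}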

Our visual system adapts to the screen's brightness level (like watching TV in the dark during a dark scene: when a bright light suddenly appears, it's near blinding to your eyes).

https://www.youtube.com/watch?v=gn8m5d74fk8 (start watching at 10:40)

I realized you could implement Carmack's idea in the CRT beam simulator shader when I noticed what happens when the shader loses sync. (You all know how it flickers horribly?) Between the flickers, the screen appears brighter. It's not that it's actually any brighter; it's that the persistence is longer while the shader is inactive, so it seems brighter.

Long-winded way of asking the Chief whether you could leverage that as part of the shader. In the area where you want the "beam", where it's supposed to be hyper bright before the fade, could you run that at a longer persistence by having the shader off for a brief moment in a controlled way?



I'm sure I'm not asking or describing this right, as I am not an engineer, but it seemed like a way to leverage this shader better on a low-nit panel (which is most of them).

Re: A question for the Chief about the CRT Beam Sim

Posted: 18 Sep 2025, 23:15
by Chief Blur Buster
blurfreeCRTGimp wrote:
17 Sep 2025, 18:39
I realized you could implement Carmack's idea in the CRT beam simulator shader when I noticed what happens when the shader loses sync. (You all know how it flickers horribly?) Between the flickers, the screen appears brighter. It's not that it's actually any brighter; it's that the persistence is longer while the shader is inactive, so it seems brighter.
Unfortunately, that's not a free lunch.

Motion blur flickers in lockstep with the brightness, so you get a sudden increase/decrease in motion blur synchronized to the flickering. During those flickers it's still following the Talbot-Plateau law and still following Blur Busters Law, which means you may briefly have double or triple the motion blur for an instant during those bright flickers.

It's still the laws of physics in a "brightness versus blur" continuum. You just don't notice the flicker in motion blur (the sudden increase in blur) during a brief brightness flicker.
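To make the tradeoff concrete, here's a minimal sketch of both laws with assumed numbers (the 270-nit peak, the duty cycles, and the motion speed are mine, not the shader's). Talbot-Plateau says perceived brightness tracks time-averaged luminance; Blur Busters Law says roughly 1 ms of persistence is 1 pixel of motion blur per 1000 pixels/second of motion:

Code: Select all

/* Brightness-versus-blur tradeoff, illustrated with assumed numbers.
 * Talbot-Plateau: perceived brightness of a flickered image ~ time-averaged
 * luminance = peak nits * duty cycle.
 * Blur Busters Law: ~1 ms of persistence = 1 pixel of motion blur per
 * 1000 pixels/second of on-screen motion. */
#include <stdio.h>

int main(void) {
    const double peak_nits = 270.0;            /* hypothetical panel peak */
    const double frame_ms  = 1000.0 / 180.0;   /* 180 Hz refresh */
    const double speed_pxs = 1000.0;           /* motion speed, pixels/second */

    /* Compare two persistence settings: normal vs. "shader briefly off". */
    const double duty[2] = { 0.25, 1.0 };      /* fraction of frame lit */

    for (int i = 0; i < 2; i++) {
        double persistence_ms = frame_ms * duty[i];
        double perceived_nits = peak_nits * duty[i];           /* Talbot-Plateau */
        double blur_px = persistence_ms * speed_pxs / 1000.0;  /* Blur Busters Law */
        printf("duty %.0f%%: persistence %.2f ms -> %.0f nits, %.2f px blur\n",
               duty[i] * 100.0, persistence_ms, perceived_nits, blur_px);
    }
    /* Output shows brightness and blur scale together: 4x the duty cycle
     * buys 4x the nits but also 4x the motion blur. No free lunch. */
    return 0;
}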

So I cannot piggyback off that.

Just adjust the "Brightness versus Blur" setting. Identical, very same, exact thing.

HDR brightness boosting is easier, but the curve is tough (e.g. tone mapping, pq2linear(), linear2pq(), etc., since I cannot use the standard gamma curve formula). It's very hard to predict what an HDR display is doing sometimes.
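For reference, the pq2linear()/linear2pq() pair follows the published SMPTE ST 2084 (PQ) formula rather than a simple power-law gamma. Below is a straight transcription of the standard's constants and equations (a sketch for illustration, not the shader's actual code):

Code: Select all

/* SMPTE ST 2084 "PQ" transfer functions. Unlike gamma, PQ encodes absolute
 * luminance, which is part of why a plain gamma formula cannot be reused. */
#include <math.h>
#include <stdio.h>

static const double M1 = 2610.0 / 16384.0;         /* 0.1593017578125 */
static const double M2 = 2523.0 / 4096.0 * 128.0;  /* 78.84375 */
static const double C1 = 3424.0 / 4096.0;          /* 0.8359375 */
static const double C2 = 2413.0 / 4096.0 * 32.0;   /* 18.8515625 */
static const double C3 = 2392.0 / 4096.0 * 32.0;   /* 18.6875 */

/* PQ-encoded signal [0,1] -> absolute luminance in nits (cd/m^2) */
double pq2linear(double pq) {
    double p = pow(pq, 1.0 / M2);
    double num = fmax(p - C1, 0.0);
    double den = C2 - C3 * p;
    return 10000.0 * pow(num / den, 1.0 / M1);
}

/* Absolute luminance in nits -> PQ-encoded signal [0,1] */
double linear2pq(double nits) {
    double y = pow(nits / 10000.0, M1);
    return pow((C1 + C2 * y) / (1.0 + C3 * y), M2);
}

int main(void) {
    /* PQ is absolute: ~0.58 encodes ~203 nits on every compliant display,
     * regardless of that display's peak brightness. */
    printf("203 nits -> PQ %.4f\n", linear2pq(203.0));
    printf("PQ 0.75  -> %.1f nits\n", pq2linear(0.75));
    return 0;
}

The unpredictability is the display side: what the panel actually emits above its own peak (clipping, roll-off, dynamic tone mapping) varies per model, which is what makes the boosting curve hard to get right.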