Hi all,
The RetroTINK-4K is able to increase the brightness of the image when BFI is on, which sounds like a good thing, but is it?
Surely the flicker comes from going from a colourful, bright screen to a black screen and then back again very quickly. Logically, surely the brighter the actual image, the worse the flicker will be, because you're going from one extreme to the other. Shouldn't we want the image to be as dim as possible to get the least amount of difference between it and the black screen?
Will the increase in brightness actually increase flicker?
Or am I dead wrong and it actually helps...?
BFI + HDR injection - is it desirable...? [Answer: Yes + your choice + adjustable]
- Posts: 8
- Joined: 25 Aug 2020, 05:34
- Chief Blur Buster
- Site Admin
- Posts: 11775
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
Re: BFI + HDR injection - is it desirable...? [Answer: Yes + your choice + adjustable]
You're forgetting basic science & physics.
Nintendude wrote: ↑06 Dec 2023, 11:20
Hi all,
The RetroTINK-4K is able to increase the brightness of the image when BFI is on, which sounds like a good thing, but is it?
Surely the flicker comes from going from a colourful, bright screen to a black screen and then back again very quickly. Logically, surely the brighter the actual image, the worse the flicker will be, because you're going from one extreme to the other. Shouldn't we want the image to be as dim as possible to get the least amount of difference between it and the black screen?
Will the increase in brightness actually increase flicker?
Or am I dead wrong and it actually helps...?
1. Average nits.
2. CRT electron beam dot goes over 10,000 nits briefly, did you know?
3. It's adjustable. You can brighten/dim as much as you like!
240Hz BFI reduces the brightness of 60Hz material by 75%. Your 200-nit SDR content drops to only 50 nits. You actually want the HDR boost.
120Hz BFI = 60/120th original blur ~= 50% blur reduction ~= 50% dimming
240Hz BFI = 60/240th original blur ~= 75% blur reduction ~= 75% dimming
360Hz BFI = 60/360th original blur ~= 83% blur reduction ~= 83% dimming
480Hz BFI = 60/480th original blur ~= 87% blur reduction ~= 87% dimming
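To make the percentages above concrete, here's a quick Python sketch of the duty-cycle arithmetic (my own illustration, not anything from the RetroTINK firmware):

```python
# Persistence arithmetic for 60 fps content on a higher-Hz display using BFI:
# one visible refresh followed by black refreshes. Illustration only.
def bfi_stats(display_hz, content_fps=60):
    refreshes_per_frame = display_hz // content_fps  # e.g. 4 at 240 Hz
    duty_cycle = 1 / refreshes_per_frame             # fraction of time the image is lit
    blur_reduction = 1 - duty_cycle                  # vs. full sample-and-hold persistence
    dimming = 1 - duty_cycle                         # average nits fall with the duty cycle
    return duty_cycle, blur_reduction, dimming

for hz in (120, 240, 360, 480):
    duty, blur, dim = bfi_stats(hz)
    print(f"{hz} Hz BFI: {blur:.0%} blur reduction, {dim:.0%} dimming")
```

Blur reduction and dimming come out identical because both are set by the same duty cycle -- which is exactly why the HDR boost exists.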
True CRT motion clarity requires >1000Hz BFI. But even 240Hz still looks good enough for Sonic the Hedgehog; the lower resolution limits the visibility of motion resolution loss. 480 Hz OLED (2025) will be fantastic, 1000Hz even better (2027-2030).
If you want to reduce more blur, the image gets very dark, and therefore you want the HDR boost to re-brighten the now-faint image.
On a 240Hz display, view TestUFO Variable Persistence BFI for 240Hz displays, Brightness Equalizer Disabled (it produces 60fps if you view at 240Hz). You can see that more motion blur reduction = much dimmer. Also, I am in over 30 peer-reviewed papers; see www.blurbusters.com/area51 -- there's lots of reading there for you.
Do you now understand that pulse time = blur? Briefer pulse = less blur, but also dimmer.
I do teach classes on this.
Now go do your homework.
/micdrop
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
Forum Rules wrote: 1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
2. Please report rule violations If you see a post that violates forum rules, then report the post.
3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!
- Posts: 8
- Joined: 25 Aug 2020, 05:34
Re: BFI + HDR injection - is it desirable...? [Answer: Yes + your choice + adjustable]
Hey,
Let me pick up that microphone for you...
Your response makes me think that you felt I was attacking you in some way - I can assure you I wasn't. I know that you're an expert in this field and that your work with Mike Chi on the RT4K is highly valued.
I'm no scientist and, whilst a lot of what is said here does go over my head, I do understand the BFI relationship between blur reduction and dimming. I have an LG CX and use BFI High for 60FPS games, so I understand the hit in brightness, believe me!
That's why I was so excited about the prospect of what the RT4K can do; this HDR feature would be a game changer for BFI.
All I wanted to know is if the flicker would be worse due to the larger difference between the bright screen and the black screen. If your response answers this, then I apologise for not understanding, but the tone felt a little patronising and defensive for what I hoped was an honest question.
Anyway, I still respect the great work you do, it really does make a difference.
- Chief Blur Buster
- Site Admin
- Posts: 11775
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
Re: BFI + HDR injection - is it desirable...? [Answer: Yes + your choice + adjustable]
You're welcome!
Talbot-Plateau Law can be a [BLEEP] sometimes.
Now focusing ONLY on generic perceptual physics, you have a good point:
- It's true brighter flickers can be more visible (but only if you're already bright in the first place)
- It's true global flickers can be more visible than rolling flickers.
The RetroTINK-4K does not do rolling BFI, but your LCD/OLED rolls those refresh cycles out top-to-bottom. So you've got a hybrid semi-global/rolling BFI. You can't strobe in a sub-refresh fragment, but you do have a sweep-on then a sweep-off, as the visible-vs-black frame gets "painted" onto the screen. So it's a lot less harsh than a global backlight strobe such as BenQ's 1000-nit overvoltage-booster DyAc+ in the XL2566K. Yes, gaming monitors like the XL2566K strobe at ~1000 nits to average ~300 nits (they temporarily overvolt the edgelight LEDs to about ~1000 nits to average 300 nits). Some people don't mind, but some people are bothered.
But in our situation, we're fighting against 50-nit BFI. Dimming an SDR-mode average-200-nit OLED down to 50 nits via 75%:25% (3 black frames, 1 visible) 240Hz BFI really requires that HDR nits booster to undo the dimming. Yesterday's CRTs often averaged only 100-200 nits during bright full-screen material anyway, due to how they slightly dimmed when everything was bright.
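Per Talbot-Plateau (perceived brightness of a flicker equals its time-averaged luminance), here's a quick sketch of the pulse brightness needed to undo BFI dimming -- my own illustration, not a RetroTINK feature description:

```python
# Talbot-Plateau: perceived brightness ~= time-averaged luminance.
# To hold the same average through BFI, the lit pulse must be brighter
# by the inverse of the duty cycle. Illustration only.
def pulse_nits_needed(target_avg_nits, duty_cycle):
    return target_avg_nits / duty_cycle

# 200-nit SDR average through 240 Hz BFI (1 lit refresh in 4 = 25% duty):
print(pulse_nits_needed(200, 0.25))  # -> 800.0 nit pulses
```

Those 800-nit pulses are why the HDR injection trick matters: only HDR signalling lets the panel drive the lit frames that far above SDR levels.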