When you look at Rtings measurements of BFI on, say, current OLEDs, the BFI signal is almost a perfect square wave.
What would happen in terms of the perception of motion if the BFI signal had a slower rise/fall time (more of a sine wave than a square wave)?
I'd really like to understand the maths behind that and what kind of impact it has on the human eye.
How does the wave form of the BFI signal impact motion blur reduction?
Baron of Sun
- Posts: 19
- Joined: 24 Jul 2024, 13:37
Re: How does the wave form of the BFI signal impact motion blur reduction?
I'm surprised that Chief hasn't shown interest in this topic, as this is an interesting question!

Baron of Sun wrote: ↑11 Aug 2025, 12:13
When you look at Rtings measurements of BFI on, say, current OLEDs, the BFI signal is almost a perfect square wave.
What would happen in terms of the perception of motion if the BFI signal had a slower rise/fall time (more of a sine wave than a square wave)?
I'd really like to understand the maths behind that and what kind of impact it has on the human eye.
The closest thing I've come across in terms of this subject is this topic: viewtopic.php?f=7&t=10050
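As a rough sketch of the maths (my own simplification, not a standard model): when your eye smoothly tracks a moving object, the display's temporal light pulse gets smeared across the retina, so the perceived edge profile of the object is the cumulative integral of the pulse, and the blur width scales with tracking speed. Here's a quick Python comparison of a square pulse vs a raised-cosine "sine-like" pulse of the same duration and total light output (the 60 Hz / 25% duty numbers are just example values):

```python
import numpy as np

fs = 100_000          # samples per second
frame = 1 / 60        # one 60 Hz frame
t = np.arange(0, frame, 1 / fs)
duty = 0.25           # light is emitted for 25% of the frame

# Square pulse: full brightness for duty*frame, then black
square = (t < duty * frame).astype(float)

# Raised-cosine ("sine-like") pulse of the same duration
raised = np.where(t < duty * frame,
                  0.5 * (1 - np.cos(2 * np.pi * t / (duty * frame))),
                  0.0)
raised *= square.sum() / raised.sum()   # equalise total light output

def blur_width_ms(pulse):
    """10%-90% width of the cumulative light output: roughly how wide
    the smeared edge of a tracked moving object looks, expressed in
    milliseconds of motion (multiply by tracking speed for pixels)."""
    c = np.cumsum(pulse)
    c = c / c[-1]
    lo = np.searchsorted(c, 0.1)
    hi = np.searchsorted(c, 0.9)
    return (hi - lo) / fs * 1000

print(blur_width_ms(square))   # roughly 3.3 ms
print(blur_width_ms(raised))   # roughly 2.0 ms
```

The 10%-90% width of the square pulse comes out around 3.3 ms of motion, while the raised-cosine pulse of the same duration comes out around 2.0 ms, because its light is concentrated toward the middle of the pulse; the trade-off is that the blur edge becomes a soft gradient rather than a sharp step.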
evaluating xhci controller performance | audio latency discussion thread | "Why is LatencyMon not desirable to objectively measure DPC/ISR driver performance" | AM4 / AM5 system tuning considerations | latency-oriented HW considerations | “xhci hand-off” setting considerations | #1 tip for electricity-related topics | ESPORTS: Latency Perception, Temporal Ventriloquism & Horizon of Simultaneity
Baron of Sun
- Posts: 19
- Joined: 24 Jul 2024, 13:37
Re: How does the wave form of the BFI signal impact motion blur reduction?
Thanks for sharing this thread! Maybe it's more comfortable for the eye if it is not a perfect square wave? This might be the reason why people say the CRT beam simulator is less flickery: it tries to simulate the decay time of CRT phosphors.

kyube wrote: ↑19 Aug 2025, 13:08
I'm surprised that Chief hasn't shown interest in this topic, as this is an interesting question!
The closest thing I've come across in terms of this subject is this topic: viewtopic.php?f=7&t=10050
Re: How does the wave form of the BFI signal impact motion blur reduction?
That is a bit of a difficult rabbit hole to go down, as eye strain is very subjective and difficult to extrapolate to a larger population.

Baron of Sun wrote: ↑21 Aug 2025, 09:50
Thanks for sharing this thread! Maybe it's more comfortable for the eye if it is not a perfect square wave? This might be the reason why people say the CRT beam simulator is less flickery: it tries to simulate the decay time of CRT phosphors.
My general advice is to stay away from any kind of light flicker, whether it is PWM single-strobe (backlight strobing/BFI) or PWM multi-strobe (prevalent in older LCDs and AMOLED devices).
Here's a thread which touches on this: viewtopic.php?t=6519
This thread: viewtopic.php?p=112963 might be interesting (rolling scan vs global scan backlight strobing)
There's also this waveform of the green phosphor in CRTs: https://youtu.be/Ya3c1Ni4B_U?si=ZwtfvfnqPrzrv3B-&t=918
There's also this video, which compares new high-refresh-rate LCDs and OLEDs vs a CRT. There's a showcase of how the light flicker behaves on CRTs in this video too: https://www.bilibili.com/video/BV1Hfb9zCE39/
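On the flicker side, here's a small sketch (hand-rolled, not a standard metric, and the 60 Hz / 25% duty values are just for illustration): flicker visibility is usually discussed in terms of modulation at the strobe fundamental and the low harmonics, since temporal contrast sensitivity drops off steeply in the tens of hertz. This compares the harmonic amplitudes of a square pulse vs a raised-cosine pulse of the same duration:

```python
import numpy as np

fs = 100_000                      # samples per second
frame = 1 / 60                    # one 60 Hz strobe period
t = np.arange(0, frame, 1 / fs)
duty = 0.25                       # light emitted for 25% of the period

square = (t < duty * frame).astype(float)
raised = np.where(t < duty * frame,
                  0.5 * (1 - np.cos(2 * np.pi * t / (duty * frame))),
                  0.0)

def harmonics(pulse, n=8):
    """Amplitude of the first n harmonics of the 60 Hz repetition,
    relative to the mean (DC) light level."""
    spec = np.abs(np.fft.rfft(pulse)) / len(pulse)
    return 2 * spec[1:n + 1] / spec[0]

print(np.round(harmonics(square), 2))
print(np.round(harmonics(raised), 2))
```

Somewhat counterintuitively, the raised-cosine pulse shows slightly higher modulation at the fundamental than the square pulse, because its light is more concentrated in time. By this crude measure, a sine-like pulse of the same duration would not be less flickery; what dominates is the strobe frequency and duty cycle, though subjective comfort may still differ.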
