How does the wave form of the BFI signal impact motion blur reduction?

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, ToastyX, black frame insertion (BFI), and now framerate-based motion blur reduction (framegen / LSS / etc).
Baron of Sun
Posts: 19
Joined: 24 Jul 2024, 13:37

How does the wave form of the BFI signal impact motion blur reduction?

Post by Baron of Sun » 11 Aug 2025, 12:13

When you look at Rtings measurements of BFI on, let's say, current OLEDs, the BFI signal is almost a perfect square wave.

What would happen in terms of motion perception if the BFI signal had a slower rise/fall time (more of a sine wave than a square wave)?

I'd really like to understand the maths behind that and what kind of impact it has on the human eye.
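
To make the question concrete, here's my rough mental model as a Python sketch (the refresh rate, duty cycle, and motion speed are made-up numbers, and I'm assuming perfect eye tracking): light emitted at time t lands at spatial offset speed × t on the retina, so the perceived blur kernel should just be the emission waveform mapped to space.

Code: Select all

import numpy as np

FPS = 120.0      # refresh rate in Hz (made-up number)
DUTY = 0.25      # pulse "on" fraction of the frame (made-up)
SPEED = 960.0    # tracked motion in pixels/second (made-up)

frame = 1.0 / FPS
width = DUTY * frame
t = np.linspace(0.0, frame, 10000, endpoint=False)

# Square pulse: hard on/off, like the near-square wave in the Rtings
# OLED BFI measurements.
square = (t < width).astype(float)

# Sine-like pulse of the same duration: raised cosine, slow rise and fall.
sine_like = np.where(t < width,
                     0.5 * (1.0 - np.cos(2.0 * np.pi * t / width)), 0.0)

def blur_width(waveform):
    # Map emission time to retinal position (x = SPEED * t), treat the
    # normalized waveform as the blur kernel, report RMS width and FWHM.
    x = SPEED * t
    w = waveform / waveform.sum()
    mean = (w * x).sum()
    rms = ((w * (x - mean) ** 2).sum()) ** 0.5
    half = waveform >= 0.5 * waveform.max()
    fwhm = x[half][-1] - x[half][0]
    return rms, fwhm

for name, wf in (("square", square), ("sine-like", sine_like)):
    rms, fwhm = blur_width(wf)
    print(f"{name:9s}: RMS blur {rms:.2f} px, FWHM {fwhm:.2f} px")

If that model is right, the total smear is set by how long the pixel emits light either way, and the pulse shape only changes the edge profile of the smear. What I don't know is how the eye weighs a soft-edged smear against a hard-edged one of the same energy.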

kyube
Posts: 562
Joined: 29 Jan 2018, 12:03

Re: How does the wave form of the BFI signal impact motion blur reduction?

Post by kyube » 19 Aug 2025, 13:08

Baron of Sun wrote:
11 Aug 2025, 12:13
When you look at Rtings measurements of BFI on, let's say, current OLEDs, the BFI signal is almost a perfect square wave.

What would happen in terms of motion perception if the BFI signal had a slower rise/fall time (more of a sine wave than a square wave)?

I'd really like to understand the maths behind that and what kind of impact it has on the human eye.
I'm surprised that Chief hasn't shown interest in this topic, as this is an interesting question!
The closest thing I've come across in terms of this subject is this topic: viewtopic.php?f=7&t=10050

Baron of Sun
Posts: 19
Joined: 24 Jul 2024, 13:37

Re: How does the wave form of the BFI signal impact motion blur reduction?

Post by Baron of Sun » 21 Aug 2025, 09:50

kyube wrote:
19 Aug 2025, 13:08
I'm surprised that Chief hasn't shown interest in this topic, as this is an interesting question!
The closest thing I've come across in terms of this subject is this topic: viewtopic.php?f=7&t=10050
Thanks for sharing this thread! Maybe it's more comfortable for the eye if it is not a perfect square wave? That might be why people say the CRT beam simulator is less flickery: it tries to simulate the decay time of a CRT phosphor.
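
One way I tried to probe this numerically (just a sketch; the 120 Hz rate, 25% duty cycle, and decay constant are invented numbers): compare the harmonic content of a hard square strobe, a raised-cosine pulse, and a CRT-ish instant-rise/exponential-decay pulse, all normalized to the same average brightness. If flicker visibility is driven mostly by the low-order harmonics, the differences should show up there.

Code: Select all

import numpy as np

FPS = 120.0                          # strobe rate in Hz (invented)
N = 4096
t = np.linspace(0.0, 1.0 / FPS, N, endpoint=False)
width = 0.25 / FPS                   # 25% duty pulse (invented)

square = (t < width).astype(float)                     # hard on/off
soft = np.where(t < width,                             # raised cosine
                0.5 * (1.0 - np.cos(2.0 * np.pi * t / width)), 0.0)
crtish = np.exp(-t / (width / 3.0))                    # instant rise, exp decay

for name, wf in (("square", square), ("raised-cos", soft), ("CRT-ish", crtish)):
    wf = wf / wf.mean()                                # equal average brightness
    amp = np.abs(np.fft.rfft(wf)) * 2.0 / N            # amplitude per harmonic
    print(name.ljust(10),
          " ".join(f"{k * FPS:.0f} Hz: {amp[k]:.2f}" for k in range(1, 5)))

I don't want to over-read this, though; whether less energy in the harmonics actually feels less flickery is exactly the part I don't understand.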

kyube
Posts: 562
Joined: 29 Jan 2018, 12:03

Re: How does the wave form of the BFI signal impact motion blur reduction?

Post by kyube » 21 Aug 2025, 10:20

Baron of Sun wrote:
21 Aug 2025, 09:50
Thanks for sharing this thread! Maybe it's more comfortable for the eye if it is not a perfect square wave? That might be why people say the CRT beam simulator is less flickery: it tries to simulate the decay time of a CRT phosphor.
That is a bit of a difficult rabbit hole to go down, as eye strain is very subjective and difficult to extrapolate to a larger population.
My general advice is to stay away from any kind of light flicker, whether it is PWM single-strobe (backlight strobing/BFI) or PWM multi-strobe (prevalent in older LCDs & AMOLED devices).
Here's a thread which touches on this: viewtopic.php?t=6519
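
To put rough numbers on the single-strobe vs multi-strobe trade-off (illustrative values, assuming perfect eye tracking): multi-strobe raises the flicker fundamental, but every extra pulse per frame lands at a different spot on the retina and shows up as a duplicate image.

Code: Select all

FPS = 120.0      # refresh rate in Hz (illustrative)
SPEED = 960.0    # eye-tracked motion in pixels/second (illustrative)

for pulses in (1, 2, 4):
    flicker_hz = FPS * pulses
    spacing_px = SPEED / flicker_hz   # retinal offset between consecutive pulses
    print(f"{pulses} pulse(s)/frame: flicker at {flicker_hz:.0f} Hz, "
          f"{pulses} image(s) spaced {spacing_px:.1f} px apart")

The higher flicker frequency is less visible, but the duplicate images are why multi-strobe looks worse in motion than single-strobe; avoiding the flicker entirely sidesteps both problems.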

This thread: viewtopic.php?p=112963 might be interesting (rolling scan vs global scan backlight strobing)
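
A toy model of why that rolling vs global distinction matters on LCDs (all numbers invented, linear pixel response assumed): with one global flash per frame, rows refreshed early in the scanout have had time to settle, while late rows get caught mid-transition; a rolling strobe gives every row the same settle time.

Code: Select all

ROWS = 1080
FRAME_MS = 1000.0 / 120.0   # 120 Hz frame time
RESPONSE_MS = 4.0           # assumed LCD pixel transition time
FLASH_MS = 7.0              # global flash timed near the end of the frame

def settled(elapsed_ms):
    # Fraction of the pixel transition completed (toy linear model).
    return min(max(elapsed_ms / RESPONSE_MS, 0.0), 1.0)

for row in (0, ROWS // 2, ROWS - 1):
    refreshed_at = FRAME_MS * row / ROWS             # scanout time of this row
    global_strobe = settled(FLASH_MS - refreshed_at) # one flash for all rows
    rolling = settled(RESPONSE_MS)                   # per-row flash after settling
    print(f"row {row:4d}: global strobe sees {global_strobe:.0%} settled, "
          f"rolling scan sees {rolling:.0%}")

That position-dependent settling is what shows up as strobe-crosstalk banding on globally strobed LCDs.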

There's also this waveform of the green phosphor in CRTs: https://youtu.be/Ya3c1Ni4B_U?si=ZwtfvfnqPrzrv3B-&t=918

There's also this video, which compares new high-refresh-rate LCDs & OLEDs vs a CRT. It also shows how the light flicker behaves on CRTs: https://www.bilibili.com/video/BV1Hfb9zCE39/
