Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 14 Jan 2024, 17:50
by Chief Blur Buster
Hybred wrote: ↑11 Jan 2024, 09:28
Can backlight strobing & things like g-sync pulsar work on MiniLED LCDs? Or would the dimming zones cause incompatibility.
In theory, no.
In reality, yes, it is a problem.
Not all pixels refresh at the same time, so it becomes a timing-precision problem: display manufacturers didn't really design for perfect 1:1 sync between the LCD scanout and the FALD scanout (synchronizing two scanouts perfectly).
Local dimming electronics (FALD pulse-timing controllers) are too slow, or too inflexibly programmable, to stay perfectly in sync with LCD scanout (www.blurbusters.com/scanout). Local dimming often has motion artifacts such as 1-frame lag-behind effects, and it also adds input latency, so there are some difficulties.
These are solvable, but... the race to the bottom for cheaper 3-figure-priced local dimming has made it hard to add luxury features to MiniLED controllers. The situation could very well change, but none of the MiniLED controllers I have tested so far have enough timing precision to strobe at high quality (yet).
Related topic:
Why Do Lower Hz Have Clearer Strobing?
See the engineering diagrams.
Except add the engineering-complexity dimension of FALD, with precisely timed scanning-backlight behavior. FALD could become a perfect scanning backlight if the timing precision is re-engineered with new FALD controller chips. But such special FALD controller chips don't yet exist on the consumer market. Ouch.
Blur Busters used to be named scanningbacklight[.]com back in 2012, when we discovered the engineering problems of scanning backlights at
www.blurbusters.com/faq/scanningbacklight ... Today, FALD means near-perfect scanning backlights are theoretically possible, but we need better timing-controller chips for FALDs first. You wouldn't need Large Vertical Total tricks anymore, since you could flash each row out-of-phase with the LCD scanout, get low crosstalk, enjoy delightfully good "VSYNC OFF" latency mechanics (panel:scanout sync, TOP=CENTER=BOTTOM preserved), and be a strobe dream for LCDs. But we're not quite there yet.
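To make the idea concrete, here's a minimal sketch of a FALD zone-row flashing out-of-phase with LCD scanout. All numbers (refresh rate, zone count, settling delay) are illustrative assumptions, not from any shipping controller:

```python
# Sketch of a FALD zone-row acting as a scanning backlight, flashing
# out-of-phase with LCD scanout. All numbers are illustrative assumptions.

REFRESH_HZ = 120
SCANOUT_TIME = 1.0 / REFRESH_HZ   # ~8.33 ms for a top-to-bottom scanout
FALD_ROWS = 16                    # hypothetical vertical dimming-zone rows
GTG_SETTLE = 0.003                # assumed 3 ms pixel-settling delay

def ideal_flash_time(zone_row: int) -> float:
    """Seconds after the start of a refresh to flash one zone-row, so it
    illuminates pixels only after they finish transitioning (low crosstalk)."""
    zone_center = (zone_row + 0.5) / FALD_ROWS   # 0..1 position down the panel
    reach_time = zone_center * SCANOUT_TIME      # when scanout passes the zone
    return reach_time + GTG_SETTLE               # wait for pixels to settle

for z in range(FALD_ROWS):
    print(f"zone-row {z:2d}: flash at {ideal_flash_time(z) * 1000:.2f} ms")
```

Each zone-row just flashes a fixed delay behind the scanout beam, which is why no Large Vertical Total trick would be needed; the hard part in practice is giving the FALD controller sub-millisecond timing precision.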
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 18 Jan 2024, 12:22
by Supermodel_Evelynn
If I could get a monitor, even a 1080p one, that does proper VRR and BFI at the same time, with good brightness and CRT-like 60 Hz clarity,
I would pay $1000 USD for a monitor like that.
I am just hoping Chief can pull it off with the upcoming Blur Buster 2.0 monitors.
He said BenQ manages to do it with some sort of trick, overclocking the LED to flash at 1000 nits.
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 18 Jan 2024, 21:04
by Chief Blur Buster
Supermodel_Evelynn wrote: ↑18 Jan 2024, 12:22
He said BenQ manages to do it with some sort of trick overclocking the LED to flash at 1000 nits.
Essentially yes, a brief overvoltage boost (a metaphorical overclock).
The Talbot-Plateau Theorem requires it: a 1000-nit flash lit 25% of the time averages out to 250 nits. BenQ applies quite a large voltage boost to their LEDs to keep strobing bright.
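The arithmetic behind that can be sketched in a few lines (a toy model of Talbot-Plateau averaging, with illustrative numbers):

```python
# Toy model of the Talbot-Plateau law: the eye integrates fast flicker
# to its mean luminance, so perceived nits = peak nits x duty cycle.

def perceived_nits(peak_nits: float, duty_cycle: float) -> float:
    """Time-averaged luminance of a strobed backlight."""
    return peak_nits * duty_cycle

def duty_cycle(pulse_ms: float, refresh_hz: float) -> float:
    """Fraction of each refresh cycle the backlight is lit."""
    return pulse_ms * refresh_hz / 1000.0

# A 1000-nit flash lit 25% of the time averages out to 250 nits:
print(perceived_nits(1000.0, 0.25))     # 250.0

# Conversely, a 2 ms pulse per 120 Hz refresh is a 24% duty cycle,
# so matching a 300-nit sample-and-hold picture needs a ~1250-nit flash:
print(300.0 / duty_cycle(2.0, 120.0))
```

This is why strobed modes need the big voltage boost: the shorter the pulse, the higher the peak luminance must be to keep the average the same.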
knypol wrote: ↑13 Jan 2024, 15:37
Is BFI better than ULMB or Dyac? Can it work with VRR? From what i understand it also divides in half current refresh? GSync Pulsar in short is ULMB + VRR at the same time?
Neither is better; it's Right Tool for Right Job.
- Refresh-cycle-level BFI flickers less but reduces less blur (due to its rolling or semi-rolling nature, even if not sub-refresh).
- Backlight strobing flickers more (global flash) but reduces more blur.
This is because motion blur is proportional to frame visibility time (frametime on sample-and-hold, pulsetime on strobed displays).
That's why you've got the two methods of blur busting:
- Brief frames via brute framerate (non-BFI/strobe method)
- Brief frames via brief flashes (BFI/strobe method)
And it's easier to flash a backlight for 1 ms than to display a panel refresh cycle for 1 ms (that would require 1000 fps at 1000 Hz).
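That proportionality is simple enough to put into a quick sketch (eye-tracked smear ≈ scroll speed × visibility time; the scroll speed is an illustrative number):

```python
# Sketch: perceived motion blur (MPRT-style) is proportional to how long
# each frame stays visible: blur in pixels ~= speed x visibility time.

def blur_px(speed_px_per_s: float, visibility_ms: float) -> float:
    """Smear width in pixels for eye-tracked motion."""
    return speed_px_per_s * visibility_ms / 1000.0

SPEED = 1000.0  # px/s, a fast horizontal scroll (illustrative)

# Sample-and-hold at 120 Hz: each frame is visible the full ~8.33 ms.
print(blur_px(SPEED, 1000.0 / 120.0))   # ~8.3 px of smear
# A 1 ms backlight strobe at the same 120 Hz:
print(blur_px(SPEED, 1.0))              # 1.0 px of smear
# Matching that 1 ms via brute framerate alone needs 1000 fps at 1000 Hz:
print(blur_px(SPEED, 1000.0 / 1000.0))  # also 1.0 px
```

The last two lines show why the two blur-busting methods are equivalent in clarity: a 1 ms strobe at 120 Hz and a 1 ms frame at 1000 Hz both leave a 1-pixel smear at this speed.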
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 05 Mar 2024, 08:21
by Vad1us
Is NVIDIA Pulsar out? Are there any reviews? I saw that monitors that should support it are selling. I am interested in this technology, but I can't find any reviews.
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 07 Jul 2024, 23:57
by phzera
Any news? Will the PG248QP be supported?
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 21 Aug 2024, 02:51
by Discorz
Some news on the GSYNC module and Pulsar. From now on, Nvidia G-SYNC feature set will be integrated into MediaTek scalers.
Justin Walker from NVIDIA wrote:Integrating G-SYNC into MediaTek scalers eliminates the need for a separate G-SYNC module, streamlining the production process and reducing costs.
https://tftcentral.co.uk/news/nvidia-ab ... er-scalers
https://videocardz.com/newz/nvidia-g-sy ... ed-modules
https://www.computerbase.de/2024-08/koo ... rfluessig/
https://www.theverge.com/2024/8/20/2422 ... artnership
Asus PG27AQNR, Acer XB273U F5, and AOC AG276QSG2 are the first three monitors with a MediaTek scaler and the full G-SYNC feature set, including Pulsar.

Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 21 Aug 2024, 06:45
by Baron of Sun
Will these GSync Pulsar monitors only work with devices with an NVidia graphics card or with any device? I’m interested in that technology because of the increase in motion clarity, but I’m wondering if that can be used with the Switch, PS5, too? Does anyone know?
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 21 Aug 2024, 08:04
by jorimt
Discorz wrote: ↑21 Aug 2024, 02:51
Some news on the GSYNC module and Pulsar. From now on, Nvidia G-SYNC feature set will be integrated into MediaTek scalers.
Justin Walker from NVIDIA wrote:Integrating G-SYNC into MediaTek scalers eliminates the need for a separate G-SYNC module, streamlining the production process and reducing costs.
Good news if it retains feature/performance parity and truly reduces costs.
It may even allow a resurgence of more native G-SYNC monitors over software-level G-SYNC Compatible monitors, the latter of which have flooded the market since their release (to mixed effect).
And who knows, this could even ultimately pave the way for hardware-level G-SYNC being integrated into some TV scalers down the road? *fingers-crossed*
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 28 Aug 2024, 20:13
by GFresha
Baron of Sun wrote: ↑21 Aug 2024, 06:45
Will these GSync Pulsar monitors only work with devices with an NVidia graphics card or with any device? I’m interested in that technology because of the increase in motion clarity, but I’m wondering if that can be used with the Switch, PS5, too? Does anyone know?
Based on the reviews I read, it seems yes, only Nvidia cards will be able to use features like Pulsar, the Reflex Analyzer, ULMB 2, etc.
Re: G-SYNC Pulsar: Blur Reduction or Black Frame Injection in conjunction with Variable Refresh Rate or Adaptive Sync
Posted: 29 Aug 2024, 01:48
by Discorz
What reviews? Pulsar is not out yet.
Reflex Analyzer, variable overdrive, and other Nvidia features, except ULMB, worked fine on my AW2521H with an old RX 580. ULMB was an issue because it required VRR to be disabled first, and AMD doesn't have an option to fully disable VRR like Nvidia does. I believe the same holds for ULMB 2. But later on, someone found a CRU trick that can supposedly disable it; I didn't try the trick at the time, though.
VRR most likely doesn't need to be disabled with Pulsar anyway, so I don't see why it shouldn't work. And the monitor's scaler is doing the work, not the GPU. But there is a possibility it won't, if for some reason everything works differently. We'll have to see, I guess.