Is Local Dimming/Quantum Dots Really a Good Thing?

jorimt
Posts: 912
Joined: 04 Nov 2016, 10:44

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by jorimt » 17 Dec 2017, 13:01

darzo wrote:I've already read the Dell monitor review. It's understandable but conveyed little to me in practice. Likewise, I find those images virtually nonsensical. They remind me of Amazon reviews about how basically monitors emit light: black screens with light all over them. I see the same thing when I turn off my computer, and make no connection to what appears on the screen when something is actually there, because it's so different and normal looking.

I.e. is blooming actually something to be concerned about, something that would stand out and bother someone?
With a non-FALD monitor, edge-lit or otherwise, the black level is static; if you have the screen set to 80% brightness, for instance, a full black screen could be, say, 0.2 nits, and if you have it calibrated to a lower max brightness, the same black screen may have a static black level of 0.1 nits (for reference, OLED panels have a true 0 nits black level, and the best plasma displays reached somewhere around 0.003 nits). In this scenario, with mixed content, anything mastered for black will sit at that minimum nits level, regardless of what else is on screen, bright or dark.

With a FALD display, you now have the native black level of the panel (based on the current max brightness of the display), plus localized, selective dimming that allows lower-than-native black levels by not shining the backlight through specific parts of the LCD. The problem is that the number of LEDs and the number of pixels in the LCD will never match up 1:1; the LED zones will always be larger than individual pixels. That means the more black and the less white on screen, the more the active LED zones will bleed over into non-active zones to illuminate the bright object(s), partially revealing the panel's native black level and creating a visible contrast between the two, which is what we refer to as "blooming artifacts."

The level of blooming is dependent on the native contrast ratio of the given display, the amount of zones, the range between bright and dark in the given content, and the effectiveness of the dimming calculations.
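To make that zone-vs-pixel mismatch concrete, here's a toy sketch (my own illustration, not any manufacturer's actual algorithm) in which each backlight zone is simply driven to the brightest pixel it covers:

```python
import numpy as np

def fald_backlight(luma, zones):
    """Toy FALD model: each zone's drive level is the max pixel
    luminance inside it, then the coarse zone grid is upsampled
    back to pixel resolution (nearest-neighbour)."""
    h, w = luma.shape
    zh, zw = h // zones[0], w // zones[1]
    # per-zone drive level = brightest pixel covered by the zone
    drive = luma.reshape(zones[0], zh, zones[1], zw).max(axis=(1, 3))
    # upsample the zone grid back to pixel resolution
    return np.kron(drive, np.ones((zh, zw)))

# a single bright dot on an otherwise black frame
frame = np.zeros((8, 12))
frame[3, 5] = 1.0
bl = fald_backlight(frame, zones=(4, 6))
# the 2x2-pixel zone containing the dot lights up fully;
# everything else stays at 0 — that lit halo is the bloom
```

A single bright pixel lights its whole zone while the rest of the panel stays dark, which is exactly the halo you perceive as blooming; real controllers use far more sophisticated weighting, temporal filtering, and light-diffusion models, but the zone granularity problem is the same.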

So with a typical, well-calibrated IPS panel equipped with an effective FALD system, quite simply put, due to the variance in tolerance, taste, and perspective from person to person, there isn't a one-size-fits-all answer on whether it will "bother someone." The severity of the blooming artifacts is subjective; some will find them acceptable, some will find them unacceptable.

Since images depicting the artifact don't convey much to you, you're best off seeing a FALD monitor in person before making a final decision on one when they're ultimately made available.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Display: Acer Predator XB271HU OS: Windows 10 Pro MB: ASUS ROG Maximus X Hero CPU: i7-8700k GPU: EVGA GTX 1080 Ti FTW3 RAM: 32GB G.SKILL TridentZ @3200MHz

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by darzo » 17 Dec 2017, 13:18

That's a better explanation. These delayed 4K, 144Hz, HDR, 384-zone array-lit local dimming, quantum dot, 27-inch monitors should be right around the corner. I don't know how Nvidia has timed new-gen card releases in the past, but on the 7th of January they might be announcing the new gaming cards as well. I'm thinking of pairing the Acer model with a Volta gaming Titan. I'll put my faith in them and return the monitor if necessary. The Chief Blur Buster has said that Nvidia works in a rather close partnership with the monitor companies, so that's another reason to be optimistic about how things are implemented, in addition to the price and the brands.

So blooming is not something encountered in non-array lit panels?

jorimt
Posts: 912
Joined: 04 Nov 2016, 10:44

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by jorimt » 17 Dec 2017, 14:23

darzo wrote:So blooming is not something encountered in non-array lit panels?
For non-emissive, LCD-type displays, correct. And edge-lighting with no local dimming is currently sufficient (and more cost effective) for the majority of non-HDR ("SDR") monitors.

But if you want a true HDR monitor, FALD is required to reach the contrast ratio necessary to accurately reproduce 1000 nits-mastered HDR content on LCD-type displays.

One thing they may start introducing once these FALD HDR monitors begin to mature (or that may even be there from the start) is the option to disable local dimming, allowing use of the panel's native contrast ratio (and static black level) with SDR content instead. My 7-year-old FALD TV has that option, after all, so it isn't out of the question.
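Just to put numbers on the contrast point: contrast ratio is simply peak luminance over black level, so local dimming pushes the ratio up by driving the black level down (illustrative figures, not measurements of any specific panel):

```python
def contrast_ratio(peak_nits, black_nits):
    """Contrast ratio = peak luminance / black level."""
    return peak_nits / black_nits

# a typical IPS panel pushed to a 1000-nit HDR peak:
native = contrast_ratio(1000, 0.5)    # -> 2000.0 (2000:1, nowhere near HDR)
# with FALD driving dark zones down to, say, 0.01 nits:
dimmed = contrast_ratio(1000, 0.01)   # ~100000:1
```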

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 17 Dec 2017, 16:38

RealNC wrote:I meant latency vs non-HDR content. So you'd say dimming does need to buffer at least one frame?
In videogames like GTA V (which I worked on), there is something similar: dynamic exposure affects the current frame based on a weighted sum of prior frames (to smooth out high-frequency variance, which would be distracting).

So a TV manufacturer could use the previous frame's brightness results to modulate the LEDs, which at 120Hz or so should work out fine for edge-lit, with no additional lag from waiting for the entire frame to be buffered from the HDMI/DP input before scanout.
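That weighted-sum-of-prior-frames idea is essentially an exponential moving average; a minimal sketch (my own toy version, not GTA V's or any TV's actual code):

```python
class SmoothedBacklight:
    """Exponential moving average of frame brightness: the drive
    level chases the content, so sudden scene changes are damped
    over a few frames instead of snapping instantly."""
    def __init__(self, alpha=0.2, level=0.5):
        self.alpha = alpha   # higher alpha = faster response, less smoothing
        self.level = level   # current backlight drive level, 0..1
    def update(self, frame_avg_luma):
        # blend the new frame's brightness into the running level
        self.level += self.alpha * (frame_avg_luma - self.level)
        return self.level

bl = SmoothedBacklight(alpha=0.5, level=0.0)
# a sudden cut to a full-white frame: the backlight catches up over frames
levels = [bl.update(1.0) for _ in range(4)]
# levels == [0.5, 0.75, 0.875, 0.9375]
```

This is also why you can see a "laggy trail" when prior-frame data feeds the current frame: the level is always a few frames behind the content.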

Not sure about FALD; it could show a tiny bit of ghosting, but LEDs are much bigger than pixels (understatement), so it would likely not be noticeable.

There are tons of techniques in gaming that re-use prior frames' data or metadata in the current frame, to varying effect. If you use prior frames' lighting calculations, you can sometimes see a laggy trail of the lighting catching up to the current state.

For dynamic HDR modes like HDR10+ or Dolby Vision, you could also in theory avoid waiting for the entire chroma planes to be buffered prior to scanout, thanks to the explicit metadata. But virtually ALL TVs now have dynamic backlight dimming of some sort for SDR content, which must use similar min/max/avg luminance accumulator chips. That calculation can be extremely fast; the main lag it introduces is just the initial buffering.

A lot of times in games, they avoid additional full-screen passes for things like post-FX by combining several effects into one mega compute shader. I'm sure TV chips use similar types of optimizations in hardware.
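For what it's worth, that min/max/avg accumulator is trivial to express; a single-pass sketch (illustrative only, obviously the real thing is fixed-function hardware, not Python):

```python
def luma_stats(pixels):
    """One pass over the frame's luminance values, accumulating the
    min, max, and average — the per-frame statistics a dynamic
    dimming algorithm needs before it can pick a backlight level."""
    lo, hi, total = float("inf"), float("-inf"), 0.0
    for p in pixels:
        if p < lo:
            lo = p
        if p > hi:
            hi = p
        total += p
    return lo, hi, total / len(pixels)

# luma_stats([0.0, 0.25, 0.75, 1.0]) -> (0.0, 1.0, 0.5)
```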

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 17 Dec 2017, 16:47

darzo wrote:So blooming is not something encountered in non-array lit panels?
What's really funny is that for years, "HDR" in videogames was basically just a blooming effect plus a tone-mapping curve, and now with FALD TVs there is blooming in hardware that produces much the same look.
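For anyone curious, the game-style "HDR" I'm referring to boils down to roughly this (a toy 1-D sketch with made-up numbers, not any shipping engine's code): threshold out the bright parts, blur them back over their neighbours, and compress the result with a tone-mapping curve:

```python
def tone_map(luma):
    """Classic Reinhard curve: compresses unbounded scene
    luminance into the [0, 1) displayable range."""
    return luma / (1.0 + luma)

def bloom_1d(luma, threshold=0.8):
    """Toy 1-D bloom: energy above the threshold spills into
    neighbouring pixels via a small 3-tap blur."""
    bright = [max(0.0, v - threshold) for v in luma]
    blurred = [
        0.25 * bright[max(i - 1, 0)]
        + 0.5 * bright[i]
        + 0.25 * bright[min(i + 1, len(bright) - 1)]
        for i in range(len(bright))
    ]
    return [v + b for v, b in zip(luma, blurred)]

# a very bright pixel spills light onto its neighbours,
# much like a FALD zone spills backlight around a bright object
row = bloom_1d([0.0, 0.0, 2.0, 0.0, 0.0])
mapped = [tone_map(v) for v in row]
```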

I don't find it objectionable; I'd much rather have even a mediocre FALD implementation any day of the week. Blooming might be annoying around super-bright objects in the scene, but at least the black level of the entire frame isn't wrecked by one bright object.

I'm not an expert on FALD implementations, but I seem to recall that it also helps with backlight uniformity, since you can tune the relative brightness of each zone independently to even out a grayscale image. Also, FALD is dirt cheap now; tons of TVs have it, like Vizio (which makes some of the lower-lag gamer TVs now) and HiSense. There is really no reason not to get it. The way I see edge-lit is this: the "blooming" is everywhere instead of localized around bright objects, which in any case looks fairly natural, similar to what cameras and lenses produce. Anyone who owns a Rift or Vive will know what I mean (although I do agree that lens glare is annoying, and I'm looking forward to next year's improvements in this area).

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 17 Dec 2017, 16:53

darzo wrote:I.e. is blooming actually something to be concerned about, something that would stand out and bother someone?
I'm a big fan of OLED, but some pro TV reviews have shown that the higher-end FALD sets achieve a much bigger colour volume, so unless you're watching a lot of dark sci-fi and horror in a pitch-black theater room, a FALD set with much higher nits will look "better" to most people: more "pop".

Agreed, in an ideal world we'd all have ideal displays, but I would really worry about spending a ton of cash on an OLED and ending up with retention.

Then again, I used to have a plasma and was careful to use the orbiter and rotate the start bar and icons around my desktop, and I never got any burn-in. So it should be OK to game on an OLED if you're careful.

sharknice
Posts: 283
Joined: 23 Dec 2013, 17:16
Location: Minnesota

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by sharknice » 18 Dec 2017, 01:25

BattleAxeVR wrote:
darzo wrote:I.e. is blooming actually something to be concerned about, something that would stand out and bother someone?
I'm a big fan of OLED, but some pro TV reviews have shown that the higher end FALD sets achieve a much bigger colour volume so unless you're watching a lot of dark sci-fi and horror in a pitch-black theater room, then a FALD with much higher nits will look "better" to most people, more "pop".

Agree in an ideal world we'd all have ideal displays, but I would really worry spending a ton of cash on an OLED and end up with retention.

Then again, I used to have a plasma and was careful to use the orbiter and rotate the start bar and icons around my desktop and never got any burn in. So it should be OK to game on an OLED if you're careful.
Unless you're in a brightly lit store or room, or are watching TV with the curtains open mid-day, I think OLED looks better by a large margin. True blacks add much, much more than a brighter image does. Even with the lights on, I still think true blacks look better, just not by as much.

open
Posts: 223
Joined: 02 Jul 2017, 20:46

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by open » 18 Dec 2017, 12:04

jorimt wrote:As a side note, there are two entirely different quantum dot technologies. The first you already mentioned, the second is an emissive alternative to OLED, but has no set date for production (e.g. it is currently in the theoretical/conceptual phase for practical mainstream mass production in consumer displays).
I'm still waiting on the emissive QD tech where the dots emit colors themselves. Really, I just want something that compares to the DLP TV I used 5 or so years ago. It was 120Hz max, but it had zero perceivable pixel response and was insanely bright. Spinning the camera in games was clearer than spinning around in real life. God, that TV was insane.

jorimt
Posts: 912
Joined: 04 Nov 2016, 10:44

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by jorimt » 18 Dec 2017, 12:28

@open, emissive quantum dot tech is likely a long wait yet.

But unfortunately, anything emissive is subject to retention due to uneven aging. This isn't an issue with balanced media viewing (I own an LG C7 OLED solely for movies/TV use, and haven't had a hint of permanent retention issues), but it's definitely something they'll struggle to solve completely for dedicated, UI-heavy gaming monitors (let alone prolonged desktop and web use).

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 18 Dec 2017, 14:59

sharknice wrote:Unless you're in a brightly lit store or room, or are watching TV with the curtains open mid-day I think OLED looks better by a large margin. True blacks add much, much more than a brighter image. Even with lights on I still think true blacks look better, just not as much better.
I'd much rather own an OLED since I mostly watch in the dark, but if I were buying a daytime TV for the wife and kids, I'd save some cash and get a dirt-cheap FALD LCD with Dolby Vision for a fraction of the price, with a measurably bigger colour volume that makes HDR pop like crazy.

That said, the newer OLEDs are getting better white levels too. OTOH LCDs get bigger and bigger and cheaper and cheaper every year. We'll see which tech wins out, I hope it's OLED but regardless it's the consumer who wins when there's competition.
