Is Local Dimming/Quantum Dots Really a Good Thing?

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Is Local Dimming/Quantum Dots Really a Good Thing?

Post by darzo » 05 Dec 2017, 00:25

I was looking at 4K TVs yesterday and noticed something very curious. Both Samsung and Sony have released newer and supposedly better TV models that switch back from local dimming to, I presume, some sort of edge lighting, or whatever the proper term is. This is quite concerning to me as someone considering the coming $2000 4K 144Hz gaming monitors. This site has claimed that the biggest draw of these monitors is local dimming, while TFT Central has claimed that local dimming accounts for why these monitors will be so expensive. Yet in two instances I see TVs switching away from local dimming, with seemingly little impact on price however you look at it.

In theory local dimming/array lighting/"quantum" dots sound great, but perhaps in practice they haven't been. These 4K gaming monitors have been in development for a while, delayed by as much as half a year, and I presume they have taken their cues from TVs, given the similar features. But if TV producers have found something particularly negative about local dimming and removed it from their newer models, gaming monitors might just be lagging behind. The gaming monitors I refer to will, after all, be the first to offer 4K with HDR and local dimming at a higher refresh rate, something TVs have been doing for at least a year.

Is it possible we'll be getting screwed at a $2000 price tag? Is there something far less than ideal about local dimming that TV companies have found through experience that has been determined too late for computer monitor companies to reverse course?

User avatar
RealNC
Site Admin
Posts: 3071
Joined: 24 Dec 2013, 18:32
Contact:

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by RealNC » 05 Dec 2017, 01:00

These are two different things.

Can't say much about local dimming, but QD is a good thing. It improves the color quality of the panel (by feeding it higher quality light from the backlight) and has zero drawbacks. It can even lower the cost of the monitor, assuming the monitor was meant to achieve that level of color quality to begin with: without QD, you'd have to use a good, expensive backlight; with QD, you can use a cheap, low quality backlight and have the QD filter fix the light that comes out of it.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by darzo » 05 Dec 2017, 02:38

http://www.tftcentral.co.uk/articles/hdr.htm

Read under Full Array Local Dimming. 384 zones corresponds to the 384 "quantum dots" the coming monitors will have and is what has been seen in TVs.

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by darzo » 13 Dec 2017, 11:39

No thoughts on TV models going away from local dimming and for how long these monitors have been delayed?

User avatar
jorimt
Posts: 1323
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by jorimt » 13 Dec 2017, 18:19

The downsides to full array local dimming are:

- Form factor (LEDs [and more of them compared to an edge-lit design] are situated at the back, making the unit thicker and heavier)
- Varying levels of "bloom" artifacts (depending on the number of LED zones and the display type; IPS panels will have more innate glow, for instance)
- Manufacturing costs (more difficult/costly to fabricate)
- Programming/software (more LED zones means more to program for at the firmware level in order to effectively adjust dynamic, real-time dimming across hundreds of zones)
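The programming item above can be sketched in code. This is a minimal, hypothetical illustration (the function name, zone grid, and brightness floor are all made up, not taken from any real firmware) of the kind of per-zone decision a FALD controller has to make every frame:

```python
# Hypothetical sketch of per-zone FALD backlight control: each zone's LED
# level follows the brightest pixel it covers, so highlights stay bright
# while dark zones dim for deeper blacks.

def zone_backlight_levels(frame, zone_rows, zone_cols):
    """frame: 2D list of pixel luminance values in [0.0, 1.0].
    Returns a zone_rows x zone_cols grid of backlight levels."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zone_rows, w // zone_cols
    levels = [[0.0] * zone_cols for _ in range(zone_rows)]
    for zr in range(zone_rows):
        for zc in range(zone_cols):
            peak = 0.0
            for y in range(zr * zh, (zr + 1) * zh):
                for x in range(zc * zw, (zc + 1) * zw):
                    peak = max(peak, frame[y][x])
            # A floor keeps zones from shutting off entirely, which would
            # crush near-black detail; real firmware also has to smooth
            # zone transitions over time to hide blooming.
            levels[zr][zc] = max(peak, 0.05)
    return levels
```

Even this toy version hints at the tuning problem: the peak-tracking rule, the floor value, and the temporal smoothing all trade blooming against black depth, per zone, in real time.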

So the answer is: you're seeing less full array backlight local dimming (this has really always been the case; I have a FALD TV from 7 years back, so this isn't a new idea) and more edge-lit local dimming, or plain old edge lighting, because the latter is easier, cheaper, and makes the unit lighter and thinner.

Edge-lit displays, however ("local" dimming or not), usually can't come close to producing the contrast ratio necessary to reach the full potential of 1000+ nit HDR the way full array local dimming can with hundreds of zones.

As far as is known, the primary reason the upcoming FALD monitors you mention were delayed wasn't because of FALD itself, but because AU Optronics (the panel provider) had yet to mass produce the 4k/144Hz panels (a first in the mainstream monitor market) that will be featured in these models.

Last I heard, panel production for those monitors was to begin last month.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Display: Acer Predator XB271HU OS: Windows 10 Pro MB: ASUS ROG Maximus X Hero CPU: i7-8700k GPU: EVGA GTX 1080 Ti FTW3 RAM: 32GB G.SKILL TridentZ @3200MHz

darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by darzo » 14 Dec 2017, 01:02

I find it hard to believe TV manufacturers are switching away from local dimming just because it makes their TVs thicker and is harder and more costly to implement. Once a higher level of quality is attained, it isn't ordinarily moved back down from. It's common knowledge why the PC monitors were delayed; I'm not taking it for granted.

User avatar
RealNC
Site Admin
Posts: 3071
Joined: 24 Dec 2013, 18:32
Contact:

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by RealNC » 14 Dec 2017, 05:48

In theory, doesn't local dimming introduce at least a full frame of latency? Or would a display be able to start dimming even before it has the full frame data?

User avatar
jorimt
Posts: 1323
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by jorimt » 14 Dec 2017, 11:33

darzo wrote:I find it hard to believe TV manufacturers are switching away from local dimming just because it makes their TVs thicker and is harder and more costly to implement. Once a higher level of quality is attained, it isn't ordinarily moved back down from. It's common knowledge why the PC monitors were delayed; I'm not taking it for granted.
Again, FALD was a trend 7+ years ago (dozens of models, including very expensive "plasma alternatives" with massive numbers of zones), but was phased out as soon as manufacturers could start using edge lighting in its place. This was primarily over form factor and cost concerns (the majority of consumers like their TVs thin [at least that's what manufacturers think], and they like them cheap; pure picture quality, unfortunately, often comes second for all but us enthusiasts).

The only reason FALD has had a resurgence is that it's required for full HDR reproduction on LCD-type technology; I highly doubt it would have returned if it weren't for HDR.

TFT Central reviewed a FALD monitor (pretty much what we'll be getting in high-Hz form later next year), with some good material on its inner workings:
http://www.tftcentral.co.uk/reviews/del ... 8q.htm#hdr
RealNC wrote:In theory, doesn't local dimming introduce at least a full frame of latency? Or would a display be able to start dimming even before it has the full frame data?
Interesting question, and I'm not 100% certain, but I don't think FALD typically results in additional frames of latency. You could take a look at a few FALD TV reviews on rtings.com and draw your own conclusions from the lag numbers, though.

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 14 Dec 2017, 11:58

FALD is basically a necessity for decent HDR on LCDs; they simply don't have the native contrast to do HDR justice with edge lighting (though there are some edge-lit HDR-capable TVs now).

FALD and quantum dots have nothing to do with one another. The individual LEDs in a FALD backlight are huge, while quantum dots are very, very tiny and not emissive on their own. FALD is just another way of saying a TV has multiple independently controlled LEDs for its backlight instead of a single light source across the entire surface. Quantum dots get hit by the blue LEDs and re-emit red and green tuned to certain wavelengths for a wider colour gamut, so they're a very good thing to have, indeed. However, whether the backlight uses one blue LED or many is immaterial to how QD films are laid out. All that quantum dots need to emit light at their tuned wavelength is to be hit by incoming light at a higher frequency (i.e. a shorter wavelength, carrying more energy per photon).

E.g. if you wanted a blue quantum dot layer, you'd most likely need ultraviolet LEDs (UV wavelength < blue wavelength), though you could also just use a shorter-wavelength blue. The reason QD films use blue backlights is that only two films are then needed, for red and green, since both have longer wavelengths than blue. By precisely tuning the density of quantum dots in the films, you can get a D65 white point coming out the other side using only blue LEDs (or lasers) to stimulate the quantum dots.
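The down-conversion constraint above is just photon energy arithmetic, E = hc/λ. A quick check (wavelengths are typical illustrative values for LED/QD emission peaks, not from any specific product):

```python
# Photon energy E = h*c / lambda: a quantum dot can only down-convert,
# i.e. absorb a photon and re-emit at lower energy (longer wavelength).
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a given wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / 1.602e-19  # joules -> eV

blue, green, red = 450, 530, 630  # nm, typical emission peaks

# Blue photons (~2.76 eV) carry more energy than green (~2.34 eV) or
# red (~1.97 eV), so one blue backlight can pump both QD films; red
# light could never pump a green or blue dot.
assert photon_energy_ev(blue) > photon_energy_ev(green) > photon_energy_ev(red)
```

This is why the blue LED + red/green QD film combination works with exactly two films: blue sits at the high-energy end of the visible gamut, so everything else is downhill.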

There is no reason why FALD would have any more latency than edge lighting, as the entire frame must be present in the buffer prior to calculating the frame's luminance for dynamic (local or global) dimming either way. The processing time for such calculations is trivial and won't add anywhere near a full frame of latency if it's done competently.

In theory, it's possible for an LCD TV to start scanning out while being fed an RGB signal line by line, and I assume many "gamer" monitors already do this to reach sub-frame latency. But when you feed an HDR image over HDMI 2.0a, you need to pass it as YCbCr 4:2:2 at best, which is a planar format instead of a packed one, i.e. the chroma planes are located after the luma plane in each frame.

Therefore, the lowest latency you could possibly get with an HDR10 signal, at 4K60 at least, would be roughly 50% of 16ms using 4:2:2 and 66% of 16ms using 4:2:0 (e.g. Dolby Vision uses 4:2:0 exclusively AFAICT).
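Those 50%/66% figures follow from luma's share of the frame data, under the post's assumption that the full luma plane must arrive before scanout can begin. A quick back-of-the-envelope check (illustrative only):

```python
# Under the planar assumption (full luma plane first, then chroma), the
# earliest a display could start scanning out is once the luma plane has
# landed. Luma's share of the total frame data per subsampling mode:
def luma_fraction(chroma_subsampling):
    # Chroma samples transmitted per 4 luma samples:
    #   4:2:2 -> 2 Cb + 2 Cr = 4;  4:2:0 -> 1 Cb + 1 Cr = 2
    chroma = {"4:2:2": 4, "4:2:0": 2}[chroma_subsampling]
    return 4 / (4 + chroma)

frame_ms = 1000 / 60  # ~16.7 ms per frame at 4K60

for mode in ("4:2:2", "4:2:0"):
    f = luma_fraction(mode)
    print(f"{mode}: luma plane done ~{f:.0%} into the frame "
          f"(~{f * frame_ms:.1f} ms)")
```

So 4:2:2 puts the luma plane at 50% of the frame period (~8.3 ms) and 4:2:0 at ~66% (~11.1 ms), matching the figures in the post. Note this only holds if the signal really is delivered plane-by-plane rather than pixel-packed.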

BattleAxeVR
Posts: 44
Joined: 14 Dec 2017, 11:38

Re: Is Local Dimming/Quantum Dots Really a Good Thing?

Post by BattleAxeVR » 14 Dec 2017, 12:07

Another good thing about full array local dimming, if it hasn't already been mentioned (it probably has), is that low persistence / motion blur reduction modes can do a rolling scan, strobing each row of LEDs in the backlight from top to bottom.
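The rolling scan idea can be sketched as a simple strobe schedule. All numbers here are illustrative assumptions (8 LED rows, a 3 ms settle delay), not specs of any real display:

```python
# Sketch of a rolling-scan strobe schedule: each backlight row flashes a
# fixed settle delay after the LCD scanout reaches it, chasing the scan
# from top to bottom so every row is lit only after its pixels settle.

def rolling_scan_schedule(led_rows, frame_ms=1000 / 60, settle_ms=3.0):
    """Return (row, strobe_time_ms) pairs within one frame."""
    scan_per_row = frame_ms / led_rows
    return [(row, row * scan_per_row + settle_ms) for row in range(led_rows)]

for row, t in rolling_scan_schedule(8):
    print(f"LED row {row} strobes at {t:.2f} ms")
```

Because each row strobes independently, the backlight can follow the LCD's own top-to-bottom scan; a single edge-lit strip would have to flash the whole screen at once instead.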

Post Reply