
Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 00:04
by RLBURNSIDE
flood wrote:now if they can come up with something like low-latency tone-mapping so that hdr content can be displayed reasonably on displays with more limited luminance... that would be interesting.
Have you seen HDR video in person, side by side with SDR content? It's night and day; it's not a buzzword at all, this is the real deal.

What's really interesting is Technicolor's research into an algorithm that lets TVs accept SDR content and expand it intelligently, using histogram-based and rules-based educated guesses, to show it in HDR on an HDR TV. Forget your old SDR TV set; it's toast in the era of UHD Blu-ray, which has mandatory HDR10 encoding. If you're happy with SDR you can always stick to your 1080p FHD TV and old Blu-rays, or just turn down the brightness / dynamic range on any new HDR TV.
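Technicolor hasn't published the exact algorithm as far as I know, but the basic idea of a histogram-guided expansion is easy to sketch. Something like this toy version in Python (the knee, thresholds, boost rule and nit values are placeholders I made up, not anything from their research):

Code:
import numpy as np

def expand_sdr_to_hdr(sdr_frame, sdr_peak_nits=100.0, hdr_peak_nits=1000.0):
    # Toy histogram-guided SDR->HDR expansion (inverse tone mapping).
    # sdr_frame: float array in [0, 1], gamma-encoded SDR signal.
    linear = np.clip(sdr_frame, 0.0, 1.0) ** 2.2      # rough gamma-2.2 linearization

    # Histogram-based "educated guess": how much of the frame is already bright?
    hist, _ = np.histogram(linear, bins=64, range=(0.0, 1.0))
    bright_fraction = hist[48:].sum() / linear.size

    # Rule: stretch highlights harder when the frame actually contains them.
    boost = 1.0 + (hdr_peak_nits / sdr_peak_nits - 1.0) * min(bright_fraction * 10.0, 1.0)

    # Piecewise expansion: shadows and mid-tones roughly preserved, highlights stretched.
    knee = 0.7
    expanded = np.where(linear < knee, linear, knee + (linear - knee) * boost)

    # Return absolute luminance in nits for the HDR panel, capped at its peak.
    return np.minimum(expanded * sdr_peak_nits, hdr_peak_nits)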

There are, however, big-name projector manufacturers doing HDR the way we're discussing it in this thread: using a pair of stacked projectors, each covering its own portion of the dynamic range (with one boosted in brightness relative to the other, or perhaps with an ND filter dimming one of them instead).

PS: tone-mapping is inherently low-latency; it should take no more than 1 ms to process a single frame using a shader, let alone the custom ASICs used in televisions and projectors these days. It's all dedicated hardware. Even upscaling is trivial. I always get a huge laugh when people on the internet claim that upscaling is some kind of complex, mysterious magic that's always going to add a ton of lag. Uhh, no.
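To put a number on how cheap it is: a global tone-map is just a per-pixel curve. Here's a plain extended-Reinhard operator in Python/numpy purely for readability (it's a textbook operator, not what any particular TV actually ships):

Code:
import numpy as np

def reinhard_tonemap(luminance, white=4.0):
    # Extended Reinhard: L_out = L * (1 + L / white^2) / (1 + L).
    # Pure per-pixel arithmetic: no neighbourhood reads, no frame history,
    # which is exactly why a shader or fixed-function block chews through it
    # in well under a millisecond per frame.
    l = np.asarray(luminance, dtype=np.float32)
    return l * (1.0 + l / (white * white)) / (1.0 + l)

# e.g. a 4K frame of linear scene luminance, mapped back into [0, 1]
frame = np.random.rand(2160, 3840).astype(np.float32) * 10.0
sdr = np.clip(reinhard_tonemap(frame), 0.0, 1.0)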

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 00:10
by RLBURNSIDE
spacediver wrote:I believe that there are initiatives in place to do just this - algorithms that scale the mapping depending on the luminance range of the display. I remember reading or watching something about this recently.
What the new HDMI 2.0a spec does (and Dolby Vision as well) is encode in the video stream metadata describing the reference display used at the mastering stage. This allows the TV to do its best to give the most accurate picture it can within its own specs. Essentially it's a form of automatic dynamic calibration, and I think it's done on a frame-by-frame basis, if I'm not mistaken. And it's not just luminance range but color gamut too. It's very important to tell the TV what the mastering display's calibration was at the time of mastering. That takes all the guesswork out of TV calibration and opens up an era of fully automatic, dynamic calibration. That's pure win from a content producer's point of view: it means your audience isn't seeing your painstakingly made shots on some grossly distorted or uncalibrated TV.
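Roughly speaking, the stream carries the mastering display's black and peak luminance (plus things like MaxCLL in HDR10), and the TV remaps the content into its own range instead of guessing. A crude sketch of the idea; the actual transfer curves TVs use are their own secret sauce, so the 90% knee and linear roll-off here are just my own placeholders, not anything from the HDMI 2.0a or Dolby Vision specs:

Code:
def adapt_to_display(content_nits, mastering_peak=4000.0, panel_peak=700.0,
                     panel_black=0.05):
    # The metadata tells us the mastering display's peak (and black level,
    # MaxCLL, etc.); the TV uses it to remap content into its own range.
    knee = 0.9 * panel_peak                       # start compressing near the panel's peak
    out = []
    for nits in content_nits:
        nits = max(nits, panel_black)             # lift sub-black values to the panel floor
        if nits <= knee:
            out.append(nits)                      # the panel can show this directly
        else:
            # Compress [knee, mastering_peak] into [knee, panel_peak].
            t = min((nits - knee) / (mastering_peak - knee), 1.0)
            out.append(knee + t * (panel_peak - knee))
    return out

# highlights mastered for a 4000-nit reference display, shown on a 700-nit panel
print(adapt_to_display([0.01, 100.0, 650.0, 1000.0, 4000.0]))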

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 12:16
by thatoneguy
RLBURNSIDE wrote: Forget your old SDR TV set; it's toast in the era of UHD Blu-ray
That's indeed what display manufacturers want you to think.
The fact is standard Blu-ray is still being beaten by DVDs.
Ultra HD Blu-ray is dead on arrival at this point.

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 12:21
by thatoneguy
spacediver wrote: HDR is probably the most important development in display technology since the transition to HD. It is far from a gimmick.
They said the same about 3D Televisions

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 12:54
by aeliusg
thatoneguy wrote:They said the same about 3D Televisions
thatoneguy wrote: That's indeed what display manufacturers want you to think.
The fact is standard Blu-ray is still being beaten by DVDs.
Ultra HD Blu-ray is dead on arrival at this point.
Haven't we all realized at this point that market acceptance says fuck-all about the merits, or lack thereof, of these technologies?

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 15:05
by spacediver
thatoneguy wrote: The fact is standard Blu-ray is still being beaten by DVDs.
In terms of market share, or in terms of image quality? The discussion here has nothing to do with market acceptance - we're discussing the technological merits.

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 15:30
by RLBURNSIDE
thatoneguy, why don't you try posting something interesting or worth reading instead of your snide cynicism? Being a cynic is lazy; actually trying to innovate is harder. That's what we're trying to do here, and it's pretty clear you're wasting not only our time but your own. I find your opinions ignorant, and your tone reminds me of Drunk Uncle on SNL talking about the news. It would be funny if it weren't so sad.

If you're not interested in new display technology, I have to ask why you're even posting here, aside from the obvious (trolling).

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 19:00
by flood
imo it's still a buzzword in the same way "HD" was a buzzword a few years ago. personally, it's nothing too exciting. e.g. hd was just adding more pixels, hdr is just adding more max luminance and bitdepth to go along with it. and when i'm indoors, i don't really care to see exceptionally bright highlights or reflections... i'll just go outside for that :D
RLBURNSIDE wrote: PS: tone-mapping is inherently low-latency; it should take no more than 1 ms to process a single frame using a shader, let alone the custom ASICs used in televisions and projectors these days
depends on whether it's the gpu or the display doing it. on the gpu, yea, nothing to worry about. but if it's the display, then because it's not a local operation (in contrast to e.g. gamma adjustments, where each pixel can be adjusted independently as it arrives), you need to load the entire frame into a buffer before applying the operation, and then scan the buffer out onto the display.
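rough illustration of why the one needs a frame buffer and the other doesn't (python just for readability, obviously not how the silicon actually works):

Code:
import numpy as np

# local operation: a per-pixel LUT (gamma etc.) can be applied to each pixel
# as it streams in off the wire; no need to wait for the rest of the frame.
lut = (np.linspace(0.0, 1.0, 256) ** 2.2 * 255).astype(np.uint8)

def process_scanline_local(scanline):
    return lut[scanline]                # usable immediately, line by line

# global operation: a tone-map whose curve depends on whole-frame statistics
# can't start until the full frame is buffered, because the last pixel can
# change the curve applied to the first one.
def process_frame_global(frame):
    peak = int(frame.max())             # needs the complete frame
    curve = (np.arange(256, dtype=np.float32) / max(peak, 1)) ** 0.8
    return (np.clip(curve[frame], 0.0, 1.0) * 255).astype(np.uint8)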

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 19:37
by spacediver
flood wrote:imo it's still a buzzword in the same way "HD" was a buzzword a few years ago. personally, it's nothing too exciting. e.g. hd was just adding more pixels, hdr is just adding more max luminance and bitdepth to go along with it. and when i'm indoors, i don't really care to see exceptionally bright highlights or reflections... i'll just go outside for that :D
I'd reserve judgment until you've actually seen one in person. While I haven't, there are a lot of industry insiders whose opinions I value who claim it is a remarkable revolution in image quality. I've also heard firsthand from a friend in the industry who saw a Dolby tech demo and was utterly blown away (this is a guy who has worked with multiple FW900s and Dolby PRM-4220s in his mastering studio).

Re: Laser projectors general? [zero lag & zero blur!!!]

Posted: 24 Jan 2016, 23:49
by RLBURNSIDE
flood wrote:imo it's still a buzzword in the same way "HD" was a buzzword a few years ago. personally, it's nothing too exciting. e.g. hd was just adding more pixels, hdr is just adding more max luminance and bitdepth to go along with it. and when i'm indoors, i don't really care to see exceptionally bright highlights or reflections... i'll just go outside for that :D
RLBURNSIDE wrote: PS: tone-mapping is inherently low-latency; it should take no more than 1 ms to process a single frame using a shader, let alone the custom ASICs used in televisions and projectors these days
depends on whether it's the gpu or the display doing it. on the gpu, yea, nothing to worry about. but if it's the display, then because it's not a local operation (in contrast to e.g. gamma adjustments, where each pixel can be adjusted independently as it arrives), you need to load the entire frame into a buffer before applying the operation, and then scan the buffer out onto the display.
Totally true, but totally irrelevant. The frame is getting buffered in the display before it ever reaches the pixels, since it has to undergo color conversion and plenty of other tweaks and fullscreen passes first. So you're already eating one frame of lag inside the display no matter what. On a CRT you could display pixels as soon as they arrived down the wire, cutting the lag to near zero (assuming a fixed framerate; VRR is different, because there it's much harder to vary the raster scanning speed mid-frame without introducing artifacts).

The time it takes a custom ASIC to do upscaling or tone mapping is far less than a general-purpose CPU or even a GPU needs. You can bake only the operations you need into the chip and run them extremely efficiently: a screenspace pass that takes 1 ms on a GPU might take a custom ASIC 0.1 ms. Since you're already paying the time to buffer the incoming frame, processing it at that point is just a matter of stacking each operation until it's ready for the pixels. That lag can add up, but it doesn't need to be more than a few milliseconds, tops, even with upscaling. Anything related to motion dejuddering, interpolation, or smoothing will introduce a couple of frames of lag, though. But most people leave those settings off, and for games you basically have to.
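Back-of-envelope version of that stacking, with made-up but plausible per-stage numbers (not measurements of any particular set):

Code:
# Illustrative latency budget; the per-stage numbers are guesses for the sake
# of argument, not measurements of any particular TV.
REFRESH_HZ = 60
FRAME_MS = 1000.0 / REFRESH_HZ           # ~16.7 ms to buffer one incoming frame

stages_ms = {
    "buffer incoming frame": FRAME_MS,   # the unavoidable part once you process whole frames
    "color conversion": 0.2,
    "tone mapping (ASIC)": 0.1,
    "upscaling (ASIC)": 0.5,
}
motion_interpolation_frames = 2          # typical extra buffering when smoothing is on

base = sum(stages_ms.values())
with_smoothing = base + motion_interpolation_frames * FRAME_MS

print(f"smoothing off: ~{base:.1f} ms of display lag")
print(f"motion interpolation on: ~{with_smoothing:.1f} ms")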

And no... HDR is nothing like HD vs DVD. The human eye is FAR more sensitive to changes in luminance than to chroma, which makes HDR an extremely efficient way to increase perceptual resolution, for the same reason a 720p plasma looks sharper than a 1080p LCD: contrast. HDR is not a gimmick; it is THE upgrade you should be looking to get. Now that plasma is dead, go for an HDR OLED, or failing that an HDR FALD TV. SDR is at this point obsolete garbage.