[Overcoming LCD limitations] Big rant about LCD's & 120Hz BS


Post by anothergol » 29 Jul 2017, 00:00

First, hello everyone. I've just registered only to post this.

Where to start... I may be a little biased here, because I come from the CRT era. I saw LCDs come up and thought no one would want this crap whose only advantage was being less bulky, and yet they took over our nice CRTs anyway...

Don't get me wrong, I love LCDs for what they enabled, namely laptops, smartphones & other small devices. That's not something CRTs could have done. And while I hope OLEDs will fully replace LCDs in the not-so-far future, those owe their existence to LCDs anyway.
So I'm not worshipping CRTs; the OLED screen on my smartphone is pretty much perfection. The only thing a CRT still does better is multi-resolution, but with very high pixel densities even that becomes less significant, and even CRTs have a preferred resolution anyway.


So, for years two monitors have been sitting on my desk: a Neovo F-419 LCD, which I like for reading text, & a ViewSonic g90f+, which gives eye orgasms in games.
It's not just the refresh; it's also the slight blur/bleeding, which isn't so nice for text but certainly better for games. That's also why old arcade games look pretty bad today without good CRT emulation: blur & visible scanlines make poor, low-res artwork look better. Not saying this is a "good feature" though, and I'm glad my CRT doesn't have visible scanlines. In fact, the LCD I hoped to replace it with has more visible scanlines.

I kept hearing about 120Hz in games, that people could see the difference, all that crap. And I thought that -maybe- I was really missing something here. Monitors evolved, but surely our eyes didn't? While I can clearly see 60Hz blinking in peripheral vision, from 70Hz up it's pretty much perfection to me. So what would I see above that? I had already tried the 85Hz my monitor supports; smooth scrolling didn't feel any smoother.

So I bought a new "gaming" monitor, first assuming that yes, these days we could really game on an LCD as people were saying, and I went for a 120Hz+ one, to see that difference everyone was claiming to see.
I picked a Samsung CFG73. I wanted a pixel density around 90PPI - I'm not conservative at all about pixel density, but Windows apps are. So for me 90PPI is still better today, 180PPI a good alternative as it allows doubling pixels without interpolation, but anything in between is a bad idea - which ruled out the Dell S2417DG for me.

I then installed it. I like 4/3, but I can understand how every monitor is now 16/9, since pretty much all content is made for that.
Did I see that difference that every gamer claims to see at 120Hz? Hell yes... BUT ON THAT SAME MONITOR!
Was it smooth? Yes. Definitely. Not as perfect as on a CRT, there was still slight ghosting, but it was OK, acceptable. But at 120Hz! It was doing nearly the same as a CRT at 60Hz, but starting at 120Hz! I was at least hoping the monitor would give the same result as a CRT at 60Hz, and that I would then discover the magic of 120Hz that everyone was raving about. BUT NO!
So what was in it for me? Nearly the same as I already had on my CRT, except that to get it, my graphics card has to work twice as hard to produce twice as many images. GREAT.

I just don't get it. How is this a technical evolution? There is no need to hype 120Hz. Our eyes don't need it.
From what I understood, the only reason an LCD is still not able to do as well as a CRT is that it can't be as bright? That is, it would have to show a very bright image for a very short time and let it fade out until the next frame. But since it can't be that bright, a black frame is inserted twice as frequently, so the source has to spit out twice as many images, and they can't just be duplicates, am I right?
I only got this recently. I thought that black frame insertion really allowed an LCD to do like a CRT, at 60Hz.
Well, that just sucks. But if this is right, it means this is also a problem for OLEDs. Well...

In conclusion, to me that 120Hz crap is pure hype, it's only a little step forward after the large step backwards that the LCD was, and for a gamer, a CRT is still much better.

It would be interesting to see 120Hz on a CRT, though. Perhaps that would be different, because after all, more images are produced, so the result should be closer to real motion blur. I haven't noticed this on the CFG at 120Hz, though. To me the result was simply the same as on my CRT at 60Hz.
And something I also haven't understood is why graphics cards aren't able to emulate refresh rate scaling. My NVIDIA card is already capable of rendering at a fake higher resolution and downscaling afterwards. I don't see the problem with the graphics card reporting 4x the refresh rate, blending the 4 images together, and outputting that. Or maybe it has been tried and it looks bad? Maybe it's a naïve thought, but it makes sense that, while we shouldn't need monitor refresh rates above 70Hz, the rendered framerate would benefit from being higher, or, well, infinite. I dislike motion blurring in games, but those are post effects at too low a rate, so that's what they do: they too-visibly blur. But imagine a 700Hz framerate, blending (simple or advanced, but you can't really do miracles out of stills), and output at 70Hz: why wouldn't this give good results?


Side note: I've already returned the CFG73, because it's buggy (the monitor hangs [for real] after power saving; I went for the CFG73 instead of the CFG70 to be on the safe side, and apparently that was a bad idea).
Also, the bad "text clarity" this monitor was reported to have is real. And it's pretty bad, really not acceptable to me. Draw a perfectly antialiased disc and the lower half is all blurry. And this monitor was praised for its image quality?

/end of rant


Re: Big rant about LCD's & 120Hz BS

Post by RealNC » 29 Jul 2017, 05:14

Just because you can't see a difference between a 60Hz CRT and a 120Hz LCD doesn't mean there isn't one. I can see the difference very clearly. CRT@60Hz has less motion blur than a non-blur-reduction LCD@120Hz. CRT wins there. Fluidity of motion, however, is much higher on the 120Hz LCD.

But why on earth would you run your CRT at 60Hz? Even the cheapest CRTs usually support at least 100Hz.


Re: Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 29 Jul 2017, 08:03

Firstly, yes there's a lot of BS about technologies. No disagreement there.
And CRTs, indeed, are better in many ways!

That said...to demystify 120Hz LCD BS....
RealNC wrote:Just because you can't see a difference between a 60Hz CRT and a 120Hz LCD doesn't mean there isn't one. I can see the difference very clearly. CRT@60Hz has less motion blur than a non-blur-reduction LCD@120Hz. CRT wins there. Fluidity of motion, however, is much higher on the 120Hz LCD.
RealNC is correct.

In fact -- full-persistence LCD 240Hz has more motion blur than 60Hz CRT.

Refresh rate is not the full cause of motion blur -- it is also a matter of how long each refresh cycle stays visible.

There are many long-time CRT users who were newbies to 120Hz, tried it, and were disappointed. True! They see motion blur only a tiny bit better than a 60Hz LCD, because they haven't witnessed the wonderful world of strobe backlights yet (or perhaps they selected a poor-quality one, like an older version of low-brightness LightBoost with crappy colors).

A properly-strobed LCD 120Hz/144Hz can manage to roughly match a CRT.

[Image: LCD 60Hz]

[Image: LCD 120Hz]

[Image: Strobed LCD]

You need one of the better brands/trade names of strobe backlight technology (black roughly 85% of the time) to at least match CRT, and this is needed with certain displays -- see 60Hz vs 120Hz vs LightBoost

Brand X of a strobe backlight can be very crappy, while brand Y of a different strobe backlight can be better.

Various brand names of strobe backlights include:
"ULMB" -- NVIDIA (official)
"LightBoost" -- NVIDIA (hack)
"Turbo240" -- Eizo
"MOTION240" -- LG
"DyAc" -- BenQ/Zowie

Some of them have crappy strobe crosstalk, and others have amazing CRT-clarity motion.

[Images: three strobe backlights compared side by side]

In many of them, you need to make sure framerates match refresh rates (120 frames per second at 120 Hz) since the manufacturers tend to avoid support for good-quality strobing at 60Hz. And, in many of them, colors are very poor (not all of them) or brightness is poor (not all of them).

Regular 50%:50% black frame insertion does not make an LCD better than a CRT. You need a bigger black duty cycle than that, and that's done via a strobe backlight, which can produce a larger black period between shorter-persistence frames. More similar to a CRT.


Mathematically, 1ms of frame visibility time translates to 1 pixel of motion blur during 1000 pixels/second motion. So poor-duty-cycle BFI won't do much good: a 50%:50% flicker duty cycle at 120Hz only gives you 1/240sec persistence (4ms, i.e. 4 pixels of motion blur at that speed), which is still more than the <1ms of a good CRT phosphor. However, there ARE strobe backlights that can flash the backlight briefer than that. A good strobe backlight can have one-quarter the motion blur of a non-strobed 240Hz LCD.

Non-strobed 120Hz LCD = only 50% less motion blur than 60Hz
Non-strobed 240Hz LCD = only 50% less motion blur than 120Hz
etc.
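
To make that arithmetic concrete, here is a minimal Python sketch of the rule above (the helper name is mine, not an official Blur Busters formula):

```python
def motion_blur_px(frame_visibility_ms, speed_px_per_sec):
    """Perceived eye-tracking motion blur in pixels.

    Rule of thumb from the post: 1 ms of frame visibility adds ~1 pixel
    of blur at 1000 pixels/second of tracked motion.
    """
    return frame_visibility_ms * speed_px_per_sec / 1000.0

# Non-strobed (sample-and-hold): visibility = full refresh cycle.
print(motion_blur_px(1000 / 60, 1000))   # 60 Hz  -> ~16.7 px
print(motion_blur_px(1000 / 240, 1000))  # 240 Hz -> ~4.2 px
# Strobed / CRT: visibility = flash (or phosphor) length, not refresh length.
print(motion_blur_px(1.0, 1000))         # 1 ms impulse -> ~1 px
```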

Many old-school CRT users need to do a double-take to realize the motion blur math works both ways:

Motion blur is proportional to frame visibility length.
........That's the refresh cycle length on non-strobed displays
........That's the flash length (or BFI duty cycle) of a strobed display.
........That's about the CRT phosphor decay length on a CRT display.


(Some of this is fudged by curves, e.g. the phosphor decay curve: the majority of the light comes from the brightest part of the phosphor cycle, which decays very quickly, creating a very short effective impulse length. Phosphor persistence is usually measured to ~90% fade, which occurs within the first 1ms on most CRT phosphors.)

Vision scientists nowadays generally agree on the relationship of visibility length to motion blurring -- and a camera shutter is a good analogy (a 1/120sec shutter produces more motion blur than 1/1000sec in sports photography). Frame visibility length, combined with eye tracking, has a very similar perceived display motion blur relationship. CRT phosphor fade is very fast, so it's like a fast flash (flash photography or a fast shutter) -- 1ms is 1/1000sec.

Bleah -- a 1/1000sec flash of each 60Hz refresh has less motion blur than 1/240sec flashes (no black periods) of each 240Hz refresh. You're comparing 1ms flash with 4ms flashes.

See what I'm getting at? Refresh rate doesn't tell the whole motion blur story.

Which is exactly why 240Hz LCD (non-strobed) has more motion blur than a 60Hz CRT.

As you track your eyes on moving objects, your eyes are in different positions at the beginning and end of frame visibility; that's what creates motion blur -- see http://www.testufo.com/eyetracking -- 70Hz is only "enough" if you don't mind flicker. Unfortunately, when you remove flicker (impulse driving), motion blur comes back. Even instant response (0ms) will still have lots of motion blur if the refresh rate is only 70Hz, because it's similar to waving around a camera with a 1/70sec shutter.

CRTs have better black levels, can do any resolution (no fixed pixel array), and the softening of the phosphor look is more well-suited for the old types of games, lower resolutions. There is really no need to replace those with a 120Hz LCD as many of them will not benefit from 120Hz, if they are 60fps-locked. Unless, you're using some form of black frame insertion as a hack to force a 120Hz strobe backlight to function at 60Hz (with its attendant side effects). Remember, you have to combine BOTH a strobe backlight AND software-based black frame insertion (simultaneously). The strobe backlight shortens the per-refresh-cycle persistence, and the black frame insertion converts 120Hz strobing into 60Hz strobing. So if you try only one or the other with an emulator, you WILL get a crappy experience. This problem exists (and workaround) because manufacturers do not want to create 60Hz strobe backlights.

Now, when we're talking about MODERN games, then MAKE SURE to run them at a framerate matching refresh rate (120fps at 120Hz) to get the full benefits of the full refresh rate.

TL;DR: The bottom line is that a first experience with strobe backlight technology can be quite a bad one, e.g. buying a random monitor and trying it out. But do not jump to conclusions from just one experience -- especially if you tried an LCD without a strobe backlight. There are indeed LCD displays, with the right brand of strobe backlight, that have literally zero motion blur, like a CRT. That said, CRTs reign supreme for playing old arcade games!


Re: Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 29 Jul 2017, 08:48

+2nd reply:
anothergol wrote:But imagine a 700Hz framerate, blending (simple or advanced, but you can't really do miracles out of stills), and output at 70Hz: why wouldn't this give good results?
For non-flicker displays, that won't work, because whatever the blended result is, it remains static on the screen for a full 1/70sec.

As you track moving objects on a screen, your eyes are in a different position at the end of a 1/70sec period than beginning.

If it's non-flicker, non-strobed, non-impulsed (i.e. not a CRT), then at 1000 pixels/second (half a screen-width per second at 1920x1080), 1/70th of 1000 equals about 14 pixels. That's how far your eyes track in 1/70sec -- creating 14 pixels of motion blurring (like panning a camera with a 1/70sec shutter).
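
A quick sanity check of that figure in Python (illustrative only):

```python
speed_px_per_sec = 1000   # tracked motion speed
hold_time_sec = 1 / 70    # each static frame stays visible for 1/70 sec
print(speed_px_per_sec * hold_time_sec)  # ~14.3 px of smear per frame
```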

You need to flash the frame quicker (like a fast shutter or flash) like a CRT in order to kill the motion blur. The briefer the better. (Again: Remember, simple 50% black frame insertion (50% on, 50% off) only reduces motion blur by 50% -- and is not a miracle by itself). You can avoid the need for flicker/strobing/etc by doing more frames / more refreshes instead -- e.g. 120fps @ 120Hz, 240fps @ 240Hz, etc.

When you look at the TestUFO eye-tracking animation (www.testufo.com/eyetracking) on an OLED, LCD, or other non-flicker display (non-CRT), in a modern GPU-accelerated web browser (60 frames per second), the two UFOs produce very different-looking backgrounds depending on which one your eye is tracking.



This optical illusion helps show that frame visibility length sets the minimum amount of motion blur.
(Even with 0ms instant response -- see Why do some OLEDs have motion blur?).

If you look at this on a CRT, it looks very different than it does on most OLEDs and LCDs. The only time that this optical illusion looks the same as CRT, is when a strobe backlight is used (e.g. NVIDIA ULMB -- Ultra Low Motion Blur).
anothergol wrote:the OLED screen on my smartphone is pretty much perfection
Yes, it is darn near perfection except for motion blur.

Can your phone's OLED pass the TestUFO Panning Map Readability Test during 960 pixels/second panning? You can easily do it on your CRT, but you can't on an OLED unless the OLED is flickered (impulse-driven). This is exactly because of what I've explained: frame visibility length equals motion blur during eye-tracking situations. (The artificial human invention of using static frames to represent moving images...)

Yes, your phone screen is small, so there's not much eye tracking. It will be hard to see the labels, but you can also pinch-zoom a little (this also speeds up the motion since the pixels are scaled bigger, so the test remains valid at 960 pixels per second). Try the same on a CRT and you easily pass this test. The CRT is still better than your phone's OLED in motion blur! But it doesn't matter for phones because the screens are small.

But when OLED screens are big, desktop-sized, eye-tracking motion blur actually becomes a big problem under refresh rate limitations on non-impulse/non-CRT/non-strobed displays. You need superhigh Hz _or_ ultra-brief flicker to kill that motion blur.

Even the Dell 4K OLED needed to be impulse-driven (flicker) to eliminate motion blur. And not everyone liked the flicker!
(From Custom OLED Rolling Scan Thread)

Regardless, CRT is amazing for a lot of things. However, a properly-strobed LCD can also be amazing for modern games, where I can run 120fps @ 120Hz (strobed) with a powerful GPU. For older games, however, it's hard to beat a CRT because of the 'look'. (Tricks like MAME HLSL + strobing + BFI hack (only when all 3 are combined) can help quite a lot, but only as a way of utilizing an existing monitor, _not_ replacing a CRT.)

I have used CRTs at 120Hz too. It actually greatly improves phantom array effect (less stroboscopic effect, less wagon wheel effect), but has no effect on motion blur -- because CRT persistence is unaffected.

The change to strobing is a far more massive jump in clarity. Going from 120Hz LCD to strobed LCD is actually a bigger difference to my eyes than going from 60Hz LCD to 240Hz non-strobed LCD. But do not forget: some strobe backlights have bad colors, poor brightness, or bad strobe crosstalk. They come with cons even though they make the magical zero-motion-blur LCD possible today (for framerate-refreshrate matched motion, 120fps@120Hz).

Wonderful CRTs, such as the Sony FW900, GDM-W900, and Nokia 445 Pro, are darn near impossible to beat in all categories simultaneously (e.g. colors AND brightness AND motion clarity, etc.), and some of them are darn near impossible to beat in many use cases.

Stick to CRT for your use cases. But also acknowledge the need for better understanding of LCD's flaws. Our site's raison d'etre is to help demystify "LCD BS" about motion blur and such, since not everybody (not even all display manufacturers) understands how to remove motion blur from LCDs. Our site exists in large part because of that. "Plain LCD 120Hz" just ain't enough for CRT users (including me).


Re: Big rant about LCD's & 120Hz BS

Post by anothergol » 29 Jul 2017, 23:22

Chief Blur Buster wrote:Firstly, yes there's a lot of BS about technologies. No disagreement there.
A properly-strobed LCD 120Hz/144Hz can manage to roughly match a CRT.

In many of them, you need to make sure framerates match refresh rates (120 frames per second at 120 Hz) since the manufacturers tend to avoid support for good-quality strobing at 60Hz. And, in many of them, colors are very poor (not all of them) or brightness is poor (not all of them).
Remember, it's a Samsung CFG73 I tested; it has strobing, and yes, aside from LCD-specific artifacts (it sucked at the inversion-artifact test), at 120Hz it was close to my CRT.

But are you saying that if it sucked at 60Hz, it's not for technical reasons, just that they didn't care? Mmmh, so there are LCD's out there that will do a CRT-like job at 60Hz then?

Chief Blur Buster wrote: If you look at this on a CRT, it looks very different than it does on most OLEDs and LCDs. The only time that this optical illusion looks the same as CRT, is when a strobe backlight is used (e.g. NVIDIA ULMB -- Ultra Low Motion Blur).
Yep, while on my old Neovo the difference is obvious, on the Samsung it was pretty much the same. But again, that was at 120Hz, which is where the whole problem is for me.

Chief Blur Buster wrote: I have used CRTs at 120Hz too. It actually greatly improves phantom array effect (less stroboscopic effect, less wagon wheel effect), but has no effect on motion blur -- because CRT persistence is unaffected.
Ok, I can probably agree on less flicker. I set mine to 75Hz because above 70Hz I stop seeing flicker (same for neon tubes; like everyone, I assume), but I can imagine it varies. If I put mine at its max (85Hz) I can't see the difference, though.

Less wagon-wheel effect is what I imagined. So, wouldn't what I described work?
-the graphics card fakes a virtual framerate
-each set of X frames is blended together & output at 60Hz


Anyway, what I've been saying in my post was about all 3:

1. Less motion blur. My rant was about this requiring 120Hz, as opposed to my CRT doing it perfectly at 60Hz. Not a detail at all: asking for twice as many rendered frames pretty much means half the detail.
But here you're saying that this is the Samsung's fault, that LCD strobing can work at 60Hz when done properly. If that's true, shame on Samsung for releasing this as a "gaming" monitor.
But then again, wouldn't proper strobing at 60Hz, to give CRT-like results, reduce the brightness quite a lot?

2. Less stroboscopic effect & less wagon-wheel effect.
For the flicker part of this: above 70Hz I stop seeing flicker, and I seriously doubt that some people need 80Hz or more to be comfortable with it, certainly not 140Hz.
The smoother motion is what I was really expecting to see. Twice as many frames were produced, so it made sense. But I wasn't seeing it... or I didn't feel it, I don't know.
Or maybe I did, and it was worse, and I mistook the difference for another LCD defect. You know, when you wave a finger back & forth in front of your eye rapidly, you see pretty much a filled shape. Pretty much what you'd see with a fake infinite framerate output at 70Hz, I'm sure. But the question is: is this really better for gaming? In a game, you would see a steppy motion of one finger, but it would be one clear, sharp finger. I would go for the blurred motion in a movie, but I might go for the sharp, more clearly identifiable object during motion in a game, no?

But for real: I do not believe that one needs 10k frames per second to properly reproduce the filled, solid shape that a rolling wheel/top produces. Sure, produce 10k frames, output them to a 10kHz monitor, and the result should theoretically be perfect. But our eyes aren't gonna improve any time soon, and I truly believe that you can blend 10k/70 frames together, output this at 70Hz, and the result will be identical. Until I've seen it, I won't believe in the benefits of 120Hz, other than letting an LCD do pretty much the same as what a CRT does at 60Hz.


Re: Big rant about LCD's & 120Hz BS

Post by jorimt » 30 Jul 2017, 13:10

One point here @anothergol,

You're at risk of conflating motion clarity with refresh rate.

I think there is no question that a CRT at 60Hz has less motion blur than even the best LCD gaming panel running at 120Hz non-strobed (because the CRT's image doesn't persist between refreshes), but this doesn't change the fact that the CRT in this instance only updates 60 times per second, while the LCD updates 120 times per second.

While the LCD may have more motion blur, these extra updates provide an undeniable perceivable responsiveness advantage in gaming, more motion blur or not; motion blur and refresh rate are not the same thing.

While gaming on my 144Hz IPS in G-SYNC mode, for instance, I can clearly differentiate the drop from 85+ Hz down to the mid-70s, 60, the 50s, 40s, and 30s, and it isn't the motion blur either, as I can feel it just running straight without triggering any up/down or left/right movement; it's the screen updating at a slower rate.

While the jump from 60Hz to 120Hz feels massive to me, it is true the further jump from 120Hz to 144Hz or higher is much more difficult to discern. In fact, I'd say anything after 85Hz in gaming (non-strobed) starts to feel more and more similar in fluidity, with input response being the only thing that is obviously improved at higher Hz.

I can, however, feel the difference between 85Hz and 144Hz on my desktop. I know because my monitor has gotten stuck at 85Hz before on a restart, and I wondered why things felt less responsive, only to find it was at 85Hz.

So there is a discernible difference in responsiveness between 60Hz and higher refresh rates, even on an LCD; motion blur is just one aspect.

Finally, regarding your Samsung CFG73, I believe that is a VA panel type, which is the worst in motion clarity among the three panel types most commonly available in monitors (motion clarity: TN > IPS > VA).

I'd advise you to try a high-end TN (ultimate response) or IPS (best picture) gaming panel next time before making final judgments on the motion clarity of modern gaming monitors. A complete list of gaming monitors can be found here, for future reference.


Re: Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 30 Jul 2017, 13:54

anothergol wrote:But are you saying that if it sucked at 60Hz, it's not for technical reasons, just that they didn't care? Mmmh, so there are LCD's out there that will do a CRT-like job at 60Hz then?
Yes, but very few.

For 60Hz, the biggest problem is manufacturers have artificially disallowed low flicker rates on strobed LCDs.
(flicker discomfort, epilepsy liability concerns, user complaints, etc).

They enforce things like minimum strobe rates of 75Hz, 85Hz or higher.

That's why we rarely see 60 Hz CRT clarity in current strobe-backlight LCDs at the moment.

For example, you will see that monitors with ULMB often only support 85Hz, 100Hz and 120Hz during strobe modes. That works perfectly for computer games running at frame rates matching refresh rates (e.g. 100fps at 100Hz). But no 60Hz strobing means it's bad for 60Hz content (video, console games, emulators, etc).

One of the few displays that does is the older BenQ XL2720Z with certain versions of firmware (Version 2 to 5 at least).

Blur Busters was instrumental in persuading BenQ to release highly customizable strobing (see the BenQ/Zowie press release mentioning Blur Busters). We created this Strobe Utility for free (at our own expense) for BenQ/Zowie monitors, as fans of CRT motion clarity. While overly complicated for an average user, it provides more choices for users to overcome various limitations of LCDs. (And as readers may figure out, obviously, as an Amazon affiliate (and other online stores), we earn back a little via sales of monitors to recoup the costs of developing Strobe Utility.)

For the first time, we finally saw 60Hz CRT motion clarity on LCDs through the Version 2 firmware for the XL2720Z. It flickered a lot, but (when combined with large vertical totals) it was nearly strobe-crosstalk-free in emulator usage with custom timings + Strobe Utility tweaks. Since scan acceleration was done via NVIDIA (large vertical totals), some crosstalk came back when I swapped inputs to a Sony PlayStation or an Xbox One console, since HDMI uses 1080p with a Vertical Total of 1125 -- which leaves only a scant ((1125-1080)/1080)th of 1/60sec (only 0.69 millisecond) between blanking intervals to let LCD GtG pixel response limitations settle in the dark periods between strobes.

Many strobe backlights, such as LightBoost and ULMB, buffer their refreshes and do accelerated scan-outs of individual refresh cycles, allowing more time for LCD GtG pixel response to finish in the dark. But LightBoost and ULMB are artificially limited to higher refresh rates (manufacturers do not want to permit 60Hz flicker).

So we couldn't have our cake and eat it too, since the near-crosstalk-free 60Hz strobing on the XL2720Z was achieved only with large vertical total tweaking (GPU-side scanrate speedup instead of monitor-side scanrate speedup). We were lucky that GPU-side scanrate speedups were possible at all with the BenQ/Zowie gaming monitors. This eliminates the need for a full-frame buffer in the monitor (as a method of scanning out the rows of pixels faster than what's coming over the video cable). The XL2720Z is one of the monitors that initiates LCD pixel transitions in real time, in sync with the graphics output, and custom timing tweaks create large blanking intervals (which results in a faster scan-out of the visible resolution, with longer pauses between refresh cycles). This became popular around Blur Busters as a method of reducing strobe crosstalk on one specific brand of blur reduction (BenQ/Zowie) -- an advanced tweaking feature for sure, but at least it was made possible at all. We discovered by accident that using large vertical totals reduces the amount of strobe crosstalk, by allowing more of the LCD GtG to occur between refresh cycles. An advanced explanation is in the second half of the Strobe Crosstalk FAQ page.

I do honestly wish manufacturers would make 60Hz strobing more widely available. For now, we're stuck with various tricks (software BFI combined with hardware strobing to convert 120Hz strobing to 60Hz strobing) in order to achieve the wanted CRT clarity.

As fans of CRT, we realize strobing doesn't solve all of LCD's problems -- yes, LCD artifacts are still a big problem, and some strobe backlights really do amplify LCD artifacts. But some of them are surprisingly good; having feasted my eyes on dozens of strobe-backlight technologies, some get darn close.

I just wish there was more low-end refresh rate flexibility (<75Hz) for single-flash strobing. Double-flash 60Hz strobing (and also 60fps @ 120Hz strobed) creates double images, much like 30fps @ 60Hz on a CRT.

Some displays, such as Sony Motionflow Impulse (the strobe-only mode) on certain Sony HDTVs, manage single-strobe 60Hz motion clarity with consoles very well -- better at 60Hz than many gaming monitors, because manufacturers don't always permit low-rate strobe modes in their firmwares.

Cheers,
Mark Rejhon
Chief Blur Buster


Re: Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 30 Jul 2017, 14:13

anothergol wrote:But here you're saying that this is the Samsung's fault, that LCD strobing can work at 60Hz when done properly. If that's true, shame on Samsung for releasing this as a "gaming" monitor.
Correct.

But manufacturers artificially limit strobing from being used at 60Hz.

See above explanation.
anothergol wrote:But then again, wouldn't proper strobing at 60Hz, to give CRT-like results, reduce the brightness quite a lot?
The good news is that it's easier to use a longer strobe-flash to compensate.

At 60Hz, you can simply scan out the refresh cycles faster (e.g. scan out a 60Hz refresh cycle in, say, 1/144sec) -- creating several milliseconds of pause between refresh cycles to do two things: (1) wait for LCD GtG pixel transitions to finish in the dark, and then (2) do a longer flash.

(Longer flashes do create a little bit more motion blur, in exchange for more light output).

60Hz is 16.7 milliseconds per refresh cycle, and 144Hz is 6.9 milliseconds per refresh cycle. If you do scanout acceleration (60Hz mode with 1/144sec scanout velocity and a humongous VBI in between), you create (16.7ms - 6.9ms) = 9.8ms of idle time between 60Hz refresh cycles.
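
A minimal sketch of that idle-time arithmetic, assuming the simplified timings above (function name is hypothetical):

```python
def idle_ms_between_refreshes(refresh_hz, scanout_hz):
    """Idle (dark) time per refresh when a low-Hz refresh is scanned out
    at a faster rate, leaving time for GtG to settle and a longer flash."""
    return (1.0 / refresh_hz - 1.0 / scanout_hz) * 1000.0

print(idle_ms_between_refreshes(60, 144))  # ~9.7 ms (the text rounds to 9.8)
```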

Diagram from http://www.blurbusters.com/strobe-crosstalk


In the past, slow GtG on older LCDs ghosted so much that multiple refresh cycles noticeably overlapped, like this:

This is a 15-year-old LCD display -- 3 refresh cycles are overlapping: 06, 07 and 08 blended together. This LCD will not work very well with strobe backlights. And the GtG zone is not "in sync" for all colors (it's all spread out, some color-pairs slower than others) -- look at how the GtG zone for the zero differs from the GtG zone for the background. (Image from the Electronics Hacking: Creating a Strobe Backlight page)

Nowadays, on newer monitors (newer 1ms and 2ms TN especially), a larger percentage of GtG across a larger range of color-pairs is completed quickly, with fainter GtG after-effects that still remain for many milliseconds (hopefully faint enough to not be noticeable by the human eye -- as seen in some of the 'better' strobe backlight monitors with less strobe crosstalk).

The great thing is that it's easier to achieve higher-quality strobing at lower refresh rates (e.g. 60 Hz), but the catch-22 is that manufacturers artificially disallow 60 Hz strobing.

The faster the GtG, the shorter the height of the LCD GtG zone. GtG 1ms is not perfect, since that's the measurement from the GtG 10% point (a faint beginning) through GtG 90% (a faint ending). On newer monitors, enough LCD GtG can be completed in the pause between refresh cycles that CRT-like strobing is made possible, at least on the faster LCDs on the market. (LCD artifacts notwithstanding, of course...)

Also, fixing brightness limitations can be done with a much brighter backlight or edgelight. That costs money, but nothing stops a manufacturer from building a super-powered, heat-sink-cooled (fan-cooled even!) edgelight that strobes a gaming monitor at an HDR-friendly 1000 nits with a 90% persistence reduction (e.g. 0.8ms strobe flashes at 120Hz at 10,000 nits to generate an average 1,000 nits output).

(Interesting note: in high-speed photometry tests, some CRTs ultra-briefly output more than 10,000 nits in a square millimeter at the electron-gun spot. This overexposes even light-insensitive high-speed video cameras -- example video. Phosphor shines insanely brightly at the CRT electron-gun spot -- that's why CRTs aren't dim at average picture level.)

So 60 Hz strobing is technologically easier, but manufacturers don't enable strobing at 60 Hz...
jorimt wrote:While the LCD may have more motion blur, these extra updates provide an undeniable perceivable responsiveness advantage in gaming, more motion blur or not; motion blur and refresh rate are not the same thing.
Indeed, correct, they are not.

However, if you adjust variables in a certain way, you can create a situation where motion blur is inversely proportional to framerate:

Motion Blur From Frame Rate (Refresh Rate)

Assumptions:
(A) The display has a more-accurate-than-average GtG that's also an insignificant percentage of refresh cycle (e.g. 1ms is insignificant compared to 16.7ms refresh cycle). This includes 0ms GtG, instant pixel response.
(B) The display is sample-and-hold, there is no strobing (strobing is disabled)
(C) You run at a framerate exactly matching refresh rate (perfect VSYNC ON)

Which can create a unity between frame rate and motion blur:

60 fps @ 60 Hz = ~16 pixels of perceived display motion blurring at 960 pixels/sec
120 fps @ 120 Hz = ~8 pixels of perceived display motion blurring at 960 pixels/sec
240 fps @ 240 Hz = ~4 pixels of perceived display motion blurring at 960 pixels/sec

Observe the disadvantage: You need framerate matching refresh rate. Harder to do at higher Hz!

Mathematically, for every 1ms of visible frame duration time, add 1 pixel of motion blurring during 1000 pixels/second (caused by eye tracking). TestUFO tries to follow this motionspeed standard, but uses 960 for simplicity because it's evenly divisible by 30, 60, 120 and 240.
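
Here is a short Python sketch reproducing the table above under those same assumptions (helper name is mine):

```python
def blur_px_at_vsync(hz, speed_px_per_sec=960):
    """Blur in pixels when fps == Hz on an ideal sample-and-hold panel."""
    return speed_px_per_sec / hz

for hz in (60, 120, 240):
    print(hz, "fps @", hz, "Hz ->", blur_px_at_vsync(hz), "px at 960 px/s")
# 60 -> 16.0, 120 -> 8.0, 240 -> 4.0, matching the table above
```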

However, once a variable changes from the assumptions:
-- strobing or impulse-driving (like CRT), which removes the sample-and-hold effect
-- stutter/tearing of any kind (game stutter, mouse microstutter, common VSYNC OFF, etc)
-- poor/inconsistent GtG (especially GtG that eats a large percentage of a refresh cycle)
-- artificially-injected game motion blur (which adds to perceived motion blur)
-- inequality between frame rate and refresh rate

...this unity (frame visibility time = motion blur) disappears, with either better or worse motion clarity (depending on which variable changed).

Motion Blur From Strobe Lengths

You can achieve something similar with strobe lengths too.

Assumptions:
(A) One strobe flash per refresh cycle. (Unfortunately, many manufacturers double-strobe at low Hz)
(B) Large majority of human-visible LCD GtG is crammed into the dark period between strobes
(C) Frame rates matching refresh rate exactly.
(D) Squarewave strobing

Will create this situation:

For 60Hz (proper single-strobed only ala XL2720Z V2):
60 fps @ 60 Hz with full illumination = ~16 pixels of perceived display motion blurring at 960 pixels/sec
60 fps @ 60 Hz with 1/120sec strobe = ~8 pixels of perceived display motion blurring at 960 pixels/sec (And 1/2 brightness)
60 fps @ 60 Hz with 1/240sec strobe = ~4 pixels of perceived display motion blurring at 960 pixels/sec (And 1/4 brightness)
60 fps @ 60 Hz with 1/480sec strobe = ~2 pixels of perceived display motion blurring at 960 pixels/sec (And 1/8 brightness)

For 120Hz
120 fps @ 120 Hz with full illumination = ~8 pixels of perceived display motion blurring at 960 pixels/sec
120 fps @ 120 Hz with 1/240sec strobe = ~4 pixels of perceived display motion blurring at 960 pixels/sec (And 1/2 brightness)
120 fps @ 120 Hz with 1/480sec strobe = ~2 pixels of perceived display motion blurring at 960 pixels/sec (And 1/4 brightness)

(Note: Some strobe backlights use voltage boosting to compensate a little -- some LEDs can be pulsed about 2-3x brighter than when illuminated steadily. Marc Repnow (aka Strobemaster) of Display Corner discovered this about LightBoost; see his Display Corner article on the disassembly/teardown of a LightBoost monitor.)
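
A small Python sketch of the blur/brightness tradeoff tabulated above, assuming squarewave strobing (helper name is hypothetical):

```python
def strobed_blur_and_brightness(refresh_hz, strobe_sec, speed_px_per_sec=960):
    """Blur (px) and brightness relative to full illumination at refresh_hz."""
    blur_px = speed_px_per_sec * strobe_sec  # visibility time * tracking speed
    brightness = strobe_sec * refresh_hz     # ON fraction of each refresh cycle
    return blur_px, brightness

for strobe in (1 / 120, 1 / 240, 1 / 480):
    blur, bright = strobed_blur_and_brightness(60, strobe)
    print(f"60 Hz, 1/{round(1 / strobe)}sec strobe: ~{blur:.0f} px, {bright:.3f}x brightness")
# 8 px @ 0.5x, 4 px @ 0.25x, 2 px @ 0.125x -- matching the 60 Hz table above
```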

However, once a variable changes from the assumptions:
- double strobing
- framerate mismatch with refresh rate
- stutters/tearing of any kind
- excessive strobe crosstalk that is human-visible
- artificially-injected game motion blur (which adds to perceived motion blur)

...other side effects appear (strobe crosstalk, double-image effects, "jitteriness", etc.) that affect this equation.

Note: Most strobe backlights in gaming monitors aim at 1ms-2ms strobe flash time as a compromise (time available between refresh cycles versus brightness), while some are adjustable lower (0.25ms) and/or higher (e.g. 3-4ms). At 0.25ms, motion clarity beats a lot of CRTs, but it's also often more than 10x dimmer than the CRT. (We'd probably need an even brighter, heatsinked strobe backlight to compensate fully for that. Costs $$$.)

Bottom Line: Added Perceived Motion Blur is Proportional To Frame Visibility Time

For all the above, it's consistent: Perceived display motion blur (Caused by eye tracking) is directly proportional to frame duration length. This motion blur is additive (on top, above-and-beyond) to existing blurring/ghosting by GtG, existing blurring by game (GPU blur effects), or any other forms of artificial/natural motion blur.

There are only two ways to easily shorten frame visibility:
-- Add more frames (if not strobing)
-- Add more black period between frames (if strobing)

The mathematical relationship is easily seen at TestUFO Eye Tracking Motion Blur and TestUFO Black Frames Insertion. The higher the Hz and the faster the GtG, the more accurate the (above) described mathematical relationship is.

Note that while software BFI is only a poor emulation, it is highly educational anyway. Remember that software BFI can only add black periods at full-refresh-cycle granularity. Even so, 60fps @ 240Hz TestUFO BFI (duty cycle of 75% black, 25% visible) looks darn near identical in motion blur to 240fps @ 240Hz non-BFI. Likewise, 60fps @ 120Hz TestUFO BFI (50%:50% black:visible) looks just like 120fps @ 120Hz. Only strobe backlights can do sub-refresh-cycle blackness, but the TestUFO software BFI is still a good educational example of the mathematical relationships mentioned above.
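
A minimal sketch of the whole-refresh-granularity arithmetic, assuming ideal software BFI (helper name is mine):

```python
def software_bfi_mprt_ms(refresh_hz, visible_refreshes=1):
    """MPRT when software BFI shows each frame for `visible_refreshes`
    refresh cycles and blacks out the rest (whole-refresh granularity)."""
    return 1000.0 / refresh_hz * visible_refreshes

print(software_bfi_mprt_ms(240))  # 60 fps @ 240 Hz, 25% visible -> ~4.2 ms
print(software_bfi_mprt_ms(120))  # 60 fps @ 120 Hz, 50% visible -> ~8.3 ms
# Same visibility times as 240 fps @ 240 Hz and 120 fps @ 120 Hz respectively.
```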

This mathematical perfection is beautifully apparent on TN gaming monitors with very well-calibrated 1ms GtG (that actually does all 256x256 color combos by roughly 90%+ within 1ms) -- as seen on many newer TN eSports gaming monitors I've tested.

Alas, real life can be a bitch, as it often injects variables (e.g. a stuttery game). One has to play butter-smooth games (not many games can produce the "Super Mario Super Smooth Scrolling Effect" at 120fps @ 120Hz), or use a 1000Hz mouse + VSYNC ON + a powerful GPU, to achieve stutter-freeness at 120fps @ 120Hz and witness such motion blur relationships.

I see this beautiful mathematical relationship occurring in some older games, like Half-Life 2 on a 240Hz TN monitor, where strafing in front of a wall during VSYNC ON at 240fps @ 240Hz has about 1/4th the motion blur of 60fps @ 60Hz. That's a lot of GPU power if doing such framerates in newer games, so strobing is an easier way of eliminating motion blur than piling on extra framerate for non-strobed displays. With a strobe backlight, 1/480sec strobing at any single-strobe-supported refresh rate (85Hz, 100Hz, or 120Hz, assuming VSYNC ON frame rates exactly matching refresh rate) has about 1/8th the motion blur of 60fps @ 60Hz. (Obviously, for competitive gaming, VSYNC OFF has less lag, but can add microstutter/tearing. VSYNC ON was used for motion blur testing, to see if it matched predicted motion blur.) The predicted motion blur (via math) versus observed motion blur is quite a reliable match when GtG is a tiny fraction of the refresh cycle and frame rates exactly match refresh rates.

The Moving Goalposts Of Diminishing Returns

During non-strobe modes (assuming 0ms GtG display), doubling refresh rates & frame rates only halves motion blur.

But there's GtG. If you get close to GtG limits (e.g. 5ms VA on 120Hz), you won't even halve motion blur -- it might be only 40% less motion blur than 60 Hz -- because of GtG overhead adding extra blurring (ghosting) above-and-beyond eye-tracking motion blur.

Okay, you solve it by using a TN monitor: 1ms TN with good overdrive that does 90%+ of all 256x256 pairs accurately. Okay, good enough. Yep, 120fps@120Hz definitely looks like half the motion blur of 60fps@60Hz. And 240fps@240Hz has half the motion blur of 120fps@120Hz (a quarter of 60fps@60Hz). The 240Hz refresh time (1/240sec = 4.16ms) is still much larger than 1ms, so the mathematical relationship still holds up. GtG doesn't yet "intrude" into the mathematical relationship.

Now, you play 60fps at 120Hz. Or 60fps at 240Hz. No less blur than 60fps @ 60Hz. Boo. Why? Unfortunately, it's the frame visibility time that matters if you're repeating identical refresh cycles on a sample-and-hold display. Since sample-and-hold displays use continuous illumination, repeating the same image across multiple refresh cycles is tantamount to a lower refresh rate.

So you double frame rate to keep up with refresh rate. Okay, now we're talking! Half motion blur. We can keep upping the ante -- 60Hz -> 120Hz -> 240Hz TN LCD -- and we actually can get one-quarter the motion blur of 60fps @ 60Hz.

But increasing frame rate is hard! The GPU can't keep up. What do we do?

How do we shorten frame visibility times (and thus, display motion blur) without raising refresh rates?

Yep. Black periods. As explained before. Whether it be CRT, or black frame insertion, or strobing, etc.

Alas, strobing isn't perfect. It can also create side effects, which is written about further below.

Now, to achieve 4ms MPRT = 4ms of frame visibility time = 4 pixels of blur at 1000 pixels/sec motion, use any of these routes (see the sketch after this list):
-- 240fps@240Hz with a non-strobed 240Hz monitor with extremely fast pixel response (tiny fraction of refresh cycle)
-- 60Hz strobing with a 75%:25% OFF:ON strobe duty cycle.
-- 120Hz strobing with a 50%:50% OFF:ON strobe duty cycle. (Note: This needs an incredibly fast-scanout, 1/240sec or better)
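
A small sketch computing those equivalences, assuming the ideal conditions listed earlier (helper names are hypothetical):

```python
def hz_for_mprt_without_strobing(mprt_ms):
    """fps == Hz needed to reach a target MPRT on a sample-and-hold panel."""
    return 1000.0 / mprt_ms

def strobe_on_fraction(refresh_hz, mprt_ms):
    """ON fraction of the strobe duty cycle that yields the target MPRT."""
    return mprt_ms * refresh_hz / 1000.0

print(hz_for_mprt_without_strobing(4))  # 250 -> the ~240 Hz route
print(strobe_on_fraction(60, 4))        # 0.24 -> ~25% ON (75% black) at 60 Hz
print(strobe_on_fraction(120, 4))       # 0.48 -> ~50% ON at 120 Hz
```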

Okay, good. But 4 pixels of motion blurring is still too much to pass the TestUFO Panning Map Readability test.

Now we want even briefer frame visibility time: 1ms MPRT. Damn, we're losing brightness with small strobe lengths. Or we need nearly-unobtainium panels (1000fps @ 1000Hz), which are barely available even on expensive laboratory displays. (Again, many VR researchers agree now -- Michael Abrash, the Chief Scientist of Oculus, agrees with this.)

But is 1ms MPRT enough? Imagine we have 8K surround vision: Virtual Reality where a slow head-turn can create 8000 pixels/sec panning. At 60Hz, 8000 pixels/sec creates (8000/60) = 133.33 pixels of motion blur. But even at 1000Hz, that's still (8000/1000) = 8 pixels of motion blur! Ouch. In this specific use case, we'll need to break 1000Hz if we want retina-clarity, CRT-clarity during slow head-turn situations -- without using strobing/impulsing -- while avoiding motion blur / phantom array effects (which would make it better than CRT, since CRTs still have a stroboscopic/stepping-effect problem caused by finite refresh rates, too).
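
The same head-turn arithmetic in a short sketch (illustrative only):

```python
def head_turn_blur_px(pan_speed_px_per_sec, refresh_hz):
    """Sample-and-hold motion blur during VR head-turn panning."""
    return pan_speed_px_per_sec / refresh_hz

print(head_turn_blur_px(8000, 60))    # ~133 px of blur
print(head_turn_blur_px(8000, 1000))  # 8 px -- still visible blur
```
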

While 8K is almost retina quality for 90-degree FOV VR, we need 16K for wide FOVs approaching 180 degrees. Ouchie. Even problems like aliasing (pixel jaggies) -- see www.testufo.com/aliasing-visibility -- may force Holodeck-type displays to be 16K when stretched to surround vision.

All those pixels are blown up really, really big when you make it a surround display.


Or higher-resolution rendering at the eye gaze point, of course -- a different engineering problem (with problematic input-lag considerations too!) -- which was only solved when NVIDIA went to 16,000Hz in their laboratory for their prototype AR display. Other researchers are also working on eye-tracking displays that render higher resolution at the gaze point -- like Varjo, which you may have seen in the news.

Now, how high a Hz are we talking about to solve many problems at once -- problems caused by the artificial human invention of using static images to represent moving images? Researchers are already hitting the high quadruple-digit or low quintuple-digit leagues, and experimental laboratory displays already exist for this (see subsequent reply).

For now, we have to stick to "good enough" technology with finite refresh rates & limitations.


Re: Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 30 Jul 2017, 15:16

anothergol wrote:Less wagon-wheel effect is what I imagined. So, wouldn't what I described work?
-the graphics card fakes a virtual framerate
-each set of X frames is blended together & output at 60Hz
It works.
Some video processors already do this.
(Remember: I worked on video processing devices in home theatre.)

And, yes, it eliminates the wagon wheel effect.

But you have traded motion blur for eliminating wagon wheel effect.

Blending together creates pre-motion-blurred refresh cycles. If you take a high-speed video of this blending technique, you notice that each blended refresh cycle is already blurrier than the original frames themselves. Human eyes cannot "undo" this blending.
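
As an illustration, here is a minimal NumPy sketch of such frame blending; the averaging is a temporal box filter, which is exactly the pre-blur described above (function name is hypothetical):

```python
import numpy as np

def blend_frames(frames: np.ndarray, factor: int) -> np.ndarray:
    """Average each group of `factor` high-rate frames into one output frame.

    frames: stack shaped (N, H, W) rendered at e.g. 700 Hz; factor=10
    yields 70 Hz output. The mean acts as a temporal box filter, so every
    output frame is already motion-blurred before it reaches the screen.
    """
    n = frames.shape[0] // factor * factor      # drop any incomplete group
    return frames[:n].reshape(-1, factor, *frames.shape[1:]).mean(axis=1)
```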

You have a pick-your-poison effect: Motion blur or wagonwheel/stroboscopic effects.

Strobing fixes motion blur but makes wagonwheel/stroboscopic effects worse
Blending fixes wagonwheel/stroboscopic effects but makes motion blur worse
Adding artificial GPU-accelerated motion blurring (games) fixes wagonwheel/stroboscopic effects but makes motion blur worse

The more the display envelops your vision (tiny screen -> monitor -> IMAX/VR surround), the worse this becomes, because your eyes track faster for longer periods.

At a certain point (surround displays), the only way to reduce both simultaneously is to raise the framerate (with enough refresh rate to display each unique frame completely).

Even Michael Abrash of Oculus agrees with me, he talks of 1000fps @ 1000Hz too, see Down The VR Rabbit Hole: Fixing judder
anothergol wrote:But for real: I do not believe that one needs 10k frames per second to properly reproduce the filled, solid shape that a rolling wheel/top produces. Sure, produce 10k frames, output them to a 10kHz monitor, and the result should theoretically be perfect. But our eyes aren't gonna improve any time soon, and I truly believe that you can blend 10k/70 frames together, output this at 70Hz, and the result will be identical. Until I've seen it, I won't believe in the benefits of 120Hz, other than letting an LCD do pretty much the same as what a CRT does at 60Hz.
It doesn't matter for practical use cases, like smartphones or even desktop gaming monitors.

But it becomes important if we're trying to trick somebody into believing they're in a Holodeck (360-degree surround + retina graphics + perfect holography). In this situation, we will need framerateless displays or unobtainium framerates to simultaneously solve both motion blur and wagonwheel/stroboscopic effects.

If you don't believe the "1000Hz malarkey", it behooves you to read:
1. Michael Abrash's Down The VR Rabbit Hole: Fixing judder
2. This educational thread: So What Refresh Rate Do I Need?
3. My reply on Quora. I throw this in, as this is a Coles Notes 101 version of advanced scientific papers.
Once you've read #1, #2, #3, come back here.

(There are many more, but these are probably the most simplified "Coles Notes 101" versions of them)

Many VR scientists generally agree with me, and I've even had a small side contract with one of them that convinced them to use impulse-driving techniques to fix display motion blur from head-turning...

Anyway, bottom line, to have the same amount of motion blur as a 2ms strobe backlight, without black periods, you need to fill all 2ms slots with unique refreshes, which means (1000ms/2ms) = 500 frames per second at 500 Hertz. So if you want strobing/LightBoost without the flickering/strobing/black periods (and greatly reduced stroboscopic effect, too), that ends up requiring a 500Hz non-strobed display.
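
The same equivalence as a one-liner sketch (helper name is mine):

```python
def equivalent_sample_and_hold_hz(strobe_flash_ms):
    """Non-strobed refresh rate whose frame visibility matches a strobe flash."""
    return 1000.0 / strobe_flash_ms

print(equivalent_sample_and_hold_hz(2.0))  # 500 -> 500 fps @ 500 Hz
print(equivalent_sample_and_hold_hz(1.0))  # 1000 -> CRT-phosphor league
```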

CRTs with 1ms (fade to ~90% dim) phosphor would require 1000fps @ 1000Hz to run CRT-clarity in fadeless/decayless/strobeless operation.

Many display manufacturers (who just outsource panels from China and write firmware) actually do not understand this fully either, but when people like us try to simulate a Holodeck, we quickly realize the limitations, and sometimes new scientific papers are written about this.

Practically, yes, it is unnecessary to go to unobtainium refresh rates/frame rates -- it's only necessary if you're trying to trick a human into thinking they're in real life when you've secretly implanted them into a Holodeck. (Or, say, a similar Turing-test-type situation: "Wow, I didn't know I was wearing a VR headset instead of transparent ski goggles!") For that sort of stuff, nearly-unobtainium framerates & refresh rates are needed.

Basically, if you move your mouse cursor very fast in a circle, the mouse cursor is not a continuous blur but "steps" stroboscopically. Exactly the same problem. At 120Hz, there's half the distance between mouse cursor steps at the same mouse movement speed. At 240Hz, it's 1/4th as much as at 60Hz. To make the mouse cursor trail a continuous blur, you need to (A) add artificial motion blur (which interferes with CRT clarity in panning motions), or (B) add more frames/refresh cycles. Simulating the real world (aka tricking you that a Holodeck is real life) requires (B) to a huge magnitude.

Take a long-exposure photograph of your screen at different refresh rates (CRT or LCD, doesn't matter). Make sure your mouse Hz is higher than your display Hz. The higher the refresh rate, the shorter the gaps between mouse cursor images. This is the stroboscopic effect problem.
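
A short sketch of the step-gap arithmetic (illustrative only, hypothetical helper):

```python
def cursor_step_gap_px(cursor_speed_px_per_sec, refresh_hz):
    """Spacing between discrete cursor positions (stroboscopic stepping)."""
    return cursor_speed_px_per_sec / refresh_hz

for hz in (60, 120, 240):
    print(hz, "Hz ->", cursor_step_gap_px(1920, hz), "px between cursor images")
# Gaps halve at 120 Hz and quarter at 240 Hz, as described above.
```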

Again, from this (currently mostly) thought exercise: almost nobody really truly needs 1000fps at 1000Hz for a desktop gaming monitor (or 10KHz for a surround retina Holodeck). Remember, the smaller the pixels and the bigger the surround vision, the higher the framerate/Hz needed to successfully pull off the "trick someone that the Holodeck is real life" effect for five-sigma of the world's humans. If your vision is sharper than the average human's, it's a problem.

It's only needed if you're trying to simulate the framerateless real world (zero added motion blur AND zero added stroboscopic/stepping/wagonwheel effects), leaving limitations to real human brains and real human eyes, without the display injecting any artificialness (caused by the human invention of using a series of static images to represent analog real-life motion). Again, it pretty much doesn't matter except as a thought exercise in understanding what it will technologically take.

That said, 240fps @ 240Hz (which I actually have sitting on my desk) definitely reduces stroboscopic/wagonwheel effects quite a lot. Practically (combined with a powerful GPU), it's the currently clearest motion we can get from a steady-illumination gaming display -- without using black periods / flicker / impulsing / phosphor.

Again there is a difference between the two questions:
"What needs to happen to make LCD as good as CRT without needing strobing?"
versus
"What needs to happen with a room-scale display to trick a person to thinking he's in a Holodeck?"
Those two answers will have different minimum required refresh rates, but both numbers are rather high (e.g. "Several hundred frames per second" versus "several thousands frames per second") in order to mimic analog motion to both simultaneously eliminate motion blur and stroboscopic effects.

And these are also (different) legitimate questions:
"Do we need to perfectly eliminate motion blur?"
AND
"Do we need to perfectly eliminate stroboscopic effects?"
AND
"Do we need to perfectly eliminate stroboscopic effects AND motion blur simultaneously?"

Sure, the answer may very well be "No" to at least one or more of the above questions. There IS such a thing as "good enough". :D

But it becomes a massive mountain of a problem when we're trying to perfectly simulate a Holodeck that is capable of tricking five-sigma of population -- then the answer is "Yes" to needing to eliminate both stroboscopic effects AND motion blur simultaneously; we'll need to make sure displays are not injecting limitations that makes it different from analog framerateless motion.

Yes, the law of diminishing returns definitely applies here.

From the perspective of a CRT lover: we want our great colors, inky black levels, and no LCD artifacts (inversion, strobe crosstalk, GtG ghosting/coronas), and we don't really need to care about flicker or even a little bit of stroboscopic artifacting. So we can tolerate the bit of strobing/flicker that comes with CRTs as their natural way of keeping motion blur away.

Regardless, for now, in real life, we are stuck with strobing as the more practical, realistic means of eliminating motion blur -- just like CRT flicker. Right now, the benchmark for CRT-league zero motion blur on a screen is roughly 1ms MPRT (effective impulse length), whether via CRT flicker or a strobe-backlight flash. Doing this successfully while remaining bright, artifact-free and flicker-free is extremely difficult.
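
As a rough worked example of why ~1ms is the benchmark: perceived blur during eye tracking is approximately MPRT multiplied by panning speed. A minimal sketch, with assumed example values:

```python
# Minimal sketch (assumed example values): perceived motion blur during
# smooth eye tracking is approximately persistence (MPRT, the frame
# visibility time) multiplied by the on-screen panning speed.

def blur_px(mprt_ms: float, speed_px_per_s: float) -> float:
    """Approximate blur trail length in pixels."""
    return (mprt_ms / 1000.0) * speed_px_per_s

speed = 960.0  # px/s panning speed (assumed)
for label, mprt_ms in [("60 Hz sample-and-hold", 16.7),
                       ("240 Hz sample-and-hold", 4.2),
                       ("1 ms CRT/strobe impulse", 1.0)]:
    print(f"{label:>24}: ~{blur_px(mprt_ms, speed):4.1f} px of blur")
# 60 Hz hold: ~16 px smear; 240 Hz hold: ~4 px; 1 ms impulse: ~1 px
```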

Remember: before replying to this particular post, please carefully read Michael Abrash's article first.

Michael Abrash is Chief Scientist of Oculus, and he said (direct quote), "the sweet spot for 1080p at 90 degrees FOV is probably somewhere between 300 and 1000 Hz". Many VR scientists now agree this is the minimum floor (and that's only 1080p and only 90 degrees -- not retina). Since then, many smart people have found that even higher numbers are necessary (entering low quintuple digits) for certain outlier use cases.

And... I've had a small contract with one of the VR manufacturers, too.

Anyway, again, read his writing first, before replying -- Okay? :)

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

[Overcoming LCD limitations] Big rant about LCD's & 120Hz BS

Post by Chief Blur Buster » 30 Jul 2017, 16:56

Also, more on the stroboscopic stepping effect -- by which I refer to the whole family of ALL "stepping effects" of every type, whether stroboscopic effects, wagonwheel effects, etc. This is a longtime side effect of the artificial humankind invention of using a series of static images to represent continuous analog motion.

Don't forget 1000 Hz displays already exist in the laboratory, and I've seen some of them (e.g. Viewpixx 1440 Hz laboratory DLP, Microsoft Research 1000Hz touchscreen, etc).

A common example of a stepping artifact is the mouse cursor. I have created a new image to illustrate this. It is visible to human eyes on all finite-refresh-rate displays (CRT, LCD, or OLED) at almost any currently-obtainable refresh rate.

[Image: long-exposure photos of a moving mouse cursor at multiple refresh rates, showing shorter gaps between cursor images at higher Hz]

I have tried my best to keep the photos (imperfectly) aligned & the mouse cursor moving at the same speed at all refresh rates (within ~10% or so), but the point is clearly illustrated: you get more cursor position samples at higher refresh rates.

As you already pointed out, blending will work to fix this -- but with a caveat: the side effect of motion blur. Yes, you can blend the mouse cursor into artificial motion blur (a la a GPU motion blur effect) to fix the stroboscopic effect. But that's blur -- blur added above and beyond what the human brain naturally produces.

A mouse cursor can easily cross the screen in just 1 second. On a 1920x1080 display, that's 1920 pixels per second of mouse movement. To eliminate both added motion blur AND the stroboscopic effect (CRT clarity without impulsing/strobing/flicker), you would literally need a razor-sharp, unique frame at every single pixel position -- all 1920 of them.

A sharp frame at every single pixel position would be needed to make the mouse cursor simultaneously meet both criteria:
(A) During fixed gaze, looking like a continuous blur without stepping/stroboscopic effect. 100% natural blur, no blur added by display, no blur added by GPU.
(B) During eye-tracking, looking at a razor-sharp mouse cursor (CRT clarity during panning).
Only then have you simultaneously solved the stroboscopic effect AND motion blur (via overkill frame rate & refresh rate) -- see the sketch after this list.
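
A minimal sketch of that arithmetic, using assumed example speeds: at a finite refresh rate, the speed ÷ Hz pixels of travel per refresh show up either as added blur (sample-and-hold) or as stroboscopic step gaps (impulsed), so both only shrink to <=1 px once the frame rate roughly matches the motion speed in pixels per second:

```python
# Minimal sketch (assumed example speeds, not measurements): with a
# unique sharp frame at every pixel position, both added blur and
# stroboscopic stepping stay at <= 1 pixel. That means the needed
# frame/refresh rate roughly equals the motion speed in px/s.

def required_hz(speed_px_per_s: float) -> float:
    """Frame rate for <= 1 px of blur AND <= 1 px of step gaps."""
    return speed_px_per_s  # one unique frame per pixel of travel

for speed in (1920, 4000, 8000):  # cursor sweep, fast pan, VR head turn
    print(f"{speed:>4} px/s motion -> ~{required_hz(speed):.0f} fps @ Hz")
```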

The moment you add blending (as you suggest), item (B) automatically fails. We've tried, scientists have tried, display engineers have tried, VR scientists have tried. It just can't be done well enough to preserve a razor-sharp image in any instantaneous sample -- which is necessary for zero-blur, zero-stroboscopic panning imagery.

Now, you use CRTs, so look at the TestUFO Panning Map Test moving at 1920 pixels per second. You can still read the street name labels at that speed, thanks to the CRT's simultaneous fast response & short impulses (flicker) -- both of which are necessary to eliminate motion blur.

Now if you stare at a fixed point (screen centre) while the TestUFO Panning Map Test scrolls past, you'll see the usual stroboscopic stepping effect. Same problem!

The same thing happens in FPS games with GPU blur effects disabled: stare at the crosshairs while you flick left/right in high-contrast scenery, and you see the stroboscopic effect too.

Now enable the blending (GPU blur effects), and the stroboscopic effect disappears.

But during eye tracking (tracking objects moving across the screen), you see motion blur from the GPU blur effect even on your CRT. Ouch.

Again, this is a pick-your-poison effect.

Now, if you apply that blending effect (the one you described) to panning game scenery that your eyes track -- many games already do this (GPU motion blur effect) -- then you no longer have CRT motion clarity during panning scenery. That's the "pick your poison" effect I explained in my earlier post above.

Trying to simultaneously eliminate motion blur AND stroboscopic effects has long been very problematic, and only now are we realizing how big a mountain this really is (aka unobtainium refresh rates, as acknowledged by many, including Michael Abrash of Oculus, formerly of id Software):
Michael Abrash wrote:..."the sweet spot for 1080p at 90 degrees FOV is probably somewhere between 300 and 1000 Hz"
And more references, too. Jaw-dropping as it may be, it is (more or less) true if the goal is to eliminate the stroboscopic effect AND motion blur simultaneously, as he described.

Now go beyond 1080p -- to retina resolution and 180-degree FOV -- and the number bumps up significantly higher, given the nasty defocusing effects (of display motion blur) during head-turning. Unfortunately, humankind has a lot further to go before we successfully reach Holodeck-quality displays (surround + retina + perfect 3D) -- those require very unobtainium refresh rates.

With the increasing amount of research now available, it has become increasingly naïve to parrot the outdated "Humans can't tell 'X' fps apart from 'Y' fps" line, when the answer is often more complicated than a Yes or No.

...Especially when the display variables are set to insane levels (e.g. future retina-density 180-degree VR displays -- or a Holodeck).

The long answer: it depends on all the variables, across the full continuum of display sizes/resolutions, from "smaller-than-postage-stamp" displays all the way to a Holodeck (complete surround, complete retina). The tech plays a big role -- LCD GtG limitations, refresh rates, frame rates, impulse (flash) lengths, whether lots of fast, long-distance eye tracking occurs (more applicable to bigger/surround displays), etc. And what are we looking for -- the motion blur problem, the stroboscopic problem, or all of the above?

Sure, this isn't exactly important for gaming (when we're down to 2ms strobes or CRT phosphor, we're often perfectly happy) -- we naturally ignore the stepping effects, and there are obviously points of diminishing returns. But it can be important when we're trying to simulate a Holodeck, where many VR scientists are focused on making a display look perfectly like real life. That means eliminating this problem *and* motion blur simultaneously.

Fortunately, for a display that goes into your pocket or sits on your desktop, a lot of current tech (e.g. OLED) is good enough. That said, scale OLED up to HDTV sizes covering 30 degrees FOV -- a big HDTV close to your sofa instead of a small handheld at full arm's length (usually <10 degrees FOV) -- and you see tons of motion blur problems at http://www.testufo.com/photo. The TestUFO Panning Map Readability Test fails even at only 480 pixels per second on nearly all OLED HDTVs at the moment. And it gets even worse for a surround field of vision (e.g. VR headsets like the Rift). So, out of necessity, all the good VR headsets use strobing -- even the Oculus Rift.

Remember, as you turn your head in a VR headset, the graphics are forced to pan across the VR screen. Even a slow left-right head turn can mean a VR screen panning speed of several thousand pixels per second. In this situation, VR headset makers want to target CRT motion clarity, in case you're tracking something with your eyes while moving your head. Without strobing, it feels like a nasty forced eye-defocusing effect during head turns: scenery suddenly smearing into the motion blur of panning images on a non-strobed OLED/LCD. The sketch below estimates those panning speeds.
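
A minimal sketch of that head-turn arithmetic (the pixels-per-degree figure is an assumed Rift-class approximation, not a spec):

```python
# Minimal sketch (assumed values): a head turn in VR pans the scenery
# across the display at (angular speed) x (pixels per degree).

def panning_px_per_s(turn_deg_per_s: float, px_per_deg: float) -> float:
    """On-screen panning speed caused by a head turn."""
    return turn_deg_per_s * px_per_deg

px_per_deg = 1080.0 / 90.0  # ~12 px/deg, assumed Rift-class density
for turn in (60, 120, 300):  # leisurely to quick head turns, in deg/s
    print(f"{turn:>3} deg/s head turn -> "
          f"~{panning_px_per_s(turn, px_per_deg):.0f} px/s panning")
# A retina-density headset (assume ~60 px/deg) multiplies these ~5x,
# into the thousands of px/s described above.
```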

The current sweet spot for a desktop monitor at common viewing distances and common (usually non-retina) pixel densities is about ~1ms MPRT, assuming sufficient brightness (2ms strobe lengths are often preferred for that reason). 1ms-2ms MPRT (frame visibility time, not GtG transition time) is currently achievable with fast LCDs (strobe backlight flash), CRT impulsing (phosphor), or OLED impulsing (rolling scan) -- but not yet with steady-state illumination / sample-and-hold.

Within our lifetimes, gaming monitors may someday reach ~500fps @ 500Hz (2ms MPRT = 2ms frame visibility) or perhaps ~1000fps @ 1000Hz (1ms MPRT = 1ms frame visibility), achieving the sweet spot without impulsing/flicker/strobing at all. Diminishing returns? Yes. But still (confirmed) visible to the human eye -- laboratory 500Hz and 1000Hz displays already exist (e.g. Viewpixx 1440 Hz laboratory DLP, Microsoft Research 1000Hz touchscreen, etc). I have already feasted my eyes on some of these technologies, and I tell you, 240Hz ain't the final frontier. ;)

Oh, and let's not forget NVIDIA's 16,000 Hz augmented-reality display prototype. While it is a monochrome AR prototype built primarily to eliminate lag, it also successfully eliminated motion blur and stroboscopic stepping simultaneously -- further confirming what I have written in this thread. NVIDIA understands this too (and vice versa).

Yes, LCDs are crap at many things. But "Humans can't tell X (fps|Hz) from Y (fps|Hz)" is flat-earth thinking in the world of VR/AR. The answer isn't an easy "Yes" or "No".

Unfortunately, a lot of the smartest people are finding out we truly need quadruple/quintuple-digit refresh rates & frame rates to match real life (analog framerateless motion) -- for certain extreme use cases, such as the long journey toward the Star Trek Holodeck (retina + 360° + holographic, where absolutely nothing looks "off").

Yes, we need OLEDs to get bigger and better.

If one is perfectly fine with strobing/flicker/impulsing/CRT, and doesn't mind a little bit of stroboscopic effect, then one can simply be happy with impulse-driving -- a la CRT phosphor, the rolling scan of the Oculus Rift / HTC Vive, or the strobe backlights of a modern gaming monitor -- assuming you're running a game capable of a framerate matching the monitor's supported strobe frequencies (which may not include 60Hz).

TL;DR: The pick-your-poison effect created by finite refresh rates -- either living with the stroboscopic effect _OR_ putting up with added motion blur -- exists because of the humankind invention of using a series of static images to represent analog, continuous motion. Whether this "matters or not" depends on whether it is necessary to eliminate both stroboscopic effects AND motion blur simultaneously. Often, it isn't... unless we're trying to create the perfect surround retina 3D Holodeck display...

EDIT: Due to the advanced display technology replies I've written, I've moved this thread to the "Area 51: Display Science, Research & Engineering" forum area, due to the advanced topic matter being discussed.