rasmas wrote: I had no idea strobing could work fine at 60Hz. I had read that at 60Hz, although possible, it would look much worse than on any CRT, and that only at 100+ FPS would it be comparably similar to CRTs.
The chief difference is that 60Hz strobe looks more flickery than 60Hz CRT because:
- It's a global strobe (the whole screen is flashed at once, rather than a CRT illuminating part of the screen at a time in a scanning fashion)
- It's a squarewave strobe (no gentle phosphor decay curve)
That's why strobing looks better at a higher Hz. However, the "zero motion blur" effect is just as good, and the "stutter" appearance is the same.
It simply flickers a lot more. 60Hz strobe = more flickery than 60Hz CRT. That's it.
You must compare apples to apples, though: 100Hz stutter behaves differently from 60Hz stutter, regardless of whether you're on a CRT or a strobed LCD. Many people compare a 60Hz CRT against 120Hz LightBoost, but the stutter mechanics are totally different, much like comparing a 60Hz CRT against a 120Hz CRT. 60fps@60Hz looks smoother than 80fps@120Hz on a CRT, and the same happens when comparing two strobed LCDs. Locking the frame rate to the refresh rate often looks better on a CRT.
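To illustrate the 80fps@120Hz case, here's a minimal sketch (the helper is my own hypothetical one, assuming vsync and that frame i is ready exactly at time i/fps) showing how the on-screen durations alternate instead of staying uniform:
[code]
import math

def display_durations_ms(fps: int, refresh_hz: int, num_frames: int = 8) -> list[float]:
    """How long each frame stays on screen with vsync (frame i assumed ready at i/fps)."""
    refresh_ms = 1000.0 / refresh_hz
    # Refresh index on which each frame first appears: the next refresh at/after its ready time.
    appear = [math.ceil(i * refresh_hz / fps) for i in range(num_frames + 1)]
    return [(appear[i + 1] - appear[i]) * refresh_ms for i in range(num_frames)]

print(display_durations_ms(60, 60))   # uniform ~16.7 ms per frame -> smooth
print(display_durations_ms(80, 120))  # alternating ~16.7 ms and ~8.3 ms -> visible judder
[/code]
Because 120/80 = 1.5, the frames can't all occupy the same number of refreshes; they alternate between one refresh and two, and that alternation is the judder you see.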
rasmas wrote: And also (not sure about this), with big changes of framerate (100→30), LCD strobing works worse than CRTs (double-strobing?).
No, the stutter mechanics look quite similar on CRTs and strobed LCDs for the same framerate fluctuation on the same screen.
The same framerate fluctuations show the same amplified-stutter effect caused by the lack of motion blur.
It's simply that LCD motion blur hides stutter. That's a double-edged sword, a pro and a con. If you hate stutter but don't mind motion blur, then a non-strobed LCD is much better for the same framerate fluctuation. However, if you hate motion blur, then that blur is a big problem.
rasmas wrote: Anyway, so there is no newer technology that will have less blur (at least less than LCDs)? Although I "know" about OLED blur, I thought OLED would make it much better than LCDs; and with ULED XD, MicroLED, etc., I thought that maybe these would have less blur without the OLED problems. Maybe better with Black Frame Insertion, but a BFI implemented better than on LCDs.
BFI can be better on an OLED if it's a rolling-window scan just like on a CRT.
rasmas wrote: So, in short, there is no point in waiting for newer technologies, as none will get the same "blurless" image that CRTs or plasmas can get, right?
On some of the better displays, strobing has improved to the point where you'd need to wait another 5+ years for any big jump. The first strobed OLED will probably cost a lot (I would not be surprised by >$1000-$2000 for the first strobed OLED gaming monitor; it'd probably be priced similarly to the 4K 144Hz locally dimmed displays), so you could easily be waiting many years.
If you're currently on a 60Hz display, whatever you get will be a big upgrade.
A good 240Hz display already reduces motion blur by 75% without needing strobing, as 240fps@240Hz can have one-quarter the motion blur of the fastest possible 0ms 60Hz non-strobed display (whether that be OLED, LCD, or whatever). So if you want flickerfree blur reduction without interpolation, then your best route is 240Hz. And you might as well get GSYNC/FreeSync while you're at it, too.
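As a rough sketch of that sample-and-hold math (assuming an idealized instant-response, non-strobed panel where each frame stays lit for the full refresh period and the framerate matches the refresh rate; the helper name is mine):
[code]
def persistence_ms(refresh_hz: float) -> float:
    """Per-frame visibility time on an ideal non-strobed sample-and-hold display."""
    return 1000.0 / refresh_hz

blur_60 = persistence_ms(60)    # ~16.7 ms lit per frame
blur_240 = persistence_ms(240)  # ~4.2 ms lit per frame
print(f"Blur reduction at 240Hz vs 60Hz: {(1 - blur_240 / blur_60):.0%}")  # -> 75%
[/code]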
rasmas wrote: And with BFI, will all displays behave equally, or will any be better?
It's all in the curve, the photodiode oscilloscope curve, whether it's a squarewave or something else. Comparing squarewave versus squarewave, BFI looks identical on all displays regardless of display technology. But there are millions of ways to do BFI. You can do fuzzy-edge rolling-scan BFI. You can do global BFI. You can do fade-BFI. You can do squarewave BFI. The shape of the BFI curve on a light-flicker graph is the key.
But for identical curve versus curve, the blur looks the same no matter what the display tech is (LCD, OLED, whatever).
The most mathematically simple one is the squarewave BFI.
For squarewave (on/off BFI), the motion blur mathematics is quite pure and simple:
50%:50% ON:OFF BFI = reduces motion blur by 50%
25%:75% ON:OFF BFI = reduces motion blur by 75%
10%:90% ON:OFF BFI = reduces motion blur by 90%
Etc.
This is identical on all displays, regardless of tech.
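A minimal sketch of that squarewave duty-cycle math (idealized instant on/off assumed; the function name is my own):
[code]
def squarewave_bfi_blur_reduction(on_fraction: float) -> float:
    """Blur reduction equals the OFF fraction of an ideal squarewave BFI duty cycle."""
    return 1.0 - on_fraction

for on in (0.50, 0.25, 0.10):
    print(f"{on:.0%} ON / {1 - on:.0%} OFF -> {squarewave_bfi_blur_reduction(on):.0%} blur reduction")
[/code]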
Now, not all displays handle BFI in the same way -- for example, phosphor doesn't turn off instantly. It fades. The fade effect softens things a bit but can also make things less harsh on the eyes (less flicker appearance), so it's a pro/con. The math is a lot more complicated with curvy BFI like that. But needless to say, how much BFI reduces motion blur is all in the curve. If it resembles a squarewave (like strobing or rolling-scan OLED pulsing), then the motion blur math tends to be relatively simple.
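To show why the curvy case is messier, here's one hypothetical way (my own illustrative metric and helper, not an official MPRT standard, with an assumed 2 ms decay constant) to put a persistence number on a phosphor-like pulse versus a squarewave, by finding the time from light onset by which 90% of a refresh's light has been emitted:
[code]
import math

def persistence_90pct_ms(curve, period_ms: float, steps: int = 10000) -> float:
    """Time from light onset until 90% of the refresh's total light has been emitted."""
    dt = period_ms / steps
    samples = [curve(i * dt) for i in range(steps)]
    total = sum(samples) * dt
    running = 0.0
    for i, s in enumerate(samples):
        running += s * dt
        if running >= 0.9 * total:
            return i * dt
    return period_ms

period = 1000.0 / 60.0  # one 60Hz refresh, ~16.7 ms

def square_25(t: float) -> float:      # ideal squarewave: 25% ON, then OFF
    return 1.0 if t < 0.25 * period else 0.0

def phosphor_like(t: float) -> float:  # exponential decay, 2 ms time constant (assumed)
    return math.exp(-t / 2.0)

print(f"25% ON squarewave : ~{persistence_90pct_ms(square_25, period):.1f} ms")
print(f"2ms-decay phosphor: ~{persistence_90pct_ms(phosphor_like, period):.1f} ms")
[/code]
With a decay curve you have to integrate the whole light-output shape like this, whereas the squarewave case collapses to the simple ON:OFF percentages above.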
A ULMB strobe backlight is roughly equivalent to a ~90% BFI (10% ON, 90% OFF).
That said, 50% BFI only reduces motion blur by 50%. So you can either use 120Hz+ "50% BFI" or use 240Hz (no BFI) and get pretty much exactly the same motion blur, since doubling the Hz (and frame rate) can halve motion blur on a non-BFI display. So why settle for only 50% BFI when you can get the same blur reduction strobelessly?
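Making that equivalence concrete (same idealized assumptions as above: instant on/off, framerate locked to refresh rate; the helper is mine):
[code]
def visible_time_ms(refresh_hz: float, on_fraction: float = 1.0) -> float:
    """How long each frame is actually lit, which is what sets the motion blur."""
    return (1000.0 / refresh_hz) * on_fraction

print(f"120Hz + 50% BFI: {visible_time_ms(120, 0.5):.2f} ms per frame")  # ~4.17 ms
print(f"240Hz, no BFI  : {visible_time_ms(240):.2f} ms per frame")       # ~4.17 ms
[/code]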
Also, keep in mind that if a BFI mode is designed with a curve that mimics phosphor decay, it would look like phosphor decay too. That said, it's challenging because most good blur-reduction modes use global-strobe backlights/edgelights, which don't rolling-scan like a CRT does, so they can't easily be made identical to a CRT unless the BFI is engineered into a FALD design, which has its own engineering complications/issues too.
Now, as for how this boils down to deciding between waiting and buying now: I can confirm there will not be dramatic improvements in the next 5 years, so if you're waiting, there is little reason to keep waiting. On the other hand, if you have specific needs like "I want 120Hz plus about 50% BFI in an OLED", then LG may be releasing such a panel in 2020.
It depends on what you're looking for. Console gaming. PC gaming. Television. Desktop monitor. How much motion blur reduction you really want. Etc.