TVs: Strobe adjustment settings likely removed at the LG OLED panel level

High Hz on OLED produces excellent strobeless motion blur reduction thanks to fast GtG pixel response. It is easier to tell apart 60Hz vs 120Hz vs 240Hz on OLED than on LCD, and the difference is more visible to mainstream users. Includes WOLED and QD-OLED displays.
RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by RonsonPL » 21 May 2022, 06:08

This is rather bad news for low persistence display mode popularization.
(Pardon the imprecise term here. It's "black frame insertion", but both that and "low persistence" were too long for the title.)

https://youtu.be/b0BlTdUsyOI?t=256

Panasonic's new line of OLED TVs also has only an ON/OFF toggle. Just like with LG OLEDs, the proper menu with its various adjustments is gone from this year's line.

Really annoying to see the display industry going the wrong way on clear-motion solutions.
Even more so because the new panels improve brightness, which is exactly what's needed for good strobing modes (currently OLED TVs are too dim to be used in low persistence modes for gaming in a room with lots of sunlight, and the contrast and HDR are far from perfect even in a dark room).

There's also some bad news coming from Digital Foundry.
They say there's a problem with the Automatic Low Latency Mode setting on PS5, blocking the use of low persistence modes on some TVs completely, or forcing users to come up with workarounds to turn the feature on.
There's also the known problem of low persistence being removed from the 120Hz modes (leaving 60Hz only) on the newest LG OLEDs. Let's hope this is not hard-coded into the panel too.
We're losing the battle, guys :/

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by Chief Blur Buster » 21 May 2022, 13:35

RonsonPL wrote:
21 May 2022, 06:08
This is rather bad news for low persistence display mode popularization.
[...]
There's also the known problem of low persistence being removed from the 120Hz modes (leaving 60Hz only) on the newest LG OLEDs. Let's hope this is not hard-coded into the panel too.
We're losing the battle, guys :/
Often, this is a two-steps-forward, one-step-backward thing.

Historically we didn't even have any form of proper BFI on most LCDs at 60Hz. Now we have BFI on most OLEDs at 60Hz.

There will be some regressions here and there, but we are still further ahead than before. I will also do my best to keep working on the regressions with advocacy behind the scenes.

Also, the increase in refresh rates will eventually make it easier again (e.g. 240Hz easily allows persistence to be adjusted in 1/240sec increments for 60Hz content).
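To illustrate that arithmetic with a minimal sketch (hypothetical numbers, not any specific TV's firmware): on a 240Hz panel, each 60Hz frame spans four 1/240sec subframes, so software BFI can choose how many of them stay lit, trading persistence (motion blur) against brightness in roughly 4.2ms steps.

```python
# Minimal sketch: persistence steps available when doing software BFI
# for 60 Hz content on a 240 Hz panel (illustrative numbers only, not
# any specific TV's implementation).

PANEL_HZ = 240
CONTENT_HZ = 60

subframes_per_frame = PANEL_HZ // CONTENT_HZ   # 4 subframes per 60 Hz frame
subframe_ms = 1000 / PANEL_HZ                  # ~4.17 ms per subframe

for lit in range(1, subframes_per_frame + 1):
    persistence_ms = lit * subframe_ms         # how long each frame stays visible
    duty_cycle = lit / subframes_per_frame     # fraction of time the image is lit
    # MPRT-style motion blur scales with persistence; brightness scales with duty cycle.
    print(f"{lit}/{subframes_per_frame} subframes lit: "
          f"persistence ~ {persistence_ms:.2f} ms, duty cycle = {duty_cycle:.0%}")
```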
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by RonsonPL » 21 May 2022, 15:12

I'm afraid it's one step forward, two steps back nowadays.
Respectfully, I have to disagree with your optimistic perspective.

You, Chief, may be the only widely recognizable and respected force behind the "no blur in games" movement ;)

I suggest you (Chief) stop reading this post here, as I don't want to waste your time. I'll elaborate on what I mean below. (Others, be warned: bad writing ahead ;) And while creating this thread is somewhat useful, since it collects information for people interested in blur-free gaming, my explanation of why I'm disappointed about the current state of affairs may rightfully feel like a waste of forum space.)
------------------------------------------------------------------------------------------------------------------------------------------
Point of reference
I've been gaming since the 80s.
Motion clarity was better in the 80s and 90s than it is now. Of course, I'm not talking about 3D games.
I remember tweaking whatever I could to achieve perfect motion clarity back in 1998, and even before that. By 1999 I was already enjoying some games at 85fps on an 85Hz display.

The first strobed LCD was released around 2009-2010. The more affordable BenQs were released a decade ago, in 2012.
I thought 10 years in the "dark age" of blurry LCDs was too long, and I was so happy it was over.
But it's now been over two decades and we're still nowhere near where we were 20-25 years ago.

Back to the early 2010s.
At that time, things were indeed moving two steps forward and one back. In 2013-2014, VR looked like the savior, the game changer, once the VR tech people realized VR cannot work well without low persistence.
TVs started getting 120Hz.
Sony TVs were receiving usable low persistence modes for 60Hz.

But it remained just a small niche that never got mass appeal.
It didn't get traction and isn't growing in popularity.

The reality now

What constantly IS gaining appeal is actually pushing the gaming and display industry in the opposite direction.

- G-Sync/FreeSync - not only is it praised in monitor reviews, in contrast to low persistence modes, which are usually frowned upon because of bad implementations and crosstalk/ghosting issues, or a too dim/dark image and lack of HDR, but enabling it also usually means disabling the low persistence mode.

- TN monitors are dead in the gaming segment. These are the fastest LCD panels and the easiest to get usable strobed modes on, yet you cannot buy any cheap, good TN 1440p or 4K 120Hz monitor now. Even if IPS can catch up at 120Hz, the edge TN has over IPS could still be very useful now that we finally have DisplayPort 2.0 and HDMI 2.1 capable of driving 480Hz modes at usable resolutions. The IPS options increase the price, and G-Sync features raise it even further. VA is way too slow for 120Hz or higher, yet VA panels are the most popular and most praised gaming displays now, because reviewers are usually fans of HDR and high contrast.

- The TV situation is really bad. I'd even use the word "dire". Models supporting a 120Hz input signal are still outside the cheap (mainstream) price segments, but that's just the start of the problems. Again, the VA panel type poses an issue. OLEDs are good, but reviewers usually say "meh... it offers better motion clarity, but the image is too dark and it ruins HDR". Then there is the issue with the new LG OLED panels (the two problems I mentioned earlier), and it looks like (correct me if I'm wrong) there is also a problem with the new type of QD-LCDs, which seem to be bad at motion handling. We see the push for HDR - again, just like G-Sync before, its popularity and importance among reviewers becomes an enemy of low persistence popularization. LG is going backwards this year. There is no chance of MicroLED TVs at an affordable price within the next 5 years, even though MicroLED tech would be perfect, as it can reach high brightness levels with good response times. There are no cheap LCD or OLED TVs for gamers. If you want a good TV for gaming, you'll pay for the built-in mini-computer (so-called smart features), for the sleek design, for HDR, for advanced image-processing software, for wifi chips, and for active cooling (again, because people expect nice slim TVs now). None of this helps the situation: you pay a lot, and your ability to use the feature is drastically limited anyway. That's not a path toward popularity.

- The consoles. Both Microsoft and Sony are focused on VRR (variable refresh rate). They advertise it and it gets reviewers' attention. None of the big three has ever mentioned low persistence, even now that Sony and MS have finally achieved games running at a constant 120Hz - the first real opportunity to introduce gamers to flicker-free low persistence modes. The Nintendo Switch has a small screen, and the smaller the screen, the less annoying the flicker, so the Switch could have received a strobed mode, perfect for past-gen emulated games and the many 60fps Nintendo games like Mario Odyssey. Currently the PS5's firmware does NOT allow disabling Automatic Low Latency Mode, which in turn disables low persistence modes on most TVs. Let's not even mention the lack of a 100Hz (or 90Hz!) mode, which is really annoying, as many games run at 90+ or 105+ framerates and so could easily be locked for low persistence. It shows how unimportant this is to the "big guys" in the industry.

- The biggest issue: the software side. There is not only the plague of motion blur and temporal anti-aliasing solutions (which I personally cannot stand, as they remind me of overused VHS tapes from the 90s) that already ruin image clarity by a lot on their own, but there are worse things now. Reviewers, even respected ones, praise the new AI-based anti-aliasing technologies, which basically move computing power from the moving image towards the static image: DLSS and FSR2. These are awful. And just like TAA, they will reach a state so entrenched that there won't even be toggle options in PC games to disable them. FSR2 is absolutely awful in motion. DLSS can be tolerable or can ghost badly, depending on the game and implementation. I'll grant DLSS may be useful, but nothing and no one will convince me FSR2 is anything but a disaster - just look at the image quality in motion. No thank you, I'd rather lower the resolution and play with jaggies. We played games at 320x240; we can play without AA and we won't die of jaggies. ;) The new Unreal Engine temporal resolve solution is also both computationally heavy and bad, really bad, in fast motion. And it looks like these AI-enhanced TAA solutions will be used and locked into 90% of the AAA games released in the coming years. UE5 is set to become the engine everyone uses. Seriously, it looks like we're heading into an absolute monopoly - an unprecedented, scary scenario, but a very likely one. There are a few other engines that may survive, but just like Unity, they don't shy away from these temporal reconstruction solutions either.
Then we have VRS, Variable Rate Shading. That's another technology that takes image quality away from motion and moves it towards the static image.
The trend in gaming in terms of lighting: realistic lighting is dominating, and barely anything else is left. Realism means less contrast, more washed-out colors, a "dirtier" look, and less detail visible in shadowed areas, especially in the mid and far distance. This, again, works better on blurry displays: it's simply less jarring to see the image become a blurry mess if it stays consistent, whereas a crisp image with vivid colors that turns into a mushy "soup" in motion is more noticeable. It's also easier to hide jaggies when there's less contrast. This does influence game creators when they choose the setting and visual style for their games. There are many AAA console games that don't even allow disabling motion blur, even if reviewers say they do (they're blind - I checked with "blur: 0" in one of the PS5 games), and this disease is coming to PCs too.
Worse yet, we have ray tracing coming in. Alongside UE5 it will mean a push towards 30fps games, as the current consoles are far too slow for 60fps with complex geometry lit with a rich ray-tracing feature set. Even a 3090 cannot go above 40fps in something as simple as Minecraft at 4K if you enable every RT feature there is. Cyberpunk doesn't even reach 30fps - on an RTX 3090. The Matrix UE5 demo raises real concerns about PC CPUs handling UE5 games at 60fps, never mind more. (I will play at 60Hz if I have to, but come on, we all know how eye-searing LCD strobing is at 60Hz. I'd say it's like a 50Hz CRT - absolutely unacceptable for a PC experience, IMHO.)

To summarize: Things are not looking good and I'd argue we are at the "one step forward, two steps back" stage considering 2022 and the near-term future.
It's not the end of the world. If you have time, it's OK - you can just live with what we have and wait. If not in the 2030s, then surely in the 2040s we'll eventually come out of these dark ages. MicroLED TVs and VR will go mainstream one day. The UHFR displays the Chief wrote that awesome article about will eventually happen, so we won't need to suffer the loss of brightness at 500 or 1000fps. But if you, like me, have only a few more years left to game, you may struggle not to feel a bit sad and disappointed about the state of things in 2022.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by Chief Blur Buster » 21 May 2022, 15:20

Regardless of the step count, we still mutually agree there's a nonzero number of steps forward and a nonzero number of steps backward. :D
Agreeably, there are way too few choices that mimic a CRT more accurately.

I'm still waiting for my Holy Grail of 1000Hz+ OLED/MicroLED indeed.

Those would be good displays for running retro refresh rates with blur reduction -- e.g. with CRT electron beam simulator algorithms too, for a less flickery 60 Hz experience, complete with phosphor decay and everything.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by thatoneguy » 21 May 2022, 17:18

Depressingly realistic take, all things considered.

[Moderator's note -- deleted one politically-charged sentence which is not allowed in these politically-neutral forums.]
...devs being too lazy to make their own engines. Fuck hiring actual programmers who can make engines, just license UE5 and call it a day.
99% of games out there just use stock assets.
Gaming has been going downhill since 7th gen/2006, which is around the same time CRTs started being phased out in favor of crappy-ass LCD TVs.
Display technology improvement is moving as slow as molasses. OLED is another Plasma, LCD is still being milked while being a mediocre technology overall, and lord knows when MicroLED will ever come out for the consumer space.
LG and the other OLED makers are taking BFI out for God knows what reason, maybe just to spite their consumers.

Like you said, I couldn't care less about ray tracing.
Most games nowadays look terrible to me visually. I don't care about how many polygons you're pushing if the game looks like it was made by some soulless AI with no artistic skill whatsoever. Just amateurs flipping mobile-tier stock assets all over the place. Almost every game looks the same. They are not even worth pirating.

All of this homogenization of video games reminds me of how web browsers went downhill after Presto Opera died in 2013, after years of Google shitting up the web with their terrible browser, which most sites coded for instead of actually following W3C standards.
Now the only options we have for browsers are Firefox and the many awful Google Chrome/Blink/WebKit clones out there, because what's the point of making engines when Google, Twitter and Facebook have effectively killed the Web and web standards so they could ensure their monopoly and turn the Internet into the new TV.

The more it seems like things will improve the worse things seem to get. Sigh, I wish I could go back in time.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by Chief Blur Buster » 22 May 2022, 00:42

thatoneguy wrote:
21 May 2022, 17:18
All of this homogenization of video games reminds me of how web browsers went downhill after Presto Opera died in 2013, after years of Google shitting up the web with their terrible browser, which most sites coded for instead of actually following W3C standards.
Although it's already known that I use much more diplomatic language when it comes to businesses, I will say I'm very disappointed in Google too, with all the product cancellations. It even came to a Gmail discontinuation: custom family domain names (they kinda backtracked, though). When Google starts touching verboten territory, many geeks get worried.

___

Back to tech:

The good news is that all we need is a 1000fps 1000Hz OLED, and then we can write our own highly configurable open-source CRT electron beam simulator to get various kinds of custom scanned/strobed/rolling/etc. looks and feels at any low refresh rate we want. 1000Hz allows 16-17 digital refresh cycles per 1/60sec of CRT electron beam simulation, executed in a GPU shader or other algorithm, which is enough to simulate the CRT look and feel within the error margins of a retro display (e.g. simulating a typical arcade CRT tube). HDR would be a bonus, since it would provide nit surge headroom for simulating the brightness of a CRT electron beam. 2000fps 2000Hz would be even better.

Playing a 1000fps high-speed video of a CRT (wide brightness range, not overexposed) back to a 1000Hz display in realtime matches the temporal look and feel of the original CRT (including phosphor decay, rolling flicker, and virtually zero blurring). So the objective is to simulate this in software, using a large number of digital refresh cycles to emulate one CRT refresh cycle - simulating, say, 1/1000sec or 1/2000sec worth of electron beam scanning per output refresh.
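As a very rough sketch of what such a simulator's inner loop could look like (hypothetical parameters and structure, not anyone's actual shipping algorithm): each 1000Hz output refresh lights the band of scanlines the electron beam would have swept during that 1/1000sec slice of a 60Hz CRT refresh, while previously drawn lines fade with an exponential phosphor decay.

```python
import numpy as np

# Rough sketch of a rolling-scan CRT electron beam simulator (hypothetical
# parameters, illustration only). Each 1000 Hz output refresh redraws the
# band of scanlines the beam would sweep in that 1/1000 sec slice of a
# 60 Hz CRT refresh, while earlier lines decay exponentially.

OUTPUT_HZ = 1000
SOURCE_HZ = 60
PHOSPHOR_DECAY_MS = 1.5            # time constant of the simulated phosphor decay

def simulate_subframes(source_frame: np.ndarray) -> list:
    """Turn one 60 Hz source frame into ~16 output refreshes at 1000 Hz."""
    height = source_frame.shape[0]
    subframes_per_frame = OUTPUT_HZ // SOURCE_HZ   # 16 full slices (remainder ignored here)
    slice_ms = 1000 / OUTPUT_HZ
    decay_per_slice = np.exp(-slice_ms / PHOSPHOR_DECAY_MS)

    screen = np.zeros_like(source_frame, dtype=np.float32)
    output = []
    for i in range(subframes_per_frame):
        screen *= decay_per_slice                   # phosphor decay of previously drawn lines
        top = i * height // subframes_per_frame     # band swept by the beam during this slice
        bottom = (i + 1) * height // subframes_per_frame
        screen[top:bottom] = source_frame[top:bottom]  # beam excites this band to full brightness
        output.append(screen.copy())
    return output
```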

I'd like to see good CRT electron beam simulators written this decade.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by thatoneguy » 22 May 2022, 19:19

The whole 1000Hz CRT simulation thing (which lord knows when that will happen) doesn't answer the OP's point about the software side, which he said is the worst part (and he's correct).
TAA is awful, and the AI-based DLSS and FSR2 algorithms aren't any better.
The immense push for ray tracing is setting games back too. It's becoming harder and harder to get better framerates.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by Chief Blur Buster » 23 May 2022, 18:07

thatoneguy wrote:
22 May 2022, 19:19
The whole 1000Hz CRT simulation thing (which lord knows when that will happen) doesn't answer the OP's point about the software side, which he said is the worst part (and he's correct).
TAA is awful, and the AI-based DLSS and FSR2 algorithms aren't any better.
The immense push for ray tracing is setting games back too. It's becoming harder and harder to get better framerates.
I see it going in two directions concurrently at the moment -- spatial quality and temporal quality.

They are definitely fighting each other, especially in different industries. If you own certain VR headsets, you've already seen massive temporal quality leaps forward.

It happened with Crysis years ago, a few steps back, a few steps forward.

Framerates are still vastly superior to the Ultima Underworld and System Shock days. Those framerates were a sudden drop from a lot of games prior to that (e.g. platform scrollers at 30-60fps). It's not my first tango [1993]...

There are ways to dramatically accelerate raytracing in the 1000fps 1000Hz future, for 8K 1000fps UE5 -- there's an engineering path now that involves various kinds of intermediate frame rate amplification stages (e.g. dedicated AI-based frame rate amplifiers with 5:1 and 10:1 ratios, combined with tricks like temporally dense raytracing and other algorithms). It was only recently that various skunkworks worldwide finally recognized the geometric frame rate requirements (60 -> 120 -> 240 -> 480 -> 960 -> 1920) and started working on solutions to fight the diminishing curve of returns, but in some cases it may take a decade before we see good 10:1 framerate amplifiers, especially because of the delays in progress caused by distractions (mining/covid/china supply chains/war/etc). But it's being worked on by multiple companies and researchers.
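As a toy illustration of what such an amplification stage does (deliberately naive; a real amplifier would warp frames along per-pixel motion vectors and sample fresh controller input, rather than waiting for the next rendered frame and blending, which adds latency as done here): turn a ~100fps rendered feed into ~1000fps output by emitting ten frames per rendered interval.

```python
import numpy as np

# Toy sketch of a 10:1 frame rate amplifier (illustration only; real
# amplification stages use motion vectors, reprojection, and late controller
# input instead of simple blending between two already-rendered frames).

AMPLIFICATION = 10   # e.g. 100 fps rendered feedstock -> 1000 fps output

def amplify(prev_frame: np.ndarray, next_frame: np.ndarray):
    """Yield AMPLIFICATION output frames spanning one rendered frame interval."""
    for i in range(AMPLIFICATION):
        t = i / AMPLIFICATION                     # 0.0 .. 0.9 between rendered frames
        # Naive linear blend; a real amplifier would warp prev_frame along
        # per-pixel motion vectors and fold in the latest controller input.
        yield (1.0 - t) * prev_frame + t * next_frame
```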

I'm seeing lots of steps backwards and steps forward. Casual-play framerates are temporarily (or should I say temporally, hah) taking somewhat of a backseat to raytracing improvements. No disagreement there. But temporal quality isn't going to stay static.

But we're already getting triple-digit RTX frame rates that were undreamed of back in the Ultima Underworld VGA days...

We are going to continue to see a lot of "Crysis moments" and "RTX ray tracing moments" that put some valleys into the framerate-increase progress here and there.

CPU rendering, then the first 3D accelerators (Verite, 3dfx, TNT, etc.), then geometry acceleration (the first GeForce 256), then shaders (GeForce 3), then ray tracing (the RTX 2000 series), etc.

Yes, I omitted a lot of tech improvement stages here, but you get the idea of things that surged and slowed frame rates simultaneously. Oh, and software (the arrival of Underworld, of Doom/Quake, of Half-Life 2, of Crysis, etc.) often set framerates back, until tech progress caught back up and surpassed yesteryear framerates.

Humankind is not done yet -- more tech is coming that stacks on top of the progress already made.

And dedicated framerate-amplifier silicon stages are something that doesn't exist yet, but will be needed to punch through the diminishing curve of returns without requiring higher sample rates from the game (100fps-200fps feedstock could be sufficient for a good, visually flawless frame rate amplifier in silicon, as long as controller input rates remain high to help fill in the extra frames more flawlessly). I had hoped to see faster progress, but what can you do?

Engineers at big companies did not realize until recently that a geometric improvement in framerates is needed for blurless sample-and-hold: the human-visible benefit of extra Hz doesn't stop for a very long time, but it requires dramatic differences in framerate=Hz (e.g. 240Hz vs 1000Hz, or even 1000Hz vs 4000Hz, rather than just 144Hz vs 165Hz or 1000Hz vs 1500Hz) as you go up the diminishing curve of returns towards retina refresh rates. The average user's "I can't tell 120fps vs 180fps apart" falls apart when you show a 60fps-vs-240fps demo, or even a 240fps-vs-1000fps demonstration on research displays in the lab.

Anyway, Blur Busters has had a major role in educating people that the diminishing curve of returns goes very far (4-digit and 5-digit Hz and fps), but quadruple-digit framerate amplification solutions that are affordable to consumers may take a decade -- i.e. the 2030s or later. But they are coming eventually.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by RonsonPL » 06 Jul 2022, 14:29

@Chief Blur Buster

Can you give your opinion on how implementing backlight strobing on FALD monitors differs from doing it on monitors with traditional backlights?
I'm interested in whether it requires much more work/effort, or is more expensive, for a manufacturer to achieve decent results - and then the same question for the scenario where the manufacturer aims for genuinely good results.

(CRT scan simulations aside)


I didn't want to create a new topic just for that, so I posted it here. ¯\_(ツ)_/¯

RonsonPL
Posts: 122
Joined: 26 Aug 2014, 07:12

Re: TVs: Strobe adjustment settings likely removed at the LG OLED panel level

Post by RonsonPL » 07 Jul 2022, 15:36

Good news: Sony just updated the PS5 firmware.

I wonder if we'd have gotten that if Digital Foundry hadn't spoken so loudly about this being wrong and annoying.

Either way, unless Sony managed to F up the VRR settings this time (ALLM cannot be disabled with VRR enabled), this should improve the situation for backlight strobing/scanning and BFI modes on TVs :)
