I'm afraid it's one step forward, two steps back nowadays.
Respectfully, I have to disagree with your optimistic perspective.
You, Chief, may be the only widely recognizable and respected force behind the
"no blur in games" movement
I suggest you (Chief) stop reading this post here, as I don't want to waste your time. I'll elaborate on what I mean below. (Others, be warned, bad writing ahead.
Also, while creating this thread is somewhat useful, as it collects some info for people interested in blur-free gaming, the explanation of why I am disappointed with the current state of affairs may, rightfully so, feel like a waste of forum space.)
------------------------------------------------------------------------------------------------------------------------------------------
Point of reference
I've been gaming since the '80s.
Motion clarity was better in the '80s and '90s than it is now. Of course, I'm not talking about 3D games.
I remember that by 1998 I was already tweaking whatever I could to achieve perfect motion clarity, and even before that. By 1999 I was already enjoying some games at 85fps at 85Hz.
The first strobed LCDs were released around 2009-2010. More affordable BenQs were released a decade ago, in 2012.
I thought 10 years in the "dark age" of blurry LCDs was too long, and I was so happy it was over.
But now it's been over two decades and we're still nowhere near where we were 20-25 years ago.
Back to the early 2010s.
At that time, things were indeed moving two steps forward and one back. In 2013-2014, VR looked like the savior, the game changer, once the VR tech guys realized VR cannot work well without low persistence.
TVs started getting 120Hz.
Sony TVs were receiving usable low persistence modes for 60Hz.
But it remained just a small niche which never got mass appeal.
It didn't get traction and isn't growing in popularity.
The reality now
What IS constantly gaining appeal is actually pushing the gaming and display industry in the opposite direction.
- G-Sync/FreeSync: not only is it praised in monitor reviews, unlike low persistence modes, which are usually frowned upon because of bad implementations, crosstalk/ghosting issues, or a too dim/dark image and lack of HDR, but enabling it also usually means disabling the low persistence mode.
- TN monitors are dead in the gaming segment. These are the fastest LCD panels, the easiest to get usable strobed modes on, and you cannot buy any cheap, good TN 1440p or 4K 120Hz monitor now. Even if IPS can catch up at 120Hz, the edge TNs have over IPS could still be very useful now, when we finally have DisplayPort 2.0 and HDMI 2.1 capable of driving 480Hz modes at usable resolutions. What we get instead are IPS panels, which raises the price, and the G-Sync features raise it even further. VAs are way too slow for 120Hz or higher, and yet these are the most popular and most praised gaming displays now, because reviewers are usually fans of HDR and high contrast.
- The TV situation is really bad. I'd even use the word "dire". Models supporting a 120Hz input signal are still outside the cheap (mainstream) price segments, but that's just the start of the problems. Again, the VA panel type poses an issue. OLEDs are good, but reviewers usually say "meh... it offers better motion clarity, but the image is too dark and it ruins HDR". Then there is not only the issue with the new LG OLED panels, with the two problems I mentioned earlier, but it looks like (correct me if I'm wrong, I might be) there is also a problem with the new type of QD-LCDs, which seem to be bad at motion handling. We see the push for HDR; again, just like G-Sync before, its popularity and importance among reviewers become an enemy of low persistence popularization. LG is going backwards this year. There is no chance for MicroLED TVs at an affordable price within the next 5 years. MicroLED tech would be perfect, as it can reach high brightness levels with good response times. There are no cheap LCD or OLED TVs for gamers. If you want a good TV for gaming, you'll pay for the mini-computer (so-called smart features) built in, you'll pay for the sleek look/design, you'll pay for HDR, for advanced image processing software, you'll pay for WiFi chips, you'll pay for active cooling (again, because people expect nice and slim TVs now). All this does not help the situation. You pay a lot and your ability to utilize the feature is drastically limited anyway. That's not the path toward popularity.
- The consoles. Both Microsoft and Sony are focused on VRR (variable refresh rate). They advertise it, and it gets reviewers' attention. None of the big three has ever mentioned low persistence, even when Sony and MS finally achieved games running at a constant 120Hz, the first real opportunity to introduce gamers to flicker-free low persistence modes. The Nintendo Switch has a small screen, and the smaller the screen, the less annoying the flicker is. So the Switch could have received a strobed mode, perfect for past-gen and emulated games and many 60fps Nintendo games, like Mario Odyssey. Currently the PS5's firmware does NOT allow disabling Auto Low Latency Mode, which disables the low persistence modes on most TVs. Let's not even mention the lack of a 100Hz mode (or 90Hz!), which is really annoying, as many games run at 90+ or 105+ framerates and so could easily be locked there for low persistence. It shows how unimportant this is for the "big guys" in the industry.
- The biggest issue: the software side. There is not only the plague of motion blur and temporal anti-aliasing solutions (which I personally absolutely cannot stand, as it reminds me of the old days of worn-out VHS tapes in the '90s) that already ruin image clarity by a lot on their own, but there are worse things now. Reviewers, even respected ones, praise the new AI-based anti-aliasing technologies, which basically move computing power from the moving image towards the static image. DLSS, FSR2. These are absolutely awful. And just like TAA, it will reach such an advanced cancer stage that there won't even be toggle options in PC games to disable it. FSR2 is absolutely awful in motion. DLSS can be tolerable or ghost by a lot, depending on the game and the implementation. I'll admit it may be useful, but nothing and no one will convince me FSR2 is anything but a disaster. Look at the image quality in motion. No thank you, I'd rather lower the resolution and play with jaggies. We played games at 320x240; we can play without AA and we won't die because of jaggies.
The new Unreal Engine temporal resolve solution is also both computationally heavy and bad,
really bad in fast motion. And it looks like those AI-enhanced TAA solutions will be used, and locked in, in 90% of the AAA games released in the upcoming years. UE5.0 is set to become the game engine everyone uses. Seriously, it looks like we're heading into an absolute monopoly. An unprecedented, scary scenario, but very likely. There are a few other engines which may still survive, but just like Unity, those are not engines which shy away from temporal reconstruction solutions either. (I put a tiny toy sketch of why these temporal solutions smear in motion further down, just before the summary.)
Then we have VRS, Variable Rate Shading. That's another technology which takes image quality away from motion and shifts it towards the static image.
The trend in gaming in terms of lighting: realistic lighting is dominating, and barely anything else is left. Realism means less contrast, more washed-out colors, a more "dirty" look of things, and less detail visible in shadowed areas, especially at mid and far distances. This, again, works better on blurry displays. It's simply less jarring to see the image become a blurry mess if it stays consistent; a crisp image with vivid colors that turns into a mushy "soup" in motion is more noticeable. It is also easier to hide jaggies when there's less contrast. This does influence game creators when they choose the setting and visual style for their games. There are many AAA games on consoles which don't even allow disabling motion blur, even if reviewers say they do (they're blind; I checked one of the PS5 games and the blur was still there at "blur: 0"), and this disease is coming to PCs too.
Worse yet, we have ray tracing coming in. Alongside UE5.0 it will mean a push towards 30fps games, as the current consoles are far too slow for 60fps with complex geometry lit by a rich ray-tracing feature set. Even a 3090 cannot go above 40fps in something as simple as Minecraft at 4K if you enable every RT feature there is. Cyberpunk doesn't even reach 30fps. On an RTX 3090. The Matrix UE5.0 demo raises real concerns about PC CPUs handling UE5 games at 60fps, let alone more (I will play at 60Hz if I have to, but come on, we all know how eye-searing LCD strobing is at 60Hz. I'd say it's like a 50Hz CRT, absolutely unacceptable for a PC experience IMHO).
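Since I keep ranting about temporal accumulation, here's the promised toy sketch, a deliberately dumbed-down Python illustration of why those techniques look great in a static screenshot and smear in fast motion. It's my own illustration, not any engine's actual code (real TAA/DLSS-style implementations reproject the history with motion vectors and clamp it against the current frame, which I deliberately skip here), and the function name and values are made up:

[code]
# Toy 1D illustration of temporal accumulation (not any real engine's code).
# Each frame, the new render is blended with the history buffer:
#   output = alpha * current + (1 - alpha) * history
# A small alpha gives a very stable, nicely antialiased static image, but when
# things move fast the history is stale, so old pixel values linger as ghosting.

def taa_accumulate(frames, alpha=0.1):
    """Blend a sequence of 1D 'frames' the way a simple temporal filter would."""
    history = list(frames[0])
    outputs = [history]
    for current in frames[1:]:
        # Real implementations reproject 'history' with motion vectors and clamp
        # it to the current frame's neighborhood; this toy skips both, which is
        # exactly what makes the smearing easy to see.
        history = [alpha * c + (1 - alpha) * h for c, h in zip(current, history)]
        outputs.append(history)
    return outputs

# A bright object jumps from pixel 0 to pixel 4 between two frames:
frame_a = [1.0, 0.0, 0.0, 0.0, 0.0]
frame_b = [0.0, 0.0, 0.0, 0.0, 1.0]
print(taa_accumulate([frame_a, frame_b])[-1])
# -> pixel 0 is still ~0.9 bright (the ghost), pixel 4 is only ~0.1 bright
[/code]

On a static image, the same blending just averages many slightly jittered samples of the same surface, which is exactly why screenshots and review stills look so clean while fast motion turns into soup.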
To summarize: Things are not looking good and I'd argue we are at the "one step forward, two steps back" stage considering 2022 and the near-term future.
It's not the end of the world. If you have time, it's OK. You can just live with what we have and wait. If not in the 2030s, then surely in the 2040s, we'll eventually come out of these dark ages. MicroLED TVs and VR will go mainstream one day. The UHFR displays the Chief wrote that awesome article about will eventually happen, so we won't need to suffer the loss of brightness at 500 or 1000fps. But if you, like me, have only a few more years left to game, you may find it hard not to be a bit sad and disappointed about the state of things we see in 2022.
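P.S. To put some rough numbers on the persistence/brightness/flicker trade-off I keep coming back to (the dim strobed modes reviewers complain about, the eye-searing 60Hz strobing, the brightness loss that UHFR would eventually make unnecessary), here's a back-of-the-envelope Python sketch. It leans on the usual rule of thumb of roughly 1 pixel of eye-tracking blur per 1ms of persistence at 1000 pixels/second of motion; the panning speed and results are illustrative, not measurements:

[code]
# Back-of-the-envelope persistence math (illustrative, not measurements).
# Rule of thumb: eye-tracking motion blur in pixels ~= persistence * panning speed,
# and a strobed backlight's brightness scales roughly with its duty cycle.

def blur_px(persistence_ms, speed_px_per_sec):
    """Approximate perceived motion blur width in pixels."""
    return persistence_ms / 1000.0 * speed_px_per_sec

def strobed_brightness(persistence_ms, refresh_hz):
    """Brightness relative to sample-and-hold, roughly the strobe duty cycle."""
    return persistence_ms / (1000.0 / refresh_hz)

speed = 960  # px/sec, an ordinary panning speed

for hz in (60, 120, 240):
    hold_ms = 1000.0 / hz  # sample-and-hold persistence = full frame time
    print(f"{hz} Hz sample-and-hold: ~{blur_px(hold_ms, speed):.0f} px of blur")
    print(f"{hz} Hz with a 1 ms strobe: ~{blur_px(1.0, speed):.0f} px of blur, "
          f"but only ~{strobed_brightness(1.0, hz):.0%} of the light")
[/code]

So at 60Hz you either keep roughly 16 pixels of smear, or you strobe it down to about 1 pixel and accept a dim, visibly flickering image; only high refresh rates (and the framerates to feed them) make short persistence cheap, which is exactly the UHFR argument.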