Yup.

anothergol wrote:
I see smooth motion, no strobing, and at this speed I can still perfectly identify all detail.
That's the virtue of CRT, and right now the stroboscopic stepping effect isn't important for your particular games or use case, especially when playing 1980s games.
And that's the way they were back then (a stroboscopic stepping effect appears if you fix your gaze during a scrolling platformer or shoot-em-up). The stepping effect doesn't detract from these types of games or gameplay, and you haven't complained about it, so it's no problem.
Now, there's also another use case: people who want a single "compromise" display that prioritizes good coding/docs/email work but is also capable of high-Hz or strobing for their particular gaming (e.g. FPS or emulators or whatever). Some people like that exist here.
And there's yet another use case: people who are much more bothered than you are by flicker, by stroboscopic stepping effects, or by motion blur. There are people around here who get strain from stroboscopic stepping effects, even at 120Hz or 240Hz. There are people around here who prefer motion blur over CRT flicker (they're sensitive to flicker even at the highest available CRT refresh rates, up to 120Hz). And there are people who get eyestrain from display motion blur itself (requiring them to use a CRT, a strobed LCD, another short-persistence display, or a very-high-Hz LCD, to get relief).
Also, a common cause of headaches/nausea on huge-FOV screens (sitting in the front row of a cinema, or wearing a VR headset) is the motion blur of long-duration frames. This was a big problem with non-impulsed VR headsets, which surround your entire FOV with display motion blur (even with OLED) -- that's why Galaxy smartphones now have a "GearVR Low Persistence Mode" to solve the problem: flickering the OLED display like a CRT.
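For those who like numbers, here's a rough rule of thumb (a simplification of my own that assumes near-instant GtG, so the blur is dominated by persistence): tracking motion blur is roughly persistence multiplied by motion speed.

# Rough rule of thumb (assumes near-instant GtG; persistence-dominated blur)
def tracking_blur_px(persistence_ms, speed_px_per_sec):
    return persistence_ms / 1000.0 * speed_px_per_sec

print(tracking_blur_px(16.7, 960))  # 60Hz sample-and-hold: ~16 px of blur at 960 px/sec
print(tracking_blur_px(1.0, 960))   # ~1 ms strobe/low-persistence: ~1 px of blur

That's why a low-persistence (impulsed/strobed) mode helps so much in VR: it shrinks the persistence term instead of needing massively more Hz.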
From this angle: What's the point?
More Hz isn't BS as your topic line claims.
More Hz certainly isn't BS for 100% of use cases for 100% of population.
Certainly, the 120Hz monitor you tried didn't help you a whole lot: its non-strobed mode didn't improve blur much, and its strobed mode had too many disadvantages (and wasn't 60Hz/classics-friendly). That doesn't mean 120Hz or 1000Hz is BS.
It usually does not matter when playing most past games, or many current games optimized for 60fps. It certainly doesn't matter for things like coding/email/Word/etc. It doesn't even matter in many of today's FPS games when using only a midrange GPU. And it doesn't matter if you're happy with flicker/stepping effects (not everyone is).

anothergol wrote:
I don't think that 100, 200 or 300Hz are really an improvement as for the result.
But your statement isn't a blanket statement covering all possible use cases (including many use cases that already exist today).
Remember, we're a (rapidly growing) niche website that's becoming more important for future-of-gaming issues.
You cannot claim your statement is blanket-true for everybody (even just for two-sigma or three-sigma of the world population, as many tests have shown), for every possible outlier use case. The outlier cases have already become so common in VR that workarounds (90Hz strobing) became necessary. Motion blur from refresh rate limitations is an increasingly common problem for future use cases.
Certainly, your statement is true for many current and past use cases.
But there are indeed already situations on my desktop where the 240Hz LCD monitor (TN disadvantages notwithstanding) provides plainly clear advantages in gaming. Like the reduced stroboscopic effect during fast turns while gazing at the crosshairs in 300+fps CS:GO gameplay -- and I'm no eSports player myself at all -- it really is easier to aim that little bit faster. Even browser smooth scrolling (arrow keys) almost approaches CRT quality, with roughly 1/4th the motion blur of a 60Hz LCD. It's not as clear as a CRT, but hey -- I do code/browse/etc. on my monitor, so I consider it a bonus side effect secondary to the benefits to games. And on some strobed monitors I've tested, there's CRT-quality smooth scrolling of text in web browsers with no text ghosting/crosstalk. (Unfortunately, that's rare -- there's almost always strobe crosstalk on many displays.) Web-browser keyboard smooth-scrolling becoming CRT quality -- all the LCD text advantages with all the CRT advantages -- is something found only on ~25% of the strobed monitors in the Official List of Best Gaming Monitors, and often there are color/brightness tradeoffs occurring at the same time. Ouch. (Hurry up, 240Hz OLEDs.)
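That "roughly 1/4th" figure is just the persistence arithmetic from earlier, assuming sample-and-hold with near-instant GtG:

# Sample-and-hold persistence = one full refresh period
print(1000 / 60)                   # ~16.7 ms per frame at 60Hz
print(1000 / 240)                  # ~4.2 ms per frame at 240Hz
print((1000 / 240) / (1000 / 60))  # ~0.25 -> roughly 1/4 the tracking motion blur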
Now, when wearing an Oculus Rift or HTC Vive, display refresh rate limitations become much more apparent if you turn off the low-persistence mode. Also when sitting in the front row of an IMAX screen -- movie panning scenes are very motion-blurry and nauseate many people; same problem in VR. It's still a big problem at 100/200/300Hz whenever not strobing. Not YOUR "use case", I know. But motion blur also often causes headaches when playing on big-screen LCD displays on the desktop. CRTs often don't have this problem because we don't put 27" or 30" retina-quality CRTs on our desktops, like we can with 27" and 30" LCDs. Motion blur from the refresh rate limitations of a non-strobed display (whether it be an LCD or an OLED in non-strobe mode) is more problematic in big-FOV-coverage situations, like sitting in the front row of a movie theater. Sure, it might not bother one-sigma worth of the population, but most people are bothered by the motion blur from sitting in the front row of a movie theater (both from motion blur built into the film and from motion blur caused by eye-tracking on panning scenery -- they are always additive onto each other, never subtractive). Displays are getting bigger, people are gaming on bigger displays at higher resolutions nowadays, and in more types of use cases where this matters. In those cases, the "100/200/300Hz" limitations become visible to more people.
Fixed it for you.

anothergol wrote:
If we all do agree that OLEDs/LCDs are still not there, that a CRT is still better for my own gaming, well there's no real argument anymore.
Gaming 20 years from now won't be the same as gaming today.
When you can get those $50 Apple-Oakley(tm) VR Sunglasses at your Apple Store (with a self-contained supercomputer discreetly inside the tiny frame), that might be the mainstream gaming of the 2040s/2050s, like arcades were mainstream in the 1980s.
Or if you care about input lag in certain types of games, even as an amateur online FPS player. In first-person shooters played today, TN 144Hz or 240Hz monitors make a real difference over 60Hz. Almost anything that real-time scans out at 240Hz (from a CRT, a TN LCD, or an OLED), despite some of their imperfections, can still produce reaction time advantages even over a 60Hz CRT by virtue of that. Simultaneous-draw situations (go around a corner, suddenly see each other, react at the same time, shoot each other at the same time) with champion players are like crossing an Olympics finish line -- a few milliseconds can matter even if you can't feel the millisecond. The reaction time spread of good players can be so tight that an advantage of a few milliseconds (from higher refresh rates) can put you ahead of your competitors, too. Also noteworthy: many of those players use a fixed gaze (staring at the crosshairs), so some of them don't care about eye-tracking motion blur (and thus are mostly immune to tracking-based motion blur on sample-and-hold displays, as long as GtG pixel response is fast). However, not everyone plays this way, obviously. That said, the billion-dollar eSports industry is rapidly growing under our noses, with million-follower and hundred-thousand-follower eSports players on Twitter, and full stadiums/venues now occurring (multi-thousand spectators attending in person, watching champion players play video games) in some of those 7-figure prize-pot leagues. In these leagues, the extra Hz is a lot more important than for the office worker. Already, in many countries including the USA (note: Blur Busters is located in Canada), there are more online viewers of eSports (via Twitch/etc.) than of certain real sports (e.g. car racing, ski racing) via television/streaming, to the jawdrop of the previous generation...
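A rough illustration of where those few milliseconds come from (a deliberate oversimplification of my own that ignores GtG, the game engine, input sampling, and the rest of the lag chain): the refresh rate caps how long a finished frame has to wait for the next refresh slot.

# Rough, simplified frame-delivery granularity (other lag sources ignored)
for hz in (60, 240):
    period_ms = 1000.0 / hz
    print(hz, "Hz:", round(period_ms, 1), "ms refresh period;",
          "~", round(period_ms / 2, 1), "ms average wait for the next refresh")
# 60Hz:  ~16.7 ms period, ~8.3 ms average wait
# 240Hz: ~4.2 ms period,  ~2.1 ms average wait -> roughly a 6 ms average head start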
Or if you need zero-lag augmented reality where overlaid objects do not self-blur (become blurrier than the background they're superimposed on top of) during head-turning situations -- or stroboscopically step, or lag behind. NVIDIA found it was truly worthwhile to go to roughly 16,000 Hz (video) to eliminate human-visible augmented-reality artifacts such as lag-behinds, the defocus effect (caused by motion blurring from limited Hz), and stroboscopic stepping (caused by non-preblended/blurred frames). There were AR sync imperfections even at 8,000 Hz... Lower Hz means more lag-behind when overlaying graphics on top of the real world, and/or more tracking motion blur, and/or more stroboscopic stepping effects. All three (lag-behind, blurring & stroboscopic stepping) need to be solved simultaneously, up to your human eye's maximum accurate eye-tracking speed. In this situation, quintuple-digit refresh rates were actually necessary in these kinds of outlier edge cases. Many experiments at many laboratories (not just NVIDIA) tested many different refresh rates, many of them lower, and the imperfections progressively diminished visibly, but still weren't always consistently gone at quadruple-digit Hz for all possible use cases. Even at 16,000Hz there were some imperfections (e.g. tracking overshoots), as seen in the video, but that part was caused by other limitations (e.g. inaccuracy of tracking detection, etc.). Knocking off all the weak links, obviously, is much more difficult at such insane Hz, though!
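To get an intuition for why AR pushes into quadruple- and quintuple-digit Hz, consider the stroboscopic step size during tracking (my own illustrative numbers below, not NVIDIA's): each refresh, a tracked overlay jumps by motion speed divided by refresh rate, so sub-pixel stepping at fast tracking speeds demands enormous refresh rates.

# Illustrative only: pixel jump per refresh while tracking at a given speed
def step_px(speed_px_per_sec, hz):
    return speed_px_per_sec / hz

print(step_px(8000, 120))    # ~67 px jumps per refresh -- blatant stepping
print(step_px(8000, 1000))   # 8 px jumps -- still visible on a sharp display
print(step_px(8000, 16000))  # 0.5 px jumps -- finally below one pixel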
Everyone plays different kinds of games, and within games, they play a particular game differently (e.g. eye-tracking gameplay tactics versus stationary-gaze gameplay tactics). And tomorrow's games (or next decade's games, or next century's games) won't place the same demands on displays as they do today. And people will be gaming at higher resolutions and bigger FOVs than yesterday. Etc. And yes, imagine the imaginary theoretical Apple-Oakley teamup for VR sunglasses of the 2040s, or whatever becomes the mainstream method of gaming in the 2040s/2050s, as one theoretical example. None of us can predict which direction gaming will go in.
Ten years ago, few of us expected smartphone gaming to exceed console/portable gaming. Who's to say that VR sunglasses won't be popular at the Apple Store, Microsoft Store, or Best Buy/Amazon for our grandkids -- or some other theoretical example? And even today, 240Hz already makes a huge difference to some of us on our desktops in certain games.
Today's smartphones are bigger than yesterday's portable game screens. (e.g. 6" phones versus 2" Gameboys)
Today's TV screens are bigger than yesterday's TV screens. (e.g. 50" HDTVs versus 27" CRT televisions)
Today's monitors are bigger than yesterday's monitors. (e.g. 27" LCD monitors versus 19" CRT monitors)
And, of course, we've got resolution progress (toward Retina), more FOV too (because displays are sharp enough to sit closer to), and immense progress being made, Moore's Law style, on virtual reality displays (Oculus, Vive, GearVR, etc).
Remember that these artifacts (stroboscopic/blurring effects) can cause some people to puke or feel sick at large FOV (e.g. a front-row seat in a cinema, or a GeForce Titan/Radeon Vega on a 27-30" desktop monitor, or when wearing a Rift/Vive). Big whoop on a 17" CRT playing MAME. Annoying (and sometimes barf-mania for some) on bigger displays. More people now sit closer to TVs -- or use bigger TVs at the same viewing distance -- bigger FOV either way. While getting bigger, TVs have also gotten sharper (HDTV...), and the motion blur of LCD/OLED HDTVs has simultaneously forced many sports camera operators to track objects (e.g. balls, pucks, etc.) more accurately, to prevent nauseating motion blur on big-screen HDTVs -- there are some people who get dizzy with big HDTVs and fast-scrolling scenery (partly due to tracking motion blur...)
Even if we cannot predict tomorrow's gaming habits, or even next century's -- all of these factors affect the benefits of adding more Hz and/or sticking to strobing (as a workaround to avoid needing more fps & Hz).
Now, to requote a sentence (with some impromptu fixes):
TL;DR: You cannot predict what type of video games tomorrow's kids will be playing 10 or 100 years from now, where Hz can matter a great deal more. And even today, it's already starting to matter. Thus, your sentence has been 'fixed' to reflect this.

anothergol wrote:
If we all do agree that OLEDs/LCDs are still not there, that a CRT is still better for my own gaming, well there's no real argument anymore.
Also, see my Part 1 if you haven't read it just yet. I'm also re-quoting the Part 1 TL;DR for completeness' sake, with an added emphasis at the bottom (and a rough back-of-envelope sketch after the list):
Chief Blur Buster wrote:
Part 1 TL;DR:
- Your maximum eye-tracking speed definitely affects the ideal maximum refresh rate needed to jump past the point of diminishing returns. However, display angular resolution relative to your eyes (and the tracking time available) matters a great deal in determining the "perfect final-frontier refresh rate" where the diminishing returns disappear.
- Size/FOV/resolution matters a lot.
e.g. Smartphone -> small monitor -> big monitor -> IMAX or wall height TV -> narrow FOV VR -> wide FOV VR -> Holodeck.
- The bigger/sharper/wider the FOV, the more time there is for tracking over more pixels across the display plane, and the easier it is to notice imperfections such as blur (while tracking across the display plane) or stroboscopic stepping effects (while gazing at a fixed point on the display plane).
- You cannot control whether a human decides to track or gaze. Thus, doing only blending isn't always a fix-all.
- This all doesn't matter if you're just doing email, reading news, editing source code, or writing docs.
- Heck, this might not even matter for you, on the displays you play on, running the specific games (& gameplay tactics) you play today.
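And to put rough, purely illustrative numbers on that "final-frontier refresh rate" idea (my own sketch, not a measured threshold): on a non-strobed sample-and-hold display, persistence equals one refresh period, so keeping tracking blur down to roughly one pixel means the refresh rate has to roughly match your eye-tracking speed in pixels per second.

# Purely illustrative back-of-envelope, not a measured threshold
def hz_for_one_pixel_blur(track_speed_px_per_sec):
    # sample-and-hold: persistence = 1/Hz, and blur ~ persistence * speed
    return track_speed_px_per_sec  # Hz needed so blur ~= 1 pixel

print(hz_for_one_pixel_blur(1000))  # small/low-res screen, slower tracking: ~1,000 Hz
print(hz_for_one_pixel_blur(8000))  # big retina-class screen, fast tracking: ~8,000 Hz

The sharper and wider the display, the more pixels your eyes can sweep across per second -- which is exactly why the "final frontier" number climbs with size/FOV/resolution.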