[Overcoming LCD limitations] Big rant about LCD's & 120Hz BS


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by Chief Blur Buster » 01 Aug 2017, 12:23

(Part 2 of 2. See Part 1 if you missed it.)
anothergol wrote:I see smooth motion, no strobing, and at this speed I can still perfectly identify all detail.
Yup.

That's the virtue of CRT, and currently, the stroboscopic stepping effect isn't important for your particular games or use case -- including when playing 1980s games.

And that's the way those games were back then: a stroboscopic stepping effect appears if you keep a fixed gaze during a scrolling platformer or shoot-em-up. The stepping effect doesn't detract from these types of games or gameplay, and you haven't complained about it, so it's no problem.

Now, there's also another use case: people who want a single "compromise" display that prioritizes good coding/docs/email work but is also capable of high Hz or strobing for their particular gaming (e.g. FPS or emulators or whatever). Such people exist here.

And there's yet another use case: people who are much more bothered than you are by flicker, by stroboscopic stepping effects, or by motion blur. There are people around here who get strain from stroboscopic stepping effects, even at 120Hz or 240Hz. There are people who prefer motion blur over CRT flicker (being more flicker-sensitive than currently available CRT refresh rates can satisfy, even up to 120Hz). And there are people who get eyestrain from display motion blur itself (requiring a CRT, strobed LCD, other short-persistence display, or very-high-Hz LCD to get relief).

Also, a common cause of headaches/nausea on huge-FOV screens (sitting in the front row of a cinema, or wearing a VR headset) is the motion blur of long-duration frames. This was a big problem with non-impulsed VR headsets, which surround your entire FOV with display motion blur (even on OLED) -- that's why Galaxy smartphones now have a "GearVR Low Persistence Mode" to solve the problem: it flickers the OLED display like a CRT.

Anyway,

From this angle: What's the point?

More Hz isn't BS as your topic line claims.

More Hz certainly isn't BS for 100% of use cases for 100% of population. :)

Certainly, the 120Hz monitor you tried didn't help you a whole lot: its non-strobed mode doesn't improve blur much, and its strobed mode had too many disadvantages (and wasn't 60Hz/classics-friendly). That doesn't mean 120Hz or 1000Hz is BS.
anothergol wrote:I don't think that 100, 200 or 300Hz are really an improvement as for the result.
That usually doesn't matter when playing most past games, and many current games optimized for 60fps. It certainly doesn't matter for things like coding/email/Word/etc. It often doesn't matter even in many of today's FPS games when using only a midrange GPU. And it doesn't matter if you're happy with flicker/stepping effects (not everyone is).

But your statement isn't a blanket truth for all possible use cases (including many use cases that already exist today).

Remember, we're a (rapidly growing) niche website that's becoming more important for gaming-of-the-future issues.

You cannot claim your statement is blanket-true for everybody (even for just two-sigma or three-sigma of the world population, as many tests have shown), for every possible outlier use case. However, the outlier cases have become so common in VR that those workarounds (90Hz strobing) became necessary. Motion blur from refresh rate limitations is an increasingly common problem for future use cases.

Certainly, your statement is true for many current and past use cases.

But there are indeed already situations on my desktop where the 240Hz LCD monitor (TN disadvantages notwithstanding) provides plainly clear advantages in gaming, like reduced stroboscopic effect during fast turns while keeping a fixed gaze at the crosshairs in 300+fps CS:GO gameplay -- and I'm no eSports player myself at all -- it really is easier to aim that little bit faster. Even browser smooth scrolling (arrow keys) almost approaches CRT quality, with roughly 1/4th the motion blur of a 60Hz LCD; it's not as clear as a CRT, but hey -- I do code/browse/etc on my monitor, so I consider it a bonus side effect secondary to the gaming benefits. And on some strobed monitors I've tested, I get CRT-quality smooth scrolling of text in web browsers with no text ghosting/crosstalk. (Unfortunately, that's rare -- there's almost always strobe crosstalk on many displays.) Web-browser keyboard smooth-scrolling at CRT quality -- all the LCD text advantages with all the CRT advantages -- is found on only ~25% of the strobed monitors in the Official List of Best Gaming Monitors, and often there are color/brightness tradeoffs occurring at the same time. Ouch. (Hurry up, 240Hz OLEDs ;) )

Now, when wearing an Oculus Rift or HTC Vive, display refresh rate limitations become much more apparent if you turn off the low-persistence mode. The same goes for sitting in the front row of an IMAX screen: panning scenes in movies are very motion-blurry and nauseate many people; it's the same problem in VR. It's still a big problem at 100/200/300Hz whenever not strobing. Not YOUR "use case", I know. But motion blur also often causes headaches when playing on big-screen LCD displays on the desktop. CRTs rarely have this problem, because we never put 27" or 30" retina-quality CRTs on our desktops the way we can with 27" and 30" LCDs. Motion blur from the refresh rate limitations of a non-strobed display (whether LCD or OLED in non-strobe mode) is more problematic for big-FOV-coverage situations, like sitting in the front row of a movie theater. Sure, it might not bother one-sigma worth of the population, but most people are bothered by motion blur when sitting in the front row of a movie theater (both the blur built into the film and the blur caused by tracking panning scenery -- the two are always additive, never subtractive). Displays are getting bigger, people are gaming on bigger displays at higher resolutions, and in more kinds of use cases where this matters. In those cases, the "100/200/300Hz" limitations become visible to more people.
anothergol wrote:If we all do agree that OLEDs/LCDs are still not there, that a CRT is still better for my own gaming, well there's no real argument anymore.
Fixed it for you.

Gaming 20 years from now won't be the same as gaming today.

When you can get those $50 Apple-Oakley(tm) VR Sunglasses at your Apple Store (with a self-contained supercomputer discreetly inside the tiny frame), that might be the mainstream gaming of the 2040s/2050s, much like arcades were mainstream in the 1980s.

Or if you care about input lag in certain types of games, even as an amateur online FPS player. In first-person shooters, playing today on TN 144Hz or 240Hz monitors makes a real difference over 60Hz. Almost anything that real-time scans out at 240Hz (CRT, TN LCD, or OLED), despite their imperfections, can actually produce reaction-time advantages even over a 60Hz CRT by virtue of that. Simultaneous-draw situations (go around a corner, suddenly see each other, react at the same time, shoot each other at the same time) between champion players are like crossing an Olympic finish line: a few milliseconds can matter even if you can't feel the millisecond. The reaction-time spread of good players can be so tight that an advantage of a few milliseconds (from higher refresh rates) can put you ahead of your competitors. Also noteworthy: many of those players use a fixed gaze (staring at the crosshairs), so some of them don't care about eye-tracking motion blur (and are thus mostly immune to tracking-based motion blur on sample-and-hold displays, as long as GtG pixel response is fast). However, not everyone plays this way, obviously. That said, the billion-dollar eSports industry is rapidly growing under our noses, with million-follower and hundred-thousand-follower eSports players on Twitter, and full stadiums/venues now occurring (multi-thousand spectators attending in person, watching champion players play video games) in some of those 7-figure prize-pot leagues. In these leagues, the extra Hz matters a lot more than it does for the office worker. Already, in many countries including the USA (note: Blur Busters is located in Canada), there are more online viewers of eSports (via Twitch/etc) than of certain real sports (e.g. car racing, ski racing) via television/streaming, to the jawdrop of the previous generation...

Or if you need zero-lag augmented reality, where overlaid objects must not self-blur (becoming blurrier than the background they're superimposed on) during head turns, nor stroboscopically step, nor lag behind. NVIDIA found it truly worthwhile to go to roughly 16,000 Hz (video) to eliminate human-visible augmented-reality artifacts such as lag-behinds, the defocus effect (caused by motion blurring from limited Hz), and stroboscopic stepping (caused by non-preblended/blurred frames). There were AR sync imperfections even at 8,000 Hz... Lower Hz means more lag-behind when overlaying graphics on the real world, and/or more tracking motion blur, and/or more stroboscopic stepping. All three problems (lag-behind, blurring, and stroboscopic stepping) must be solved simultaneously for your eye's maximum accurate tracking speed. In this situation, quintuple-digit refresh rates were actually necessary, in these kinds of outlier edge cases. Many experiments at many laboratories (not just NVIDIA) tested many different refresh rates, many of them lower, and the imperfections progressively and visibly diminished, but still weren't always consistently gone at quadruple-digit Hz for all possible use cases. Even at 16,000 Hz there were some imperfections (e.g. tracking overshoots) as seen in the video, but that part was caused by other limitations (e.g. inaccuracy of tracking detection, etc). Knocking off all the weak links, obviously, is much more difficult at such insane Hz!

Everyone plays different kinds of games, and within a given game, people play differently (e.g. eye-tracking gameplay tactics versus stationary-gaze tactics). Tomorrow's games (or next decade's, or next century's) won't place the same demands on displays that today's do. Gaming will be at higher resolution and bigger FOV than yesterday's. Etc. And yes, imagine the imaginary theoretical Apple-Oakley teamup for VR sunglasses of the 2040s, or whatever becomes the mainstream method of gaming in the 2040s/2050s, as one theoretical example. None of us can predict which direction gaming will go.

Ten years ago, few of us expected smartphone gaming to exceed console/portable gaming. Who's to say VR sunglasses won't be popular at the Apple Store, Microsoft Store, or Best Buy/Amazon for our grandkids, or some other theoretical example? And even today, 240Hz already makes a huge difference to some of us on our desktops in certain games.

Today's smartphones are bigger than yesterday's portable game screens. (e.g. 6" phones versus 2" Gameboys)
Today's TV screens are bigger than yesterday's TV screens. (e.g. 50" HDTVs versus 27" CRT televisions)
Today's monitors are bigger than yesterday's monitors. (e.g. 27" LCD monitors versus 19" CRT monitors)

And of course we've got resolution progress (toward Retina), more FOV too (because displays are sharp enough to sit closer to), and immense Moore's-Law-style progress on virtual reality displays (Oculus, Vive, GearVR, etc).

Remember that these artifacts (stroboscopic/blurring effects) can make some people feel sick or even puke at large FOV (e.g. a front-row cinema seat, a GeForce Titan/Radeon Vega driving a 27-30" desktop monitor, or wearing a Rift/Vive). Big whoop on a 17" CRT playing MAME; annoying (and sometimes barf-mania for some) on bigger displays. More people now sit closer to TVs -- or use bigger TVs at the same viewing distance -- bigger FOV either way. While getting bigger, TVs have also gotten sharper (HDTV...), but the motion blur of LCD/OLED HDTVs has simultaneously forced many sports camera operators to track objects (e.g. balls, pucks) more accurately, to prevent nauseating motion blur on big-screen HDTVs -- some people get dizzy watching fast-scrolling scenery on big HDTVs (partly due to tracking motion blur).

Either way:

Even if we cannot predict tomorrow's gaming habits, or next century's, all of these factors affect the benefits of doing more Hz and/or of sticking to strobing (as a workaround to avoid needing more fps & Hz).

Now, to requote a sentence (with some impromptu fixes):
anothergol wrote:If we all do agree that OLEDs/LCDs are still not there, that a CRT is still better for my own gaming, well there's no real argument anymore.
TL;DR: You cannot predict what type of video games tomorrow's kids will be playing 10 or 100 years from now where Hz can matter a great deal more. ;) And even today, it's already starting to matter. Thus, your sentence has been 'fixed' to reflect this.

Also, see my Part 1 if you haven't read it just yet. I'm also re-quoting Part 1 TL;DR for completeness' sake, with an added emphasis at the bottom:
Chief Blur Buster wrote:Part 1 TL;DR:
  1. Your maximum eye-tracking speed definitely affects the ideal maximum refresh rate to jump past "diminishing points of returns". However, display angular resolution relative to your eyes (and tracking time available) matters a great deal in determining the "perfect final frontier refresh rate where diminishing points of returns disappear".
  2. Size/FOV/resolution matters a lot.
    e.g. Smartphone -> small monitor -> big monitor -> IMAX or wall height TV -> narrow FOV VR -> wide FOV VR -> Holodeck.
  3. The bigger/sharper/wider FOV, the more time for tracking over more pixels across the display plane, the easier it is to notice imperfections such as blur (tracking across display plane) or stroboscopic stepping effects (gaze in fixed point of display plane).
  4. You cannot control whether a human decides to track or gaze. Thus, doing only blending isn't always a fix-all.
  5. This all doesn't matter if you're just doing email, reading news, editing source code, or writing docs. ;)
  6. Heck, this might not even matter for you, on displays you play on, running specific games (& gameplay tactics) you play today
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by Sparky » 01 Aug 2017, 16:21

Speaking of stroboscopic stepping artifacts, the most annoying/distracting place I've seen them is in tail lights/brake lights/roadside advertising displays while I'm driving, and I'm pretty sure the pulse frequency on many of those is over 1 kHz. I think the learning point is that this will become an even bigger problem as max brightness/contrast ratios go up.


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by Chief Blur Buster » 01 Aug 2017, 16:46

Sparky wrote:Speaking of stroboscopic stepping artifacts, the most annoying/distracting place I've seen them is in tail lights/brake lights/roadside advertising displays while I'm driving, and I'm pretty sure the pulse frequency on many of those is over 1khz. I think the learning point there is that this will become an even bigger problem as max brightness/contrast ratio goes up.
Yes, I notice PWM phantom arrays from car brake lights up to around ~2 kHz or thereabouts -- the phantom array effect. It doesn't bother everyone, but it does bother a few. Some people don't pay attention, some notice it immediately, and some are bothered by it (even to the point of eye pain).

Indeed, with unusually bright white pinpoints moving at high speed and a very short duty cycle (e.g. 10 kHz with 99% OFF, 1% ON), the phantom arrays from 10 kHz flicker become noticeable to me.

Speaking of which, LCD backlights can create phantom arrays via a PWM-dimmed backlight.

[Image: phantom array effect from a PWM-dimmed LCD backlight]
(This can also occur in other use cases, like text scrolling or window dragging, in any situations where frame rate is not locked to strobe/flicker rate).

The good old-fashioned 30fps@60Hz CRT double-image effect is a close cousin of this phantom array effect; the same thing can occur at 144fps(framerate) @ 144Hz(refresh) @ 864Hz(PWM) = a motion artifact with 6 copies of the image.

If you look at http://www.testufo.com#count=3 on a CRT (or strobe-backlight LCD / pulsed rolling-scan OLED), you can watch this phantom array effect in action at any flicker rate. The half-framerate UFO has two image copies, and the quarter-framerate UFO has four. The multiple-image effects occur precisely because your eyes are in temporally different positions during each repeated refresh cycle (repeated flicker).

Doubling refresh rate will halve the distance between the image copies, and this phantom array effect is still visible on 240Hz strobed displays.

On non-strobed displays, it's simply a continuous blur (longer blur trail for lower framerates on high-Hz displays) instead of point-sampled copies of images.

When frame rates run low, judder/stutter/flicker becomes visible; but when frame rates run high enough, it all blends into blur, or into multiple image copies (if a frame is strobed multiple times).

For example:

15fps on 60Hz CRT = 4 copies in phantom array effect
20fps on 60Hz CRT = 3 copies in phantom array effect
30fps on 60Hz CRT = 2 copies in phantom array effect
60fps on 60Hz CRT = single copy, full clarity

18fps on 75Hz CRT = 4 copies in phantom array effect
25fps on 75Hz CRT = 3 copies in phantom array effect
37fps on 75Hz CRT = 2 copies in phantom array effect
75fps on 75Hz CRT = single copy, full clarity

30fps @ 120Hz strobed LCD = 4 copies in phantom array effect
40fps @ 120Hz strobed LCD = 3 copies in phantom array effect
60fps @ 120Hz strobed LCD = 2 copies in phantom array effect
120fps @ 120Hz strobed LCD = single copy, full clarity
(excluding additional faint or strong copies caused by strobe crosstalk from LCD GtG imperfections)

60fps @ 60Hz(sample-and-hold LCD) @ 360Hz PWM(backlight) = 6 copies in phantom array effect
60fps @ 60Hz(sample-and-hold LCD) @ 720Hz PWM(backlight) = 12 copies in phantom array effect

144fps @ 144Hz(sample-and-hold LCD) @ 432Hz PWM(backlight) = 3 copies in phantom array effect
144fps @ 144Hz(sample-and-hold LCD) @ 864Hz PWM(backlight) = 6 copies in phantom array effect

72fps @ 144Hz(sample-and-hold LCD) @ 432Hz PWM(backlight) = 6 copies in phantom array effect
72fps @ 144Hz(sample-and-hold LCD) @ 864Hz PWM(backlight) = 12 copies in phantom array effect
(There may be an additional strong or faint copy or few, from slower GtG that bleeds between refresh cycles.)
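Every entry in the lists above falls out of a single ratio: light pulses per second divided by unique frames per second. Here's a minimal Python sketch of that arithmetic (the function name is my own invention, not from any display API):

```python
def phantom_copies(pulse_hz: float, unique_fps: float) -> float:
    """Number of image copies seen while eye-tracking a flickering display.

    Each unique frame gets flashed pulse_hz / unique_fps times, and each
    flash lands on a different spot of the moving retina."""
    return pulse_hz / unique_fps

# CRT: the pulse rate is the refresh rate itself
assert phantom_copies(60, 30) == 2     # 30fps @ 60Hz CRT
assert phantom_copies(60, 15) == 4     # 15fps @ 60Hz CRT

# Strobed LCD: the pulse rate is the strobe rate
assert phantom_copies(120, 40) == 3    # 40fps @ 120Hz strobed

# PWM-dimmed sample-and-hold LCD: the pulse rate is the PWM rate
assert phantom_copies(432, 144) == 3   # 144fps @ 144Hz @ 432Hz PWM
assert phantom_copies(864, 72) == 12   # 72fps @ 144Hz @ 864Hz PWM
```

The same call reproduces every row of the lists above; strobe crosstalk adds the extra faint copies that this idealized ratio ignores.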

The PWM stroboscopic stepping effects above are all noticeable at typical TestUFO speeds (960 pixels/second and 1440 pixels/second). To avoid stroboscopic stepping on impulse-driven displays, you must have accurately-positioned refresh cycles (brand-new frames) at all eye positions. That's why 120fps@240Hz CRT looks worse than 60fps@60Hz. You must keep frame rate matched to flicker rate to avoid the multiple-image effect.

Blur-reduction strobe backlights can mathematically be thought of as "precision VSYNC-synchronized PWM" capable of one flash per refresh cycle: the recipe necessary for the CRT motion-clarity experience (framerate == refreshrate == stroberate) on low-strobe-crosstalk LCDs.

On flickering displays (whether (A) running a frame rate below the strobed refresh rate, or (B) flashing the backlight multiple times per refresh cycle -- either by PWM dimming or by a double-strobe feature), the number of image copies is always:

duplicate images = (strobes per second) / (unique frames shown per second)

So CRT 30fps@60Hz is always 2 copies for the half-frame-rate UFO at http://www.testufo.com .... But the formula applies to everything else, too. Doubling the unique frame rate (assuming enough refresh rate to show unique frames) will halve the distance between the copies for a specific tracking speed at a specific pixel density.

distance between duplicates = (pixels per second) / (unique frames shown per second)

This applies to CRT too, whether 30fps @ 60Hz, or 50fps @ 100Hz, or 25fps @ 75Hz

And that's also precisely why anything 60fps (emulators, games, whatnot) never looks good on most 120Hz strobed displays (those without a proper 60Hz strobed mode).

flicker rate of duplicates = (flicker rate) / (number of duplicates)

So for CRT 30fps@60Hz, each duplicate image flickers at 30Hz. For CRT 20fps@60Hz, there are 3 copies, each flickering at 20Hz. For CRT 15fps@60Hz, 4 copies, each flickering at 15Hz. CRT users can see this effect at http://www.testufo.com#count=3 (alas, I should have added a 20fps UFO...). The lower the frame rate, the more visibly each duplicate image flickers.
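The two formulas can be combined into a tiny calculator. A sketch in Python (function names are mine; the numbers match the 30fps@60Hz CRT case at the default TestUFO speed of 960 pixels/second):

```python
def duplicate_spacing_px(pixels_per_sec: float, unique_fps: float) -> float:
    """Distance between image copies for a given eye-tracking speed."""
    return pixels_per_sec / unique_fps

def duplicate_flicker_hz(flicker_hz: float, num_duplicates: float) -> float:
    """How fast each individual image copy flickers."""
    return flicker_hz / num_duplicates

# 30fps @ 60Hz CRT, tracking at 960 pixels/second:
assert duplicate_spacing_px(960, 30) == 32.0  # copies spaced 32 px apart
assert duplicate_flicker_hz(60, 2) == 30.0    # each copy flickers at 30 Hz

# Doubling the unique frame rate halves the spacing:
assert duplicate_spacing_px(960, 60) == 16.0
```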

Now, it's possible for the flicker to go above the flicker fusion threshold. At 144fps with 864Hz PWM dimming, the six copies won't visibly flicker, because both 144Hz and 864Hz are above my flicker fusion threshold; but the six copies remain as a PWM-dimming side effect. (And if you stop PWM and shine continuously instead -- sample-and-hold OLED/LCD -- it's simply tracking-based motion blur, filling in the gaps between the duplicate images caused by multiple strobes per frame.) To avoid the motion-blurring effect on sample-and-hold displays, you must have sharp point-sampled frames at all analog eye-tracking positions; doing so without using black periods (strobing/impulsing/CRT) requires extreme frame rates at extreme refresh rates.
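A rough way to quantify the sample-and-hold blur just described: the blur-trail width is the tracking speed multiplied by how long each frame stays lit (its persistence). A hedged Python sketch of that standard approximation (not an exact psychophysical model):

```python
def blur_trail_px(pixels_per_sec: float, persistence_ms: float) -> float:
    """Approximate motion blur width while eye-tracking a display
    whose frames each stay lit for persistence_ms."""
    return pixels_per_sec * (persistence_ms / 1000.0)

# Full-persistence 60Hz (each frame lit the entire ~16.7 ms refresh)
# at the TestUFO speed of 960 pixels/second:
assert round(blur_trail_px(960, 1000 / 60)) == 16   # ~16 px blur trail

# A 1 ms strobe flash at the same tracking speed:
assert blur_trail_px(960, 1.0) == 0.96              # under 1 px
```

This is why either short strobes or extreme frame rates (at matching refresh rates) are the only two routes to CRT-like clarity: both shrink the time each frame spends smearing across the tracking eye.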

Fortunately, LCDs that use PWM dimming are getting less common. Apple avoids it completely now, and thankfully most manufacturers do too.

Monitor manufacturers have adopted FlickerFree, ZeroFlicker, EyeCare, and other lingo to denote a PWM-free backlight (one that doesn't use PWM for dimming).

At the other extreme, PWM is intentionally used creatively to generate images, like the flip-flopping mirrors of a DLP projector: humans perceive different pulse duty cycles as different shades of grey, because the eye averages temporally. Spinning LED reels that generate images -- like LED bike wheels -- are a great scientific exercise in how high PWM frequencies generate images: they use a few LEDs pulsed at thousands of Hertz while moving very fast, creating intentional phantom arrays that act as a screen.

Anyway, at the very base level, many CRT users understand that 30fps@60Hz creates two copies, whether at http://www.testufo.com or http://www.testufo.com/framerates-marquee or http://www.testufo.com/framerates-text ... This occurs at any frame rate below the flicker rate.

The flicker might rise above the flicker fusion threshold, but the temporally-offset image copies (caused by tracking) remain. (e.g. double images still remain at 120fps@240Hz strobed on the Zowie XL2540/XL2546 DyAc strobe backlight, one of the very few 240Hz strobe backlights in the world. You really want 240fps@240Hz to reduce this effect to the minimum possible -- only strobe crosstalk from GtG limitations -- rather than repeating strobed refresh cycles.)

But only a few people truly understand the mathematics of this. People like me, NVIDIA, scientists, researchers, etc. truly understand how PWM affects vision -- especially from a display perspective -- how PWM science can be creatively taken advantage of in other situations (e.g. DLP per-pixel mirror PWM, bike-wheel PWM via a spinning LED rod, etc), and how duplicates (from PWM) and motion blur (from non-PWM/non-strobe/non-impulse operation) are created by refresh rate limitations.

Now, in the real world, phantom array effects often appear with flickering light sources -- like shaking an old LED alarm clock around (one that uses old PWM logic, illuminating one LED segment at a time) -- or, of course, LED taillights on the freeway after dark, whenever you flit your eyes around. It doesn't bother everybody, but it annoys some, and for a few it creates actual eye pain/discomfort (NOT from flicker, since it's above the flicker fusion threshold, but from the phantom array effect). Every person is different; we don't all see/hear/notice the same way.

The DLP rainbow effect is a form of the phantom array effect: some people are very sensitive to it, where flitting your eyes around quickly on a DLP projector screen (typically single-chip models used for home/office, rather than high-end 3-chip ones) reveals temporal color-separation effects -- very easy to see in high-contrast scenery, like small bright objects in night scenes of movies.



Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by anothergol » 02 Aug 2017, 14:05

Chief Blur Buster wrote: But there are indeed already situations on my desktop where the 240Hz LCD monitor (TN disadvantages notwithstanding) provides plainly clear advantages in gaming. Like reduced stroboscopic effect during fast turning with a fixed gaze at the crosshairs during 300+fps CS:GO gameplay, and I'm no eSports player myself at all -- it really is easier to aim that little bit faster.

Yeah, thanks for going deep into the details on this (even though I've already acknowledged that I now understand the tracking problem).

I had high hopes for OLED, though, thinking that my next monitor, if not an LCD, would be an OLED. Well, I guess I'll have to stick with my CRT a little longer.
But really, if an OLED ever comes out with proper 70Hz strobing and does as much as my CRT, I'll be more than happy.
Because really, for the few hours I played FPSes on the CFG70 (and I play this kind of game a lot), I didn't see the difference from my CRT, or if I did, it was worse. Sometimes that probably had to do with it skipping frames, meaning I wasn't at 120FPS anymore, even though the FPS I play the most isn't that demanding and I normally had a constant 120FPS most of the time.
In any case, it made my playing a little worse; it didn't improve it at all. This said, I'm 40 and I don't have great reflexes anymore, so OK, -maybe- there are people out there for whom it's an improvement. Well..


As for what gaming will be in 20 years, well, I'm pretty convinced the same types of games will still be there. People will still play the same kinds of FPSes, only in 8K or more, and yes, most likely at 240Hz or more (whether it makes sense or not doesn't even matter anyway; HDR is already at the door and it doesn't make sense either -- well, unless it's just another name for high-precision color depth, which is ironic considering many monitors are still 6-bit).
I can't speak about VR because, while I'm sure it will eventually be everywhere, my short experience with it disgusted me. And it wasn't related to motion sickness or whatever, but to the shitty lenses (of the Gear VR -- at least the one I own). Mine only has clarity pretty much in the dead center, and all the rest is a color-shifted blurry mess.

As for holodeck rooms, well, let's be real: they will never end up in any home, for obvious reasons. But hey, I'm pretty sure that, done right, they will resurrect arcade centers.


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by RealNC » 03 Aug 2017, 01:15

Have you tried 80Hz? 85Hz?

That was my minimum even when I had a CRT. This hasn't changed with LCD.
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.


Re: Big rant about LCD's & 120Hz BS

Post by 3dfan » 06 Aug 2017, 06:15

For 60Hz, the biggest problem is manufacturers have artificially disallowed low flicker rates on strobed LCDs.
(flicker discomfort, epilepsy liability concerns, user complaints, etc).

Please don't take my comments personally, and I don't mean to be a self-satisfied person or something like that, but there is no need to force this on users, and I hope manufacturers reconsider and allow users to use 60Hz on all strobed monitors, at least with a warning, or via custom resolution utilities. (For example, a CRT monitor like the FW900 won't display any refresh rate below 80Hz by default, but fortunately you can create 60Hz-based resolutions using custom resolution utilities or the graphics card control panel.)
But please, manufacturers, don't disable that possibility permanently; not all people are necessarily sensitive to 60Hz flicker.

I have played a lot of 60fps-limited engine games (some indie games, fighting games, emulators, and games my GPU cannot constantly keep above 60fps) on a CRT monitor, and I much prefer to play a constant or capped 60fps game at 60Hz rather than 60fps at 70, 100, or more Hz, since the latter gives an ugly stuttery feeling. And I have had no eye strain or health issues at 60Hz over very long years of use, and still don't.

I believe disabling the possibility of 60Hz on a modern strobed monitor (single strobe, not that double-strobe double-image crap) destroys a big part of the purpose and the dream, finally realized after so long a wait, of having CRT quality on current monitors, which many of us have been waiting for.


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by thatoneguy » 10 Aug 2017, 21:21

Chief Blur Buster wrote:

flicker rate of duplicates = (flicker rate) / (number of duplicates)

So for CRT 30fps@60Hz, each duplicate image flickers at 30Hz. For CRT 20fps@60Hz, 3 copies with each duplicate image flickering at 20Hz. For CRT 15fps@60Hz, 4 copies with each duplicate image flickering at 15Hz. CRT users can see this effect in http://www.testufo.com#count=3 (Alas, I should have added a 20fps UFO...). The lower the refresh rate, the more "flicker" each duplicate image.
I still stick by my theory that if we can achieve a lower image persistence than the human eye can perceive (say, for example, 1 nanosecond), the multiple-image effect will be eliminated at any refresh/framerate combo.
To achieve that kind of persistence at a good resolution, we would need really high refresh rates (2000Hz+ at minimum) to combat brightness loss.
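That brightness-loss argument can be made concrete: a strobed display is lit for (persistence × refresh rate) of the time, so holding average brightness constant requires peak output scaled by the inverse of that duty cycle. A hypothetical Python sketch (illustrative numbers, not measurements of any real panel):

```python
def duty_cycle(persistence_s: float, refresh_hz: float) -> float:
    """Fraction of the time the display is emitting light."""
    return persistence_s * refresh_hz

def peak_brightness_multiplier(persistence_s: float, refresh_hz: float) -> float:
    """Peak output needed to keep the same average brightness."""
    return 1.0 / duty_cycle(persistence_s, refresh_hz)

# 1 ms persistence at 120 Hz: lit 12% of the time, needs ~8.3x peak output
assert abs(duty_cycle(1e-3, 120) - 0.12) < 1e-9
assert abs(peak_brightness_multiplier(1e-3, 120) - 8.333) < 0.01

# 0.1 ms persistence at 240 Hz: a brutal ~42x peak requirement...
assert abs(peak_brightness_multiplier(1e-4, 240) - 41.667) < 0.01
# ...but at 2000 Hz the flashes repeat often enough that
# only 5x peak output is needed for the same 0.1 ms persistence:
assert abs(peak_brightness_multiplier(1e-4, 2000) - 5.0) < 1e-9
```

(Of course, repeating the flash at 2000 Hz without 2000 unique frames per second just re-creates the multiple-image effect discussed earlier in the thread; hence the need for the frame rate to keep up.)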


Re: Big rant about LCD's & 120Hz BS

Post by RealNC » 11 Aug 2017, 07:03

3dfan wrote:i have played a lot of limited 60fps engine games such some indie games, fighting games, emulators, and some games that my gpu cannot constantly handle above 60fps on crt monitor and i much prefer to play a constant or limited 60fps game at 60hz rather than 60fps at
LCD 60Hz is much, much worse than CRT 60Hz. I was OK with 60Hz on CRT. On LCD: No. Freakin'. Way. After 2 minutes at most I have to shut it down, stand up, and go for some fresh air.


Re: Big rant about LCD's & 120Hz BS

Post by Sparky » 11 Aug 2017, 07:42

RealNC wrote: LCD 60Hz is much, much worse than CRT 60Hz. I was OK with 60Hz on CRT. On LCD: No. Freakin'. Way. After 2 minutes at most I have to shut it down, stand up, and go for some fresh air.
For me, 85Hz is about the minimum refresh rate I can tolerate on a low-persistence display. Anything below that is too flickery.


Re: [Overcoming LCD limitations] Big rant about LCD's & 120H

Post by Haste » 11 Aug 2017, 07:47

I'm even bothered by ULMB at 120Hz :/

Kinda sucks to be sensitive to flickering.
Monitor: Gigabyte M27Q X
