Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Why do [some] 240Hz monitors have more lag than 144Hz AHVA

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Postby Chief Blur Buster » 09 May 2018, 23:57

Nonetheless, for many, minimum possible lag matters.

It may also be muscle memory: after being used to a certain mouse lag for decades, optimizing the heck out of it can throw your aim off.

We're talking about only 1ms, 2ms, 3ms here, but at an 8000 pixels/second 180-degree flick, a 3ms latency change translates to a 24-pixel overshoot/undershoot! (8000 × 0.003) = 24
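That overshoot arithmetic is just flick speed times latency change; a minimal sketch (the function name is mine):

```python
# Overshoot caused by a latency change during a fast flick:
# error_px = flick_speed_px_per_s * latency_delta_s
def flick_error_px(flick_speed_px_per_s: float, latency_delta_ms: float) -> float:
    """Pixels of overshoot/undershoot introduced by a latency change."""
    return flick_speed_px_per_s * (latency_delta_ms / 1000.0)

# An 8000 px/s 180-degree flick with a 3 ms latency change:
print(flick_error_px(8000, 3))  # 24.0 pixels
```

Even a single millisecond at that flick speed is 8 pixels of error, which is why muscle memory notices.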

So if you've calibrated your 180-degree flick to a specific mouse lag, even a single millisecond of change will throw off your muscle-memory-calibrated 180-degree flick when you switch computers -- say, from your home computer to a tournament/eSports setup. Then you need a few hours to practice on a different latency chain, which might be slightly lower or higher.

Sometimes it takes a few days to re-train yourself to the new lag differential of your new setup. Or you might prefer to change a single factor (e.g. 1000 Hz -> 500Hz) which will rebalance things muscle-memory-wise.

Sometimes the humble millisecond matters in surprising ways. Lower lag is normally better, but sometimes the pesky millisecond gets you for a different reason -- like simply being pre-trained to a specific latency chain.

Note -- more study is welcome. I'm also recruiting writers who have published peer-reviewed research to do some basic research on various input-lag matters. Blur Busters commissioned the article Input Lag and the Limits of Human Reflex.
Head of Blur Busters | Follow @BlurBusters on Twitter!
To support Blur Busters: Official List of Best Gaming Monitors | G-SYNC | FreeSync | Ultrawide
Chief Blur Buster
Site Admin
Posts: 4863
Joined: 05 Dec 2013, 15:44

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Postby user9581320004096748 » 19 May 2018, 06:13

The XL2411P seems to be the only reasonable option from BenQ's recent lineup when considering absolute lag. The signal lag seems higher on all other models, even affecting the XL2536 (144Hz). I'd advise choosing a 240Hz monitor only for emulation and low-frame-rate gaming, where it improves latency with respect to buffer occupancy and stutter. Old monitors scored well on playerwares; the XL2430T and XL2720Z are very low-latency 144Hz options which scored well at that site (~1.3ms).

After "upgrading" to the XL2540 from the XL2420TE, the only thing that amazed me was the very low-latency strobing compared to LightBoost.

The HDMI input also accepts a 240p frame, and with sharpness at 10 it provides an amazingly sharp image, all things considered. The XL2420TE only accepts a 383p frame. The XL2540 is also the first 24-inch monitor from BenQ that offers 5:4 scaling, usually only offered on 27-inch models. The build quality seems lower, as the backlight "blinks" if the chassis is knocked, leading me to believe the engineers are no longer focusing on build quality. The XL2420TE is a tank.

Overall I regret purchasing the monitor, since I essentially only use it as a limited-function FPGA for low-framerate or stuttery applications.

I will probably not purchase another BenQ monitor around 2021-22 for my next upgrade, and will probably be looking at the lowest-latency offerings at 160Hz that offer much more flexibility, image quality, general-purpose use, build quality, and reliability. It will probably be my last gamer-focused monitor purchase, since LCD tech is not going to improve much from now on, and has struggled ever since the introduction of 5ms TN panels with no overdrive...
Posts: 3
Joined: 15 Mar 2018, 19:51

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Postby Chief Blur Buster » 22 May 2018, 13:23

user9581320004096748 wrote:I will probably not purchase another BenQ monitor around 2021-22 for my next upgrade, and will probably be looking at the lowest-latency offerings at 160Hz that offer much more flexibility, image quality, general-purpose use, build quality, and reliability. It will probably be my last gamer-focused monitor purchase, since LCD tech is not going to improve much from now on, and has struggled ever since the introduction of 5ms TN panels with no overdrive...

I visited CES 2018 and saw literally over 500 LCD displays, including some of the best LCDs I have ever seen -- and they were not gaming-oriented.

Actually, from what I saw of local-dimmed televisions, there are amazingly colorful LCDs that are simultaneously fast.

Local dimming also enables scanning-backlight operation, which -- if done very well, with minimal internal backlight diffusion -- can provide a have-cake-and-eat-it-too effect of blur reduction with great OLED-style colors.

The problem is that we've kept getting bottom-barrel panels for so long. The LCDs that look 10x better often cost 10x as much, and we're finally getting them for the first time in history through the 4K 144Hz GSYNC HDR monitors with 384-zone local dimming. Videophile LCDs in gaming-market monitor formats have never happened before, and this is one of the first.

The image quality of these is literally an order of magnitude better, from what I saw at CES 2018. I was only able to briefly see the ASUS PG27UQ (prototype) at the ASUS showroom during a media-invitation-only event, and I liked what I saw. I was not able to enable ULMB (which exists), but it literally looked almost OLED-like compared to any LCD gaming monitor. I wish I had more time with it, but it's an amazing achievement, with roughly 100x the firmware engineering put into this monitor compared to others.

Blur Busters reads scientific journals and monitor engineering documents, and has stared at LCDs that look 10x better than our desktop monitors already, so I disagree with you -- LCD improves massively with a bit more engineering put into it.

To make LCD jump an order of magnitude better than today's LCD monitors, you literally need the following:
  • Extreme brightness (1000+ nits, preferably 10,000+ nits like Sony's) to compensate for strobe dimness
  • Hundreds of zones for local dimming (384-zone minimum)
  • Antiblooming algorithm via inserting LCD gradients into the video signal (it takes an amazing amount of computing power)
  • Huge number of bits (10-bit or 12-bit) to prevent banding, especially from the antiblooming gradient algorithm
  • High dynamic range. Imagine inky black backgrounds with ultra-bright pinpricks of stars, or ultra-bright streetlamps.
  • Ultra-wide gamut.
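On the bit-depth bullet: even a deliberately naive linear ramp shows why 8 bits band badly over an HDR brightness range (real HDR signals use a perceptual transfer function such as SMPTE ST 2084 "PQ", which spends codes far more efficiently -- this is only to show scale):

```python
# Naive illustration: nits per code step for a LINEAR ramp from 0 to peak_nits.
# Real HDR uses a perceptual curve (PQ), so treat these numbers as a rough scale.
def nits_per_step(peak_nits: float, bits: int) -> float:
    return peak_nits / (2 ** bits - 1)

print(round(nits_per_step(10000, 8), 2))   # 39.22 nits between adjacent codes -> visible banding
print(round(nits_per_step(10000, 10), 2))  # 9.78
print(round(nits_per_step(10000, 12), 2))  # 2.44
```

The antiblooming gradients make this worse, since they add smooth shallow ramps that expose every coarse code step.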

This costs 10x as much -- like paying $1999 for a gaming monitor. These features were common in high-end videophile LCD HDTVs, but are only finally coming to certain monitors.

I've seen some televisions at CES 2018 that I thought were OLED (ultra-bright flowers on completely total-black backgrounds, no blooming), and when I read the sign next to it....LCD. WOW.

For those of you with a computer programmer's understanding of GPU shader programming / FPGAs / ASICs.... some amazing compute power is being put into solving the problem. Some amazing engineering goes into the antibloom algorithms; I wrote about this in an ArsTechnica comment, which I will quote here for your reference.

mdrejhon from ArsTechnica wrote:
ScifiGeek wrote:But I have ZERO desire to have >1000 nit displays. My 6-year-old LCD only has a max of around 400 nits, and I have the backlight turned almost to minimum, since I like to watch movies in the dark, and don't want the glare of the full sunlit outdoors shining on me when I am watching TV in a dark room.

I wondered the same too, but after seeing HDR in action, I now understand that even 5,000 nits doesn't do that:

What happens with HDR content is that it judiciously uses that brightness only for things like pinpoints of light (e.g. streetlamps and neon signs in night scenes). Instead of that material being overexposed, it retains its full original color.

That way, even the prototype 10,000-nit TV that I saw at CES 2018 surprisingly wasn't too bright on the eyes, as it simply focused the intensity only on highlights like sunglints off a car, or the nuances of a neon sign in a night scene.

Those pinpoints of high brightness were careful not to jack up the average picture level to uncomfortable levels in the dark. That way, HDR was totally amazing and wasn't hurting my eyes.

Your nickname is SciFiGeek. Imagine brighter stars and planet dots -- single-pixel high brightnesses on a deep black background. Totally amazing-looking space scenes. Star pinpoints brighter than you thought a display would be capable of, but not eye-hurting. Or those night scenes in Pixar's Coco. Yet the daytime scenes aren't brighter at all (just a normal 50-nit or 150-nit or whatever preset comfy average-picture-level max), except that the sunglints off sunglasses or a car are so much more realistic on the >2000-nit HDR displays I visited at CES 2018.

As the bright pinpoints increase in surface size, the nits-per-pixel falls dramatically, in a judicious way, until it all balances out. It was apparent on both LCD and OLED: one edge of the screen had perfect blackness while the other edge had really good nits on the glints, so the local-dimming control in some 2018 panels was actually more impressive than I thought. Despite the claim that "human eyes can only see 100:1 dynamic range", it was very clearly apparent that the HDR range was beneficial when the peaking was in the quad-digit range, without any overbearing APL.

You need to check out true HDR sometime (with careful >1000-nit surgical-use-only peaking); it's totally different from what you thought. Most HDR displays don't milk HDR impressively or properly, but once it is done well and properly calibrated for comfort, it is rather impressive.

When I saw over 100 different HDR displays at CES 2018, I realized what HDR really was. It's important to think of the above contexts, and of how HDR actually enhances things...

It's true that we have to spend quite a lot (at least two thousand dollars) to get a massively better LCD that literally looks 10x better than IPS LCD, or about 100x better than non-GSYNC TN.

Currently, not even $500 or $700 for a monitor pays for the engineering (hardware engineering and software engineering) necessary for a 10x-better LCD -- one that already exists in certain videophile HDTVs.

Tons of improvements in LCD are available, but they are extremely difficult to bring to a $300-or-less monitor.

I'm a software developer who also worked in the home theater industry before, including a few firmwares for Runco scalers and other brands of high-end videophile equipment. As a fellow firmware programmer who pays attention to what goes on in the monitor industry, it is my duty to inform you there is a two-to-three order-of-magnitude difference in software-engineering billable human-hours between an entry-level 144Hz computer monitor using an off-the-shelf TN panel and off-the-shelf scaler, and a delicately engineered videophile LCD HDTV costing $5K-$20K.

We never got this much engineering in a modern LCD-panel gaming monitor until the upcoming $1999 4K 144Hz GSYNC HDR monitors, which, while a leap upwards in quality, still don't match the number of hours put into certain videophile LCD HDTVs (with super-intelligent antibloom algorithms), judging from the engineering effort I've seen manufacturers put in.

But it comes miraculously close; no display in the computer monitor industry has ever received that much engineering time. Local-dimmed quantum-dot LCDs are in a totally different ballpark, much closer to OLED than to ordinary LCD to my eyes. I saw a few amazing locally-dimmed LCD HDTVs at CES 2018, and yes, I did see many OLED HDTVs, and yes, I do confirm that the quality Venn diagram of OLED and LCD actually overlaps, with full blacks adjacent to superbrights. Some LCDs that I have seen are THAT much of a leap above any 144Hz monitor currently sitting on your desk. (Yes, RealNC, even your ViewSonic XG2703-GS IPS 165Hz GSYNC pales compared to some of the new OLED-quality LCDs that I have now feasted my eyes upon.)

Blur Busters does tip a few dominoes from time to time. ULMB ultimately exists because of Blur Busters making LightBoost popular several years ago -- literally, we screamed from the rooftops about the virtues of strobe backlights, and the strobe feature ended up becoming much more popular for blur reduction than for 3D glasses (its original purpose). Blur Busters' LightBoost advocacy directly led to industry-wide adoption of motion blur reduction, as monitor manufacturers wanted to copy LightBoost's capabilities.

There is so much more we can do with LCDs over the long term -- they can even become superior to OLED in some respects, since thousand-zone-league LED local dimming has far less brightness limitation, being >10x brighter than OLED (very delicious for bright HDR strobing ability). It's just that some of these are rather expensive engineering steps, even if easier in some respects than similarly expensive OLED engineering.

One thing is for sure: today's LCDs are more than 20x better than LCDs twenty years ago.

Problem is, the big jump beyond today's "struggle plateau" (2005-2015) is a big cost jump, to reach the "1000-zone local dimming + 12-bit + HDR + antibloom system" of the OLED-matching LCDs that I saw at CES 2018.

Or, that 10K-nit Sony LCD. Simply stunning and vastly superior to OLED.

Yep, Chief Blur Buster says this: the best displays I saw at CES 2018 were .... LCD.

Shocked that I would say this?

At CES 2018, none of the best prototype OLEDs impressed me nearly as much as the Sony 10K-nit 8K LCD, with inky blacks and ultra-dazzling sunglint reflections off leaves, polished cars, and ultrabright streetlamps in fully detailed dark scenes. Zero blooming. None, nada, zilch, zip -- just like OLED. It didn't hurt my eyes either; see my quoted posts above.

Obviously, the Sony wasn't using its 10K-nit headroom for ULMB, but many of us recognize that potential: the great headroom of 10K nits allows 90%-blur-reduction strobing while still delivering 1000-nit HDR. Imagine that. And LED backlights can become brighter than OLED pixels themselves. I want OLED to win. I really do.

But the LCD horse will be around for decades to come (mark my words) simultaneously and concurrently with OLED.

The fallacious assumption that OLED will replace LCD is simply rooted in the fact that we've gotten bottom-barrel LCDs (even a $700 monitor with amazing GSYNC/ULMB and amazing top-notch NVIDIA overdrive tuning is still simply polishing a turd of a panel with no local dimming or HDR).

Simply put, $2000 of engineering premium put into an OLED versus $2000 of engineering premium put into an LCD ... actually produces a 10x better-quality LCD versus a 3x better-quality OLED in many respects.

Sometimes some horses will pull ahead, and other horses may fall behind. But laws of physics are bottlenecking OLED in ways that are not bottlenecking LCD as much (e.g. ultra-bright local dimming). Backlights are simply LEDs, and today's LEDs light up the Super Bowl football stadium. Modern LEDs have gotten incredibly bright, with almost no upper limit.

See what I am getting at? LEDs light up stadiums! Brightness headroom is in LCD's favour. The laws of physics stacked against OLED pixel brightness are hair-pullingly frustrating for motion blur reduction.

You need lots of overkill brightness to lower persistence while keeping HDR (unless you're doing blurless sample-and-hold).
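That overkill-brightness requirement is simple duty-cycle arithmetic, assuming an ideal square strobe pulse: cutting persistence to 10% of the frame leaves the backlight dark 90% of the time, so holding the same perceived nits takes roughly 10x the instantaneous brightness.

```python
def required_peak_nits(target_perceived_nits: float,
                       frame_ms: float, pulse_ms: float) -> float:
    """Peak backlight brightness needed so a short strobe pulse still
    averages out to the target perceived brightness (ideal square pulse)."""
    duty = pulse_ms / frame_ms
    return target_perceived_nits / duty

# 90% blur reduction at 100 Hz: 10 ms frame, 1 ms visible pulse.
# Sustaining 1000-nit HDR then needs a 10,000-nit backlight:
print(required_peak_nits(1000, 10.0, 1.0))  # 10000.0
```

That is exactly why the 10K-nit headroom of that Sony prototype is so delicious for strobed HDR.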

You just need to throw a lot of sugar at it (high bit depth, many zones of local dimming, artificial-intelligence antiblooming, etc.) to make LCD look as good as OLED (inky complete blacks next to dazzling brights), like what I've already seen on the showroom floor and at closed-door media invitations at CES 2018.

Certainly, buy the best OLED HDTV and the best LCD HDTV, and they come close in many ways, with some wins for LCD and some wins for OLED. But current LCD is winner-winner-chicken-dinner from what I saw at CES 2018.

My prediction is that both techs (LCD, OLED) will still be important 10 and 20 years from now. There's lots of improvement headroom for both. Computer monitors have simply been on a plateau (2005-2015), while I've already seen OLED-quality LCDs at CES 2018.

mdrejhon from ArsTechnica wrote:
ScifiGeek wrote:I can certainly imagine the blooming that it would create on an LCD. Really you don't need 10000 nit stars.

They've improved a lot.

They do an amazing job with antiblooming algorithms on some 2018 panels. They actually render realtime LCD gradients on the panels to antibloom, and judiciously avoid quick peaking curves (e.g. a 2000-nit HDR QLED may never show stars at more than 500 nits, but a few pixels of sun glints on water may be allowed to peak at 2000 nits because the surrounding pixels are already at 100 nits). As a programmer, I can tell you some amazing FPGA / assembly / GPU-shader shenanigans are being done in some televisions: realtime HDR processing that does antibloom compensation by shading LCD panel brightness gradients inversely proportional to predicted bloom, to cancel out the blooming. This also intelligently avoids sharp brightness transitions in tight spaces (e.g. 2000-nit pixels adjacent to 0-nit pixels), to avoid noticeable blooming. I'm surprised by what some TV engineers already do in their programming.
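The inverse-gradient idea can be sketched in one dimension: predict how much a bright backlight zone blooms into its neighbours, then darken the LCD pixels by the inverse so that backlight × LCD transmission lands back on the target image. This is a toy model with made-up leakage numbers, not any vendor's actual algorithm:

```python
# Toy 1-D antibloom: backlight zones leak a fraction of light sideways;
# the LCD layer compensates by attenuating inversely to the predicted bloom.
def predicted_backlight(zones, leak=0.3):
    """Each zone's own light plus a fraction leaked in from its neighbours."""
    n = len(zones)
    out = []
    for i in range(n):
        light = zones[i]
        if i > 0:
            light += leak * zones[i - 1]
        if i < n - 1:
            light += leak * zones[i + 1]
        out.append(light)
    return out

def lcd_compensation(target, backlight):
    """LCD transmission (0..1) so that transmission * backlight ~= target."""
    return [min(1.0, t / b) if b > 0 else 0.0
            for t, b in zip(target, backlight)]

target = [0.0, 0.0, 1.0, 0.0, 0.0]          # one bright pixel on black
backlight = predicted_backlight(target)     # [0.0, 0.3, 1.0, 0.3, 0.0]
lcd = lcd_compensation(target, backlight)   # [0.0, 0.0, 1.0, 0.0, 0.0]
shown = [round(l * b, 3) for l, b in zip(lcd, backlight)]
print(shown)  # [0.0, 0.0, 1.0, 0.0, 0.0] -- the bloom around the bright pixel is cancelled
```

Real panels do this in 2-D across hundreds of zones with point-spread functions measured per panel, which is where the serious compute goes.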

There were two QLED and QLED-type locally dimmed TVs that I thought were OLED, because I thought LCD could NOT possibly do it. Believe me, I was surprised. That said, not all TVs on the showroom floor could antibloom that well.

Disclaimer: Founder of Blur Busters/TestUFO. I have collaborated with some who work in display engineering

Not even a $900 high-end GSYNC monitor comes close to the OLED-quality LCDs I saw at CES 2018. Even the new 4K 144Hz displays won't be as good as the Sony 8K prototype 10K-nit LCD I saw, nor the TCL 1000-zone local-dimming prototype. Those OLED-quality LCDs are far beyond any LCD mortal sitting on our computer desks; the new 4K 144Hz merely gives us a first-ever catapult into the starter videophile leagues. Far beyond the image quality we currently have today -- it's the first red carpet straight into the Kuro-plasma leagues of videophile lore -- but damn, it is expensive for many. Like a Ferrari.

Re: Why do 240HZ monitors have more lag than 144HZ AHVA

Postby 1000WATT » 22 Jul 2018, 06:20

hkngo007 wrote:Hello,
Cons of the PG258q to IPS 165hz
- smaller screen - found it a lot harder to see enemies' heads after coming from 27" 1440p (of course I got used to it after a couple of weeks)

27-inch 1440p: pixel pitch 0.233 mm
24.5-inch 1080p: pixel pitch 0.283 mm
How is this possible? How could it become harder to aim at heads on a monitor with a 0.28 mm pixel pitch?
27-inch 1440p at 0.233 mm ≈ 20.2-inch 1080p at 0.233 mm
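Those pitch figures follow directly from diagonal size and resolution, assuming square pixels; a quick check:

```python
import math

def pixel_pitch_mm(diag_inches: float, w_px: int, h_px: int) -> float:
    """Pixel pitch in mm: physical diagonal divided by the pixel diagonal."""
    diag_px = math.hypot(w_px, h_px)
    return diag_inches * 25.4 / diag_px

print(round(pixel_pitch_mm(27.0, 2560, 1440), 3))   # 0.233
print(round(pixel_pitch_mm(24.5, 1920, 1080), 3))   # ~0.282 (0.283 with looser rounding)

# Diagonal at which a 1080p panel would match the quoted 0.233 mm pitch:
diag = 0.233 * math.hypot(1920, 1080) / 25.4
print(round(diag, 1))  # 20.2 inches
```

So the objection holds: pixels on the 24.5-inch 1080p panel are larger, and a 1080p panel would need to shrink to about 20.2 inches before heads got physically smaller than on the 27-inch 1440p screen.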
Posts: 1
Joined: 22 Jul 2018, 05:44

