Why do [some] 240HZ monitors have more lag than 144HZ AHVA

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by Chief Blur Buster » 09 May 2018, 23:57

Nonetheless, for many, the minimum possible lag matters.

It may also be muscle memory: after being used to a certain mouse lag for decades, optimizing the heck out of it can throw your aim off.

We're talking about only a 1ms, 2ms, 3ms difference here, but at an 8000 pixels/second 180-degree flick, a 3ms latency change translates to a 24-pixel overshoot/undershoot! (8000 x 0.003 = 24)
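To make that concrete, here is a minimal sketch (my own illustration; the flick speed and latency values are just example numbers) of how a latency change maps to pixels of flick error:

```python
# Illustrative only: how a change in total input latency maps to flick error.
def flick_overshoot_px(flick_speed_px_per_s: float, latency_change_ms: float) -> float:
    """Pixels of overshoot/undershoot caused by a change in total input latency."""
    return flick_speed_px_per_s * (latency_change_ms / 1000.0)

if __name__ == "__main__":
    speed = 8000.0  # pixels/second during a fast 180-degree flick (example value)
    for delta_ms in (1.0, 2.0, 3.0):
        print(f"{delta_ms:.0f} ms latency change -> "
              f"{flick_overshoot_px(speed, delta_ms):.0f} px off-target")
    # 1 ms -> 8 px, 2 ms -> 16 px, 3 ms -> 24 px
```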

So if you've calibrated your 180-degree flick to a specific mouse lag, even a single millisecond of change will throw off that muscle-memory-calibrated 180-degree flick -- for example, when you switch from your home computer to a tournament/eSports setup. You then need a few hours to practice on the different latency chain, which might be slightly lower or higher.

Sometimes it takes a few days to re-train yourself to the new lag differential of your new setup. Or you might prefer to change a single factor (e.g. 1000 Hz -> 500Hz) which will rebalance things muscle-memory-wise.

Sometimes the humble millisecond matters in surprising ways. Lower lag is normally better, but sometimes the pesky millisecond gets you for a different reason -- like simply being pre-trained to a specific latency chain.

Note -- more study is welcome. I'm also recruiting writers who have published peer-reviewed research to do some basic research on various input lag topics. Blur Busters commissioned the article Input Lag and The Limits of Human Reflex.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

user9581320004096748
Posts: 3
Joined: 15 Mar 2018, 19:51

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by user9581320004096748 » 19 May 2018, 06:13

The XL2411P seems to be the only reasonable option from BenQ's recent lineup when considering absolute lag. The signal lag seems higher on all other models, even the XL2536 (144Hz). I would only recommend a 240Hz monitor for emulation and low-framerate gaming, where the higher refresh rate improves latency (less time spent waiting on buffered frames) and reduces stutter. Older monitors scored well on Playwares; the XL2430T and XL2720Z are very low latency 144Hz options that scored well at this site (~1.3ms).

http://playwares.com/index.php?mid=dpre ... l=52030207#
http://playwares.com/index.php?mid=dpre ... l=56290226#
http://playwares.com/index.php?mid=dpre ... l=56032599#
http://playwares.com/index.php?mid=dpre ... l=55048448#
http://playwares.com/index.php?mid=dpre ... l=53695484#
http://playwares.com/index.php?mid=dpre ... l=56313054#

After "upgrading" to the xl2540 from the xl2420te, the only thing that amazed me was the very low latency strobing when compared to lightboost.

The HDMI input also accepts a 240p frame, and with sharpness on 10 it provides an amazingly sharp image, all things considered. The XL2420TE only accepts a 383p frame. The XL2540 is also the first 24in monitor from BenQ that offers 5:4 scaling, usually only offered on 27in models. The build quality seems lower, as the backlight "blinks" if the chassis is knocked, leaving me to believe the engineers are no longer focusing on build quality. The XL2420TE is a tank.

Overall I regret purchasing the monitor, since I essentially only use it as a limited-function FPGA for low-framerate or stuttery applications.

I will probably not purchase another BenQ monitor around 2021-22 for my next upgrade, and will probably be looking at the lowest latency offerings at around 160Hz that offer much more flexibility, image quality, general-purpose use, build quality and reliability. It will probably be my last gamer-focused monitor purchase, since LCD tech is not going to improve much from now on, and has struggled ever since the introduction of 5ms TN panels with no overdrive...

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by Chief Blur Buster » 22 May 2018, 13:23

user9581320004096748 wrote:I will probably not purchase another BenQ monitor around 2021-22 for my next upgrade, and will probably be looking at the lowest latency offerings at around 160Hz that offer much more flexibility, image quality, general-purpose use, build quality and reliability. It will probably be my last gamer-focused monitor purchase, since LCD tech is not going to improve much from now on, and has struggled ever since the introduction of 5ms TN panels with no overdrive...
I visited CES 2018 and saw literally over 500 LCD displays, including some of the best LCDs I have ever seen, albeit not gaming-oriented.

Actually, from what I saw of local-dimmed televisions, I've seen amazingly colorful LCDs that are also simultaneously fast.

Local dimming also enables scanning-backlight operation which, if done very well with minimal internal backlight diffusion, can provide a have-your-cake-and-eat-it-too effect of blur reduction with great OLED-style colors.

The problem is that we've kept getting bottom-of-the-barrel panels for so long. The LCDs that look 10x better often cost 10x as much, and we're finally getting them for the first time in history through the 4K 144Hz GSYNC HDR monitors with 384-zone local dimming. Videophile LCDs in gaming-market monitor formats have never happened before, and this is one of the first ones.

The image quality of these is literally an order of magnitude better, based on what I saw at CES 2018 -- I was only able to briefly see the ASUS PG27UQ (prototype) at the ASUS showroom during a media-invitation-only event, and I liked what I saw. I was not able to enable ULMB (which exists), but it literally looked almost OLED-like compared to any LCD gaming monitor. I wish I had more time with it, but it's an amazing achievement, with roughly 100x the firmware engineering put into this monitor compared to others.

Blur Busters reads scientific journals and monitor engineering documents, and I have already stared at LCDs that look 10x better than our desktop monitors, so I disagree with you -- LCD improves massively with a bit more engineering put into it.

To make LCD jump an order of magnitude better than today's LCD monitors, you literally need the following:
  • Extreme brightness (1000+ nits, preferably 10,000+nits like Sony) to compensate for strobe dimness
  • Hundreds of zones for local dimming (384-zone minimum)
  • Antiblooming algorithm via inserting compensating LCD gradients into the video signal (it takes an amazing amount of computing power; a toy sketch of the idea follows below this list)
  • Huge number of bits (10-bit or 12-bit) to prevent banding, especially from the antiblooming gradient algorithm
  • High dynamic range. Imagine inky black backgrounds with ultra-bright pinpricks of stars, or ultra-bright streetlamps.
  • Ultra-wide gamut.
This costs 10x as much -- like paying $1999 for a gaming monitor. These were common in high-end videophile LCD HDTVs, but are only finally coming to certain monitors.
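As a rough illustration of the local dimming + antibloom-gradient idea mentioned in the list above -- this is my own toy model, not any manufacturer's actual algorithm, and the zone count, diffusion blur and native contrast are made-up numbers -- the sketch dims each backlight zone to its local peak, models backlight diffusion (the bloom), then compensates the LCD layer inversely so the final image stays close to the target:

```python
# Toy model of local dimming + antibloom gradient compensation (illustrative only).
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def render(target, zones=(8, 8), diffusion_sigma=12.0, native_contrast=3000.0):
    """target: desired luminance in 0..1, shape (H, W). Returns the displayed image."""
    h, w = target.shape
    zh, zw = zones
    # 1) Drive each backlight zone to the brightest pixel it must reproduce.
    zone_level = target.reshape(zh, h // zh, zw, w // zw).max(axis=(1, 3))
    # 2) Upsample the zone map to the pixel grid and blur it to model
    #    light diffusion inside the backlight (this blur *is* the bloom).
    backlight = gaussian_filter(zoom(zone_level, (h / zh, w / zw), order=1),
                                diffusion_sigma)
    # 3) Antibloom gradient: set LCD transmittance inversely proportional to the
    #    predicted backlight, so backlight * LCD ~= target wherever possible.
    #    The LCD layer can't go fully opaque, hence the 1/native_contrast floor.
    lcd = np.clip(target / np.maximum(backlight, 1e-6), 1.0 / native_contrast, 1.0)
    return backlight * lcd

if __name__ == "__main__":
    # Starfield test: two bright dots on a black background.
    img = np.zeros((240, 240))
    img[120, 120] = 1.0
    img[40, 200] = 0.8
    shown = render(img)
    print("worst leak on black pixels:", float(shown[img == 0].max()))  # tiny
    print("leak far from the stars:  ", float(shown[230, 10]))          # essentially zero
```

With a conventional single-zone backlight, the whole panel sits at full brightness and every black pixel leaks at roughly 1/native_contrast; dimming the unused zones and gradient-compensating the bright ones is what makes "inky blacks next to dazzling brights" possible on an LCD.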

I've seen some televisions at CES 2018 that I thought were OLED (ultra-bright flowers on completely total-black backgrounds, no blooming), and when I read the sign next to it....LCD. WOW.

For those of you computer programmers with an understanding of GPU shader programming / FPGAs / ASICs.... some amazing compute power is being put into solving this problem. I wrote about some of the amazing engineering in these antibloom algorithms in an ArsTechnica comment, which I will quote here for your reference.
mdrejhon from ArsTechnica wrote:
ScifiGeek wrote:But I have ZERO desire to have >1000 nit displays. My 6 year old LCD only has a max of around 400 nits, and I have the backlight turned almost to minimum, since I like to watch movies in the dark, and don't want the glare of the full sunlit outdoors shining on me when I am watching TV in a dark room.
I wondered the same too, but after seeing HDR in action, I now understand that even 5,000 nits doesn't do that:

What happens with HDR content is that it judiciously uses that brightness only for things like pinpoints of light (e.g. streetlamps and neon signs in night scenes). Instead of that material being overexposed, it retains its full original color.

That way, even the prototype 10,000 nit TV that I saw at CES 2018 surprisingly wasn't too bright on the eyes, as it simply focussed the intensity only on the highlights like sunglints off a car, or the nuances of a neon sign in night scenes.

Those pinpoints of high brightnesses were careful not to jack-up the average picture level to uncomfortable levels in the dark. That way, HDR was totally amazing and wasn't hurting my eyes.

Your nickname is SciFiGeek. Imagine brighter stars and planet dots. Single-pixel high-brightnesses in a deep black background. Totally amazing looking space scenes. Star pinpoints brighter than you thought a display would be capable of, but not eye-hurting. Or those night scenes in Pixar's Coco. Yet the daytime scenes aren't brighter at all (just a normal 50nit or 150nit or whatever preset comfy average picture level max), except the sunglints off sunglasses or a car, which are so much more realistic on the >2000nit HDR displays that I saw at CES 2018.

As the bright pinpoints increase in surface area, the nits-per-pixel falls dramatically, in a judicious way, until it all balances out. It was apparent on both LCD and OLED: one edge of the screen had perfect blackness, and the other edge had really good nits on the glints, so the local-dimming control in some 2018 panels was actually getting more impressive than I thought. Despite the claim that "human eyes can only see 100:1 dynamic range", it was very clearly apparent that the HDR range was genuinely beneficial when the peaks were in the quad-digit range, without any overbearing APL levels.

You need to check out true HDR sometime (with careful >1000nit surgical-use-only peaking); it's totally different from what you might think. Most HDR displays don't milk HDR impressively or properly, but once it is done well, it is rather impressive when properly calibrated for comfort.

When I saw over 100 different HDR displays at CES 2018, I realized what HDR really was, and it's important to think of the above contexts, and how HDR actually enhances things...
It's true that we have to spend quite a lot (at least two thousand dollars) to get a massively better LCD that literally looks 10x better than IPS LCD or about 100x better than TN non-GSYNC.

Currently, not even $500 or $700 for a monitor pays for the engineering (hardware engineering and software engineering) necessary to have a 10x better LCD that actually already exists in certain videophile HDTVs.

Tons of improvements in LCD are available, but they are extremely difficult to bring to a $300-or-less monitor.

I'm a software developer who also worked in the home theater industry before -- including a few firmwares for Runco scalers and other brands of high-end videophile gear. As a fellow firmware programmer who pays attention to what goes on in the monitor industry, it is my duty to inform you that there is a two-to-three order-of-magnitude difference in human-hours of software engineering billable time between an entry-level 144Hz computer monitor using an off-the-shelf TN panel and off-the-shelf scaler, and a delicately engineered videophile LCD HDTV costing $5K-$20K.

We never got this much engineering in a modern LCD-panel gaming monitor until the upcoming $1999 4K 144Hz GSYNC HDR monitors, which, while a leap upwards in quality, still don't match the number of hours put into certain videophile LCD HDTVs (with super-intelligent antibloom algorithms), judging from the engineering-time effort I've seen and heard manufacturers put in.

But it comes miraculously close; no display in the computer monitor industry has ever received that much engineering time -- local-dimmed quantum-dot LCDs are in a totally different ballpark, much closer to OLED than to ordinary LCD to my eyes. I have seen a few amazing locally-dimmed LCD HDTVs at CES 2018, and yes, I did see many OLED HDTVs, and yes, I do confirm that the quality venn diagram of OLED and LCD actually overlaps, with full blacks adjacent to superbrights. Some LCDs that I have seen are THAT much of a leap upwards above any 144Hz monitor currently sitting on your desk (yes, RealNC, even your ViewSonic XG2703-GS IPS 165Hz GSYNC pales compared to some of the new OLED-quality LCDs that I have now feasted my eyes upon).

Blur Busters does tip a few dominoes from time to time. ULMB ultimately exists because of Blur Busters making LightBoost popular several years ago -- literally, we screamed from the rooftops about the virtues of strobe backlights, and the strobe feature ended up becoming much more popular for blur reduction than for 3D glasses (its original purpose), and has now been copycatted by many manufacturers. Blur Busters' LightBoost advocacy directly led to industry-wide adoption of motion blur reduction when monitor manufacturers wanted to copy LightBoost capabilities.

There is so much more we can do to LCDs over the long term (things that can make them superior in some respects to OLED, since thousand-zone-league LED local dimming can be >10x brighter than OLED; very delicious for bright HDR strobing ability) -- it's just that some of these are rather expensive engineering steps, even if easier in some respects than similarly-expensive OLED engineering.

One thing for sure, today's LCDs are more than 20x better than LCDs twenty years ago.

Problem is, the big jump beyond today's "struggle plateau" (2005-2015) is a big cost jump to reach the "1000-zone local dimming + 12bit + HDR + antibloom system" of the OLED-matching LCDs that I saw at CES 2018.

Or, that 10K-nit Sony LCD. Simply stunning and vastly superior to OLED.

Yep, Chief Blur Buster says this: The best displays I saw at CES 2018 were .... LCD.

Shocked that I would say this?

At CES 2018 this year, none of the best prototype OLEDs impressed me nearly as much as the Sony 10K-nit 8K LCD with inky blacks and ultra dazzling sunglint reflections off leaves, polished cars, ultrabright streetlamps on fully detailed dark scenes, etc. Zero, zero, zero, zero blooming. None, nada, zilch, zip, zero, no blooming -- just like OLED. It didn't hurt my eyes either, see my above quoted posts.

Obviously, the Sony wasn't using its 10K-nit headroom for ULMB, but many of us recognize that potential: the great headroom of 10K nits allows 90% blur reduction while still delivering 1000-nit HDR when strobed. Imagine that. And LED backlights can become brighter than OLED pixels themselves. I want OLED to win. I really do.

But the LCD horse will be around for decades to come (mark my words) simultaneously and concurrently with OLED.

The fallacy of assuming OLED will replace LCD is simply rooted in the fact that we've been getting bottom-of-the-barrel LCDs (even in a $700 monitor with amazing GSYNC/ULMB and amazing top-notch NVIDIA overdrive tuning... it's still simply polishing a turd of a panel with no local dimming or HDR).

Simply put, $2000 of engineering premium put into an OLED versus $2000 of engineering premium put into an LCD ... actually produces a 10x better-quality LCD versus a 3x better-quality OLED in many respects.

Sometimes one horse will pull ahead, and other horses may fall behind. But some laws of physics are bottlenecking OLED in ways that are not bottlenecking LCD as much (e.g. ultra-bright local dimming). Backlights are simply LEDs, and today's LEDs light up the Super Bowl football stadium. Modern LED has gotten incredibly bright, with almost no upper limit.

See what I am getting at? LEDs light up stadiums! Brightness headroom is in LCD's favour. The laws of physics stacked against OLED pixel brightness are hair-pullingly frustrating for motion blur reduction.

You need lots of overkill brightness to lower persistence while keeping HDR (unless you're doing blurless sample and hold).
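To put rough numbers on that (my own back-of-envelope math, not a manufacturer spec): the perceived brightness of a strobed backlight scales with the duty cycle, i.e. persistence divided by the refresh period.

```python
# Back-of-envelope strobe brightness math; all values are illustrative.
def perceived_nits(panel_peak_nits: float, persistence_ms: float, refresh_hz: float) -> float:
    """Average (perceived) luminance of a strobed backlight.
    Duty cycle = persistence / refresh period; average brightness scales with it."""
    period_ms = 1000.0 / refresh_hz
    duty = min(persistence_ms / period_ms, 1.0)
    return panel_peak_nits * duty

if __name__ == "__main__":
    # 10,000-nit headroom strobed at 1 ms persistence, 100 Hz (10% duty, ~90% blur reduction)
    print(perceived_nits(10_000, persistence_ms=1.0, refresh_hz=100))  # 1000.0 nits of HDR
    # A typical ~400-nit panel at the same 1 ms persistence, 100 Hz
    print(perceived_nits(400, persistence_ms=1.0, refresh_hz=100))     # 40.0 nits (dim)
```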

You just need to throw a lot of sugar at it (high bit depth, many zones of local dimming, artificial-intelligence antiblooming, etc.) in order to make LCD look as good as OLED (inky complete blacks next to dazzling brights), like what I've already seen on the showroom floor and at closed-door media invitations at CES 2018.

Certainly, buy the best OLED HDTV and the best LCD HDTV, and they do come close in many ways, with some wins for LCD and some wins for OLED. But current LCD is winner-winner-chicken-dinner from what I saw at CES 2018.

My prediction is that both technologies (LCD, OLED) will still be important 10 and 20 years from now. There's lots of improvement headroom for both. Computer monitors have simply been on a plateau (2005-2015), while I've already seen OLED-quality LCDs at CES 2018.
mdrejhon from ArsTechnica wrote:
ScifiGeek wrote:I can certainly imagine the blooming that it would create on an LCD. Really you don't need 10000 nit stars.
They've improved a lot.

They do an amazing job with antiblooming algorithms on some 2018 panels. They actually do realtime LCD gradients on the panels to antibloom, and judiciously avoid quick peaking curves (e.g. a 2000 nit HDR QLED may never show stars at more than 500 nits, while a few pixels of water sun glints on a planet may actually be allowed to peak at 2000 because the surrounding pixels are already 100 nits). As a programmer, I can tell you some amazing FPGA / assembly / GPU shader shenanigans are being done in some televisions: realtime HDR processing that does antibloom compensation by shading LCD panel brightness gradients inversely proportional to predicted bloom, to cancel out the blooming. It also intelligently avoids sharp brightness transitions in tight spaces, to avoid noticeable blooming (e.g. intelligently avoiding 2000 nit pixels adjacent to 0 nit pixels). I'm surprised at what some TV engineers already do in their programming.

There were two QLED and QLED-type locally dimmed TVs that I thought were OLED, because I thought.... man, LCD could NOT possibly do that. Believe me, I was surprised. That said, not all TVs on the showroom floor could do antibloom that well.

Disclaimer: Founder of Blur Busters/TestUFO. I have collaborated with some who work in display engineering
Not even a $900 high-end GSYNC monitor comes close to the OLED-quality LCDs I saw at CES 2018. Even the new 4K 144Hz displays won't be as good as the Sony 8K prototype 10K-nit LCD I saw, nor the TCL 1000-zone local-dimming prototype I saw. Those OLED-quality LCDs are far beyond any mortal LCD sitting on our computer desks, and the new 4K 144Hz merely gives us a first-ever catapult into the starter videophile leagues. Far beyond the image quality we currently have today -- it's the first red carpet straight into Kuro Plasma leagues of videophile lore -- but damn, it is expensive for many. Like a Ferrari.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: Why do 240HZ monitors have more lag than 144HZ AHVA

Post by 1000WATT » 22 Jul 2018, 06:20

hkngo007 wrote:Hello,
Cons of the PG258Q compared to IPS 165Hz:
- smaller screen - found it a lot harder to see enemies' heads after coming from 27" 1440p (of course I got used to it after a couple of weeks)
27 inch 1440p: pixel pitch 0.233 mm
24.5 inch 1080p: pixel pitch 0.283 mm
How is this possible? How did it become harder to aim at heads on a monitor with a 0.28 mm pixel pitch?
27 inch 1440p at 0.233 mm ~ 20.2 inch 1080p at 0.233 mm
I often do not clearly state my thoughts. Google Translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.
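For reference, a quick sketch (standard flat-panel geometry, nothing monitor-specific) that reproduces the pixel pitch figures quoted above:

```python
# Verify the pixel pitch figures quoted above (standard panel geometry).
import math

def pixel_pitch_mm(diagonal_inch: float, width_px: int, height_px: int) -> float:
    """Pixel pitch in millimetres for a flat panel of the given diagonal and resolution."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_inch * 25.4 / diagonal_px

print(round(pixel_pitch_mm(27.0, 2560, 1440), 3))  # 0.233
print(round(pixel_pitch_mm(24.5, 1920, 1080), 3))  # 0.282 (~0.28, as quoted)
print(round(pixel_pitch_mm(20.2, 1920, 1080), 3))  # 0.233 -> same pitch as 27" 1440p
```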

klamanthia
Posts: 1
Joined: 14 Sep 2018, 07:54

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by klamanthia » 14 Sep 2018, 07:56

Chief Blur Buster wrote:
user9581320004096748 wrote:I will probably not purchase another BenQ monitor around 2021-22 for my next upgrade, and will probably be looking at the lowest latency offerings at around 160Hz that offer much more flexibility, image quality, general-purpose use, build quality and reliability. It will probably be my last gamer-focused monitor purchase, since LCD tech is not going to improve much from now on, and has struggled ever since the introduction of 5ms TN panels with no overdrive...
I visited CES 2018 and saw literally over 500 LCD displays, including some of the best LCDs I have ever seen, albeit not gaming-oriented.
Excellent post. Thanks for the summary and thoughts. My only comment would be that it's strange we didn't see any mention of MicroLED and how it could change things, especially with regard to backlights.

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by Chief Blur Buster » 14 Sep 2018, 14:45

I had written a little bit about The Wall at CES 2018, and right now MicroLED is more for large-format displays. The display was stunning in its own way and was able to output lumens galore, much brighter than OLED.

Imagine stadium/jumbotron LEDs being massively miniaturized into a smaller display. But you'll still see the LED dots/textures from 3 or 4 feet away, and sometimes seams between LED matrixes. That's apparently why they had a rope in front of "The Wall" MicroLED display to prevent people from getting closer than approximately 5 feet(ish).

I think more work will be needed over the next 5-10 years before MicroLED can be used in computer monitor formats for close viewing from inches away, so it's a nonstarter for desktop use through the end of the 2020s. It'll definitely be important, but for now MicroLED simply brings the stadium jumbotron into the living room; it isn't refined enough yet for phone/tablet/monitor formats.

Expect enhanced monitors (e.g. cheaper and better versions of the PG27UQ) with QLED backlights and local dimming capabilities to gradually mature, and possibly OLED gaming monitors. These will all happen before MicroLED, though some of the tech will take several years to a decade to become cheap. Acer did announce a cheaper 4K 144Hz FreeSync monitor for $899 with DisplayHDR 400 certification.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


GAMEBOY
Posts: 5
Joined: 01 Jan 2019, 03:47

Re: Why do [some] 240HZ monitors have more lag than 144HZ AH

Post by GAMEBOY » 05 Jan 2019, 00:21

I actually based my decision on that TFTCentral bar graph and bought the PG279Q. After months of struggling with input lag and optimizing the PC itself, I've come to the point where the monitor should be the last major source of lag. The lag on this monitor (PG279Q) is noticeable, and I regret buying it for that reason. The 165Hz mode seems laggier than the 144Hz mode (based on feel).

I'm not sure if click-to-LED tests are ideal for testing input lag as the most noticeable version of lag is the cursor/crosshairs lagging behind actual mouse movement.

Is TFTCentral's graph correct? I can't imagine the PG258Q having worse input lag at 240Hz than this (PG279Q) monitor. Any clarification appreciated.
