Critiquing Reviewers Testing of Input Lag

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
r0ach
Posts: 95
Joined: 10 Oct 2023, 14:45

Critiquing Reviewers Testing of Input Lag

Post by r0ach » 04 Jan 2024, 19:16

Reviewers like Monitors Unboxed are putting out reviews where nearly every monitor gets an absurdly small input lag number like 0.3ms of processing lag, with the exception of the Samsung G7 and a random Zowie they say has 2.0ms processing lag. I've used numerous 1080p and 1440p 240hz panels and they all have wildly different cursor movement from one another.

The Asus VG27AQML1A and Acer xv272u W2, for instance, both have a sluggish cursor, while surprisingly a lot of low-budget Chinese panels such as the Koorui 27E3QK, Innocn 27G1s, and the Innoview 240hz all feel more responsive. I've only seen a single monitor reviewer showing any real differences in 240hz panel latencies, and he comes up with wildly different latency figures than other people do. Is this person, "The Display Guy", using a better or worse method?

The crux of the problem is that if you say every monitor has an input lag of 0.3ms (besides the G7 and one Zowie), then it's not a very useful metric. On the flip side, if you measure end-to-end latency via click-to-photon, your tool might heavily penalize VA panels, when in reality, even if their black transitions are 20+ ms slower, it doesn't actually feel like that latency has been added to mouse response. In other words, if you have a chart that says VA panels are going to feel 3x laggier than an IPS, it's probably not a valid chart, since the end user does not have that experience.


Then of course you have other problems, like how turning off the annoying red lights on the Innocn 27G1s improves subjective mouse movement a lot, while turning off the annoying blue LED on the Asus VG27AQML1A gives you clown cursor instead. The drastic effect some of these settings have is basically equivalent to adding or removing an entire frame of input lag or more in practice. Same thing with settings like DDC/CI, auto input selection on/off, etc.

I'm finding I have to keep several of these settings at default while others have to be changed, otherwise aim just feels awful, and many monitors have no 'good' combination of settings that gives you acceptable mouse movement at all. I typically never have issues like this using Samsungs, while brands such as Asus seem like a textbook example of how not to create good monitor firmware. It's easier to aim on probably every generic Chinese 240hz panel I've used than on an Asus or Acer.

It seems like bad code and feature creep have destroyed the name-brand panels, and they need to remove all the useless bloat. Even changing between something like "racing mode" and some other image mode on an Asus gives you different cursor movement. How about just deleting all these useless modes and offering a single mode that actually works for gaming?

Parker
Posts: 1
Joined: 04 Jan 2024, 21:35

Critiquing Reviewers Testing of Input Lag

Post by Parker » 04 Jan 2024, 21:41

Would you recommend the Koorui 27E3QK over some of the $350-$500 options? I can’t seem to find any good reviews of it that aren’t sponsored content.

r0ach
Posts: 95
Joined: 10 Oct 2023, 14:45

Critiquing Reviewers Testing of Input Lag

Post by r0ach » 05 Jan 2024, 19:28

Parker wrote:
04 Jan 2024, 21:41
Would you recommend the Koorui 27E3QK over some of the $350-$500 options? I can’t seem to find any good reviews of it that aren’t sponsored content.
This recommendation is probably going to vary a lot depending on people's setups, so let's get that out of the way. I currently use a 7800xt + stripped AMD drivers installed in "driver only" mode. Mouse movement is much lighter than on new Nvidia drivers, to the point where 800 DPI on Nvidia at 1080p feels meh in responsiveness, while the same Superlight at 800 DPI on AMD drivers feels almost overly fast at 1080p and just right at 1440p.

Using that Koorui on my setup gives some of the best mouse movement I've seen. It's virtually effortless to aim; I can load up a game, completely ignore aiming at the body of opponents, and only aim at the head as a valid tactic. On some other monitors that feel less responsive, you basically just aim at the moving blob of an opponent and spray them down.

So what's the problem with the Koorui then? On my setup, keeping the monitor's default settings like "HDR: Auto" (instead of defaulting to off like most normal monitors) and "Aspect Ratio: Auto" (instead of full like most normal monitors) works great, but changing these settings suddenly gives you clown cursor. Okay, who cares. Maybe it interacts with Windows or the AMD drivers in a bad way, so just leave it on auto and forget about it.

But then, after using the monitor for an hour or so, you notice it has some eye strain problems (a weird effect I haven't seen on other panels, where it makes you want to open your eyes to the maximum while looking at it). A monitor like the XG27AQMR doesn't have PWM by default, but if you turn HDR on, it gives you PWM at brightness levels of 225 cd/m² or lower. It's possible the weird eye effect on the Koorui is due to the "HDR: Auto" setting causing some type of PWM, but if you turn it off (with my 7800xt setup at least), cursor movement goes from the lightest and most linear you've seen to clown cursor.

The image quality is also not that great compared to the other 5 or 6 1440p 240hz panels I've tested (might be fixable with calibration, not sure). Tones that are supposed to be light and pop out of an image come out flat, etc., and it doesn't help that the panel is also probably only doing 950:1 to 1000:1 contrast while many other 1440p panels are doing 1350:1 now. So, TL;DR, if you use stripped AMD drivers and want the best aim in the world (the 27G1s also has good mouse movement but even more eye strain on my setup), then buy the Koorui. The eye strain is kind of a creeping effect: I can personally look at the thing for an hour or so before I start noticing it's getting annoying. The only new panels I've tried that don't seem to have these eye problems are the BOE ones, which are slower than AU panels and generally don't look as good as either a good LG or AU panel.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Critiquing Reviewers Testing of Input Lag

Post by Chief Blur Buster » 05 Jan 2024, 21:37

r0ach wrote:
04 Jan 2024, 19:16
"The input lag conspiracy"
I edited that out.
No topic headlines with this EXTENT of flamebaiting, please.

Before: Former topic title was "The input lag conspiracy".
After: New topic headline is "Critiquing Reviewers Testing of Input Lag"

We generally discourage reviewer "politics-baiting" topic headlines at Blur Busters, so we've edited the topic title, and we've moved this to the Lag forum.

Obviously, reviewers are not perfect, but as display testing inventions are essentially Blur Busters' bread and butter, we criticize/critique reviewers responsibly without antagonizing them with Red-vs-Blue 2020s-style politics.

Respecting Blur Busters = Respect that Blur Busters tries to scold/educate reviewers in a "walk the fine line without making enemies" way. As many know, we invent things that reviewers use. So obviously, it's in my interest to just push the criticism needle as far as I can go -- without creating new enemies. :D ;)

Sorry not sorry. I do not want to see people use Blur Busters as a bait-headline platform for these "2020s era politicizing" salvos of painting reviewers as being part of a conspiracy, when the science is really, sadly, unsurprisingly mundane (see below).

____
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:
  1. Rule #1: Be Nice. This is published forum rule #1. Even to newbies & people you disagree with!
  2. Please report rule violations. If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Critiquing Reviewers Testing of Input Lag

Post by Chief Blur Buster » 05 Jan 2024, 22:01

Blur Busters School Time

I cannot confirm whether those specific numbers are accurate, but I can confirm that lag testers are now genuinely measuring <1ms numbers on some new digital displays (correct + realistic).

My biggest problem with reviewers (notice: I'm not afraid to criticize reviewers on a 50%:50% good:bad critique ratio) is that they DO NOT ALWAYS DISCLOSE LAG TESTING METHODS. M'kay? That's my gigantic beef with reviewers, as much as I love helping them. I can be an ally, but I can also criticize/scold, for sure!

But it's no conspiracy (so the fake topic title was edited).

Just various incompatible (but generally legitimate, unless there are errors) lag testing methods.

e.g. VSYNC ON lag testers, versus VSYNC OFF lag testers
e.g. Lag stopwatching at GtG 2% versus 10% vs 50% vs 90% vs 99% vs 100%

Now, I can confirm that SOME displays will correctly have low input lag numbers at the very top edge (scanout position zero), for a sensor-at-top. CRTs will generally have 0ms over there, and some of the world's fastest displays will manage to get pretty close to that.

Another method is an ultra-high-framerate VSYNC OFF system (e.g. FSE output at >3000fps), which will generally produce lag numbers on a CRT tube that average out to half a frameslice time. So 3000fps output to a CRT at zero overhead will average 0.5/3000 sec of latency, roughly 0.17ms (middle-of-frameslice latency). GTX and RTX cards can output blank rectangles at roughly 10,000fps, so lag testing <0.3ms is definitely possible.
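
To make that arithmetic concrete, here's a minimal Python sketch (my own illustration, assuming zero transceiver/processing overhead and CRT-like instant pixel response) of the half-a-frameslice average latency described above:

Code:

# Minimal sketch: average scanout latency of an ultra-high-framerate VSYNC OFF
# lag tester is half of one frameslice (assumes zero transceiver/processing
# overhead and instant pixel response, e.g. a CRT).

def average_frameslice_latency_ms(framerate_fps):
    frameslice_ms = 1000.0 / framerate_fps   # duration of one frameslice
    return 0.5 * frameslice_ms               # sensor lands mid-frameslice on average

for fps in (1000, 2000, 3000, 10000):
    print(f"{fps:>5} fps VSYNC OFF -> ~{average_frameslice_latency_ms(fps):.3f} ms average latency")
# 1000 fps -> 0.500 ms, 2000 fps -> 0.250 ms, 3000 fps -> 0.167 ms, 10000 fps -> 0.050 ms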

Latency measurement methods and latency mathematics are tricky.

The Major Latency Stopwatching Variables

This is a dumb simplification for the obvious dunces around here -- winky winky -- you know who you are.
The smart ones will figure this out, but I have to properly science this out like a professional.
So here goes...

Sensor Position

- Reviewers don't always disclose sensor position. A sensor at the top edge will always have the lowest lag numbers, due to scanout latency. Please see the high speed videos of scanout at www.blurbusters.com/scanout ... A perfect zero-latency CRT display will have literally 0ms at the top edge (if you can get the sensor photodiode onto Scan Line #1, and the sensor itself doesn't add processing delay).
- Reviewers don't always disclose stopwatch start/stop. Stopwatch start is usually easier to determine than stopwatch stop, because stopwatch stop is based on pixel response speed and threshold chosen.
...Sometimes it's an absolute nits threshold ("stop the lag stopwatch when the pixel response causes the pixel to finally hit XX nits")
...Sometimes it's a range to give allowance for GtG pixel response (TFTCentral)
...Sometimes it's a GtG percentage threshold. Sometimes it's GtG2% (RTINGS). Sometimes it's GtG50%. Sometimes it's GtG100%. Remember, your eyes see GtG50% as a grey color in the journey from black to white, so YOU WILL REACT before GtG100%. Waiting for GtG100% is pointless, given most of the lumens have already begun hitting your eyes well before then; you just have worse ghosting (less refresh rate compliance, etc). The engineer holy war (Apple-vs-Android style, Mac-vs-PC style) on where to stop lag stopwatching (GtG % threshold) is still pretty contentious behind the scenes, even today. VESA uses a GtG 10%->90% threshold, which I frequently complain about, and I've historically been a big fan of GtG 10% as the standard threshold to sync up with this, but I'm also a fan of GtG 50% (gamma-corrected). I also like simply reporting ranges. I just want to improve reviewer disclosure (see the sketch after this list).
- Sometimes it's "first anything" reactions: high speed video camera testing, instead of single-pixel testing. Since not all pixels refresh at the same time, this is like a race to see "which pixel refreshes first?". This is useful when you're doing button-to-photons tests, like seeing a muzzle flash or a mouseturn reaction, because YOU, as an ESPORTS player, will react to the first major stimulus (e.g. enemy movement, etc). We did this for GSYNC 101, as we were the world's first people to measure the latency of GSYNC in January 2014. However, it can be highly problematic when comparing displays (and is not a 1:1 comparison to other sync technologies, especially if the stimulus you need is near the top or bottom on a different display, or you change sync technology).
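
To make the stopwatch-stop ambiguity concrete, here's a minimal Python sketch (hypothetical photodiode trace and helper function, not any reviewer's actual tool) showing how the chosen GtG % threshold alone changes the reported number from the very same measurement:

Code:

# Minimal sketch: the same photodiode trace produces different "lag" numbers
# depending on which GtG % threshold stops the stopwatch.
# Trace values are hypothetical: (time in ms since stopwatch start, nits).

def stopwatch_stop_ms(trace, start_nits, end_nits, gtg_fraction):
    """First timestamp where the pixel completed gtg_fraction of its transition."""
    threshold = start_nits + gtg_fraction * (end_nits - start_nits)
    for t_ms, nits in trace:
        if nits >= threshold:
            return t_ms
    return None  # never reached threshold (e.g. a very slow VA dark transition)

trace = [(0, 0.5), (1, 2), (2, 20), (3, 60), (4, 95), (5, 118), (6, 120)]
for frac in (0.02, 0.10, 0.50, 0.90, 1.00):
    stop = stopwatch_stop_ms(trace, start_nits=0.5, end_nits=120, gtg_fraction=frac)
    print(f"GtG {int(frac * 100):>3}% threshold -> stopwatch stops at {stop} ms")
# GtG 2% and 10% stop at 2 ms, GtG 50% at 4 ms, GtG 90% at 5 ms, GtG 100% at 6 ms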

VSYNC ON Lag Testers

- Example: Leo Bodnar Lag Tester
- For a VSYNC ON lag tester on most signal=panel synchronized scanouts (not all displays scan out at the same speed on the cable and on the panel), you get a TOP<CENTER<BOTTOM effect (strobing off) and TOP=CENTER=BOTTOM (strobing on without crosstalk), although that will vary quite badly if you have worse crosstalk at the top and/or bottom edge (since the lag test will trigger on the first duplicate image, even if the first duplicate is fainter than the next one). See the sketch after this list for the scanout-latency math.
- These lag numbers are good for console players, VR players, or people who prefer VSYNC ON
- VSYNC ON lag tester (e.g. Leo Bodnar) = stopwatch starts at VBI, preferably end of VBI (last porch scanline) for easier comparison with VRR, but the Microsoft Present() API pageflips at the beginning of VBI (the next scanline offscreen after the final visible scanline).
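
Here's a companion Python sketch of the scanout-latency math behind that TOP<CENTER<BOTTOM effect (assumed 1440p 240Hz numbers, signal=panel synchronized scanout, strobing off, no processing lag, VBI time ignored for simplicity):

Code:

# Minimal sketch: for a VSYNC ON lag tester, pure scanout latency grows with
# sensor position down the screen.

def scanout_latency_ms(sensor_row, total_rows, refresh_hz):
    refresh_period_ms = 1000.0 / refresh_hz
    return (sensor_row / total_rows) * refresh_period_ms

for label, row in (("TOP", 1), ("CENTER", 720), ("BOTTOM", 1439)):
    print(f"{label:>6} sensor on 1440p @ 240Hz: "
          f"~{scanout_latency_ms(row, 1440, 240):.2f} ms of scanout latency")
# TOP ~0.00 ms < CENTER ~2.08 ms < BOTTOM ~4.16 ms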

VSYNC OFF Lag Testers

- Example: SMTT or RTINGS lag tester device
- VSYNC OFF stopwatch starts at time of Present()
- Latency is always lowest right below a tearline, and highest right above a tearline. Average latency of randomized tearing is half a frameslice, e.g. 1000fps = average 0.5ms scanout latency. You can have multiple tearlines per refresh cycle, as seen in Are There Advantages To Frame Rates Above Refresh Rates?
- For a VSYNC OFF lag tester on most signal=panel synchronized scanouts, you get a TOP=CENTER=BOTTOM effect (strobing off) and a TOP>CENTER>BOTTOM effect (strobing on, assuming no crosstalk).
- On a CRT tube, VSYNC OFF lag testers will show half a frametime of lag. So if your VSYNC OFF lag tester is spraying 1000fps, you'll get a 0.5ms number.
- This will also happen on ultrafast digital panels with subrefresh latencies, as long as the pixel can initiate its GtG momentum at least up to the stopwatch-stop threshold. I've seen <1ms before, as long as you don't have much port transceiver delay (HDMI/DP transceiver latency can be <0.1ms if optimized properly).
- These lag numbers are good for esports players and anybody who uses VSYNC OFF
- You still have the same problem of reviewers not disclosing latency-stopwatch-stop (GtG 1% is always lower lag than GtG 99%).

Lowering Lag Numbers

This is metaphorically the most "conspiracy" I will get, but it's still 100% genuine science with 100% genuine numbers, completely explainable by display scanout physics:

The reason some reviewers do this is to filter computer lag / GPU lag / scanout lag / sync technology lag away from display-only lag. I understand the rationale, but I'd rather reviewers publish multiple lag numbers for multiple refresh rates / settings / variables -- and disclose each.

- Only lag-testing the highest Hz. Some displays go really laggy at low Hz (worse than a native lower-Hz display), due to scan conversion.
- Using a VSYNC OFF lag testing method at the highest frame rate your tester device/computer can output, because most VSYNC OFF lag testers do not subtract sync technology latency, and even VSYNC OFF has scanout latency between two consecutive tearlines.
- Completely avoiding sync technology lag (e.g. using scanline #1 at the display top edge for VSYNC ON, or the first scanline right below a tearline for VSYNC OFF), or simply mathematically subtracting it (e.g. subtracting half a frametime of latency is kosher, as long as it's disclosed that it is an ultra-high-framerate VSYNC OFF lag tester and that sync-technology / scanout lag is excluded from the lag numbers). A minimal numeric sketch of this subtraction follows after this list.
- Partially accidentally/intentionally excluding HDMI/DP port transceiver latency (e.g. only measuring the display-end video input transceiver, not the source-side video output transceiver).
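
Here's the minimal numeric sketch promised above (purely illustrative numbers, not from any real review) of the "mathematically subtracting it" case: the same raw reading shrinks once the average half-frameslice sync/scanout latency is taken out, which is legitimate as long as the subtraction is disclosed:

Code:

# Minimal sketch: subtracting sync-technology/scanout lag from a raw VSYNC OFF
# measurement. All numbers are assumed, for illustration only.

tester_fps = 1000                                       # frame rate the lag tester sprays
raw_present_to_photons_ms = 3.1                         # hypothetical raw reading
avg_sync_scanout_lag_ms = 0.5 * (1000.0 / tester_fps)   # average half-frameslice

display_only_ms = raw_present_to_photons_ms - avg_sync_scanout_lag_ms
print(f"Raw: {raw_present_to_photons_ms:.2f} ms -> published 'display-only': "
      f"{display_only_ms:.2f} ms (after subtracting {avg_sync_scanout_lag_ms:.2f} ms)")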

Lag stopwatch "start late" methods

...Timestamp at end of VBI for a VSYNC ON lag tester (*Note: Windows does a start-of-VBI Present() unblock, so you're also measuring the lag of the VBI prior to the next refresh cycle, i.e. that additional sync-technology-derived lag)
...Timestamp only right after Present()+Flush() for a VSYNC OFF lag tester connected to a PC application
...Intentionally beam-racing the VSYNC OFF tearline position, a la Tearline Jedi, to position tearlines right above the photodiode sensor -- basically Present()+Flush() right before the raster beam physically reaches the scanout position of the sensor photodiode (conceptual timing sketch below).
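
And here's the conceptual timing sketch of that beam-racing trick (assumed panel geometry; a real implementation needs an OS raster/scanline query and much tighter margins than shown here):

Code:

# Conceptual sketch only: when to submit Present()+Flush() after the start of a
# refresh cycle so the tearline lands a small margin of rows above the sensor
# photodiode. Assumes signal=panel synchronized scanout; VBI time ignored.

def present_deadline_after_refresh_start_ms(sensor_row, total_rows, refresh_hz,
                                            margin_rows=20):
    refresh_period_ms = 1000.0 / refresh_hz
    target_row = max(sensor_row - margin_rows, 0)   # tearline just above the sensor
    return (target_row / total_rows) * refresh_period_ms

deadline = present_deadline_after_refresh_start_ms(sensor_row=720, total_rows=1440,
                                                   refresh_hz=240)
print(f"Submit the frame ~{deadline:.2f} ms after refresh start "
      "for a centered sensor on a 1440p 240Hz panel")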

Lag stopwatch "stop early" methods

...Using a low GtG% threshold (applies to both VSYNC ON and VSYNC OFF lag testers)
...Intentionally positioning sensor on first scanline of new frame (top of screen for VSYNC ON, top of frameslice for VSYNC OFF, right underneath a tearline)

Frankly, simply put, I complain about reviewers not disclosing their lag stopwatch start and lag stopwatch stop. People often want to measure the lag of only the display. Since you still have to use an adaptor to connect to a CRT, you still have HDMI/DP port transceiver latency if you connect a digital output to a CRT, so sometimes reviewers want to filter out the port-transceiver latency (HDMI/DP codecs). Other times reviewers don't bother and just measure Present()-to-photons, so you're measuring transceiver1 (GPU output) + transceiver2 (display input) + display latency. It's still a Holy War raging behind the scenes. But it is what it is; my beef is PUBLIC DISCLOSURE. Personally, I just want MOAR DISCLOSUREZ.
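
For illustration only (all numbers assumed), that Present()-to-photons chain breaks down like this; the disclosure debate is essentially about which of these terms a reviewer subtracts before publishing:

Code:

# Minimal sketch: a Present()-to-photons measurement bundles both port
# transceivers with the display itself. Numbers are assumed, for illustration.

chain_ms = {
    "GPU-side video output transceiver": 0.05,
    "display-side video input transceiver": 0.08,
    "display processing + scanout to sensor + GtG to threshold": 2.40,
}
for stage, ms in chain_ms.items():
    print(f"{stage:<58} {ms:.2f} ms")
print(f"{'Present()-to-photons total':<58} {sum(chain_ms.values()):.2f} ms")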

Major Errors

I've seen the Leo Bodnar sensor (and other testers) malfunction, giving abnormally low or high numbers, due to refresh-cycle aftereffects or pulsing behaviors that produce red-herring effects. For example, DLP projectors have a color wheel and each DLP pixel flickers at up to 2880 Hz (1-bit modulation), and some testers go wonky with that. It's possible that sensors screw up on some displays.

Very rarely, refreshing shenanigans in displays (like the faint pre-strobe during Eizo Turbo 240 on the FG2421 years ago, a 120Hz panel that is brief-long double-strobed) will trip up the testers. The pre-strobe was sub-1% crosstalk, but artificially (possibly accidentally) lowered lag numbers by up to a full refresh cycle, despite most of the photons coming from the second strobe.

Conclusion

- Lag testing is a giant rabbit hole, sometimes full of politics. I prefer to science this out.
- TL;DR Testing process disclosure (lag stopwatch start + lag stopwatch stop) is the correct tree in the correct forest to complain / criticize about
- 0.3ms numbers are actually realistic, if you optimize the lag testing methodology maximally as above. Outputting >2000fps VSYNC OFF to a CRT will generally achieve this for any scanline. Same for a near-zero-latency scaler/TCON (with only a 2-6 scanline (pixel rows) rolling-window picture-processing buffer). It can also be achieved by measuring input lag at the top edge only, on a VSYNC ON lag tester that does end-of-VBI latency stopwatch-start and has zero queue (waitable swapchain techniques or force-flush techniques).
NOTE: Even flush latency is still usually at least ~0.1ms-0.2ms on most GPUs though, if you're using a hyperpipelined GPU to render a latency test

I find the lack of full disclosure about latency stopwatch start/stop mighty annoying, and I have long wanted to start an initiative about latency stopwatching disclosure, because where the stopwatch starts relative to GtG depends on whether you're using VSYNC ON or VSYNC OFF.

Educate Responsibly

Please consider this rabbit hole when reposting on other forums. I do not like it when people spread misinformation on other forums without understanding the science/physics of this.

Lag Rabbit Hole Permalink:

Code:

https://forums.blurbusters.com/viewtopic.php?f=10&t=12875&p=100495#p100495
Share the "sciencing it out" real information instead of the "manufactured tinfoil hat" fake information.

Thank you for being a responsible member of the Internet. And you're welcome.

Kvoyn0v
Posts: 10
Joined: 02 Jan 2024, 10:22

Re: Critiquing Reviewers Testing of Input Lag

Post by Kvoyn0v » 06 Jan 2024, 04:53

r0ach wrote:
05 Jan 2024, 19:28
But then, after using the monitor for an hour or so, you notice it has some eye strain problems (a weird effect I haven't seen on other panels, where it makes you want to open your eyes to the maximum while looking at it). A monitor like the XG27AQMR doesn't have PWM by default, but if you turn HDR on, it gives you PWM at brightness levels of 225 cd/m² or lower. It's possible the weird eye effect on the Koorui is due to the "HDR: Auto" setting causing some type of PWM, but if you turn it off (with my 7800xt setup at least), cursor movement goes from the lightest and most linear you've seen to clown cursor.

The image quality is also not that great compared to the other 5 or 6 1440p 240hz panels I've tested (might be fixable with calibration, not sure). Tones that are supposed to be light and pop out of an image come out flat, etc., and it doesn't help that the panel is also probably only doing 950:1 to 1000:1 contrast while many other 1440p panels are doing 1350:1 now. So, TL;DR, if you use stripped AMD drivers and want the best aim in the world (the 27G1s also has good mouse movement but even more eye strain on my setup), then buy the Koorui. The eye strain is kind of a creeping effect: I can personally look at the thing for an hour or so before I start noticing it's getting annoying. The only new panels I've tried that don't seem to have these eye problems are the BOE ones, which are slower than AU panels and generally don't look as good as either a good LG or AU panel.
I've got all your same problems: the floaty cursor, the lack of clarity when moving the camera, seemingly weird scaling and draw distance, lack of detail, etc., on both consoles and PC and with several monitors. I really don't even know how someone could conclude whether something is a monitor or a computer issue. There seems to be zero consistency whatsoever to how my graphics look, and changing almost any setting on the PC or monitor temporarily relieves it. Oftentimes plugging in a lamp nearby or a USB cable into my PC will instantly make the display look amazing, clear, vibrant, and so easy to snap onto targets when looking around in an FPS game.

r0ach
Posts: 95
Joined: 10 Oct 2023, 14:45

Re: Critiquing Reviewers Testing of Input Lag

Post by r0ach » 06 Jan 2024, 23:15

Kvoyn0v wrote:
06 Jan 2024, 04:53
I've got all your same problems: the floaty cursor, the lack of clarity when moving the camera, seemingly weird scaling and draw distance, lack of detail, etc., on both consoles and PC and with several monitors. I really don't even know how someone could conclude whether something is a monitor or a computer issue. There seems to be zero consistency whatsoever to how my graphics look, and changing almost any setting on the PC or monitor temporarily relieves it. Oftentimes plugging in a lamp nearby or a USB cable into my PC will instantly make the display look amazing, clear, vibrant, and so easy to snap onto targets when looking around in an FPS game.
This looks like an AI-written gaslighting post from Asus or Acer damage control, or you've mistaken me for someone else, because I don't have any weird electrical issues like that. I've just noticed, after using about six 1440p 240hz panels, that some of them feel much more responsive than others, even though all these reviewers come up with input lag numbers saying they're all virtually identical.

The Innocn 27G1s, the Koorui 27E3QK, and even the Innoview 240hz (which uses an overclocked Samsung G51c panel) all seem to have more responsive cursor movement than the name-brand panels I've been trying from Acer (xv272u W2) and Asus (VG27AQML1A). They all seem to cause high eye strain except a few BOE panels, but that's another story.

What is the root of the problem? I don't know. Probably too much feature-creep firmware bloat + bad code at the same time. I've always had good mouse movement with Samsung monitors, but they now have nothing to buy unless you want a curved G7.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Critiquing Reviewers Testing of Input Lag

Post by Chief Blur Buster » 07 Jan 2024, 00:14

r0ach wrote:
06 Jan 2024, 23:15
This looks like an AI-written gaslighting post from Asus or Acer damage control
While I've seen manufacturers do spin control (thankfully very rarely), I checked the fingerprinting/traces and it doesn't seem to be the case (AFAIK). I'm very familiar with spin-control patterns.

Just adding supplementals, to help lay down those usual torches and pitchforks. Sure, I welcome them occasionally when I play Refresh Rate Robin Hood, but whoa, this is not the time.
r0ach wrote:
06 Jan 2024, 23:15
I've always had good mouse movement with Samsung monitors but they now have nothing to buy unless you want a curved G7.
Samsung is now exhibiting 360 Hz QD-OLED at CES 2024, and LG is now exhibiting 480 Hz WOLED, both of which massively outperform LCDs and, from some preliminary tests I've been hearing about, are finally viable contenders for esports players who need 1440p.

Even www.testufo.com/map is now showing CRT motion clarity at 960 pixels/sec, with tack-sharp 6-point text, which is frankly impressive. Mathematically, at GtG=0 and pps=Hz, you have only half a pixelwidth of motion blur at the leading/trailing edges (the even leading/trailing split in the Blur Busters Law: "1ms of persistence = 1 pixel of motion blur per 1000 pixels/sec"). And even with those half pixelwidths, you've got the blur gradient, so the center of the half pixelwidth is a quarter pixelwidth. So you can double motionspeeds, e.g. 960 pixels/sec for 480Hz, and still be pretty clear (the middle of a 2 pixel transparent-to-opaque blur gradient is 1 pixel, and that's evenly split between the leading/trailing edges). It's interesting how the Blur Busters Law formula behaves virtually perfectly on OLEDs, but I hadn't realized the double-speed advantage until recently, because the blur gradient is more than half transparent.
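
Here's a minimal Python sketch (my own restating of the Blur Busters Law arithmetic in the paragraph above, assuming GtG=0 and full sample-and-hold persistence) of those numbers:

Code:

# Minimal sketch: Blur Busters Law -- 1ms of persistence = 1 pixel of motion
# blur per 1000 pixels/sec. Assumes GtG=0 and full sample-and-hold persistence.

def motion_blur_px(refresh_hz, motion_pps):
    persistence_ms = 1000.0 / refresh_hz               # persistence per refresh
    total_blur_px = persistence_ms * motion_pps / 1000.0
    return total_blur_px, total_blur_px / 2.0          # total, and per-edge split

for hz, pps in ((480, 480), (480, 960)):
    total, per_edge = motion_blur_px(hz, pps)
    print(f"{hz}Hz @ {pps} px/s -> {total:.2f} px blur total "
          f"({per_edge:.2f} px at each leading/trailing edge)")
# 480Hz @ 480 px/s -> 1.00 px total (0.50 px per edge)
# 480Hz @ 960 px/s -> 2.00 px total (1.00 px per edge)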

So that's why OLED motion stays clear at motionspeeds up to twice the Hz in pixels/sec, and 480Hz is perceptually perfectly clear for all motion speeds up to 960 pixels/sec. Which, frankly, is amazing. I've never seen 960 pixels/sec look CRT-clear without flicker-based motion blur reduction.

And the between-refresh pixel-reset brightness dip is even smaller (in duration and depth)! One less rare ergonomic worry to worry about. WOOHOO. And they have some wickedly, incredibly, crazy low VSYNC OFF latencies too that are finally getting closer to CRT. I hope that means they finally fixed the DSC (?), but I need to wait and find out for myself.

But boohoo on those antiglare coatings. While glare can increase eyestrain for some, the antiglare texture can be an eyestrain medicine worse than the original glare. So, slightly disappointed at that, as some eyestrain cases have been traced to the antiglare film. That's why a few people have done special antiglare-film removal treatments (YouTube search) and found it reduced their eyestrain.

Mind you, not all eyes can handle OLED; it may not be for your eyes if you've already tried each OLED flavour. But I've lost count of the people on my social media who had their eyestrain disappear after moving away from a 60Hz LCD to a 240Hz+ OLED.
