Presenting the ZOWIE XL2540 240Hz

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
StrobeMaster
Posts: 48
Joined: 25 Apr 2014, 01:31

Re: Presenting the ZOWIE XL2540 240Hz

Post by StrobeMaster » 05 Mar 2017, 06:32

b0t wrote:
StrobeMaster wrote:The monitor can run 240Hz with all three input types (HDMI, DisplayPort, DVI-DL), [...]
OK, thank you, but do I get any limitations connected over DVI compared to DP? Besides DVI's lower pixel clock limit and the need to patch it?
Yes, DVI does not carry sound. But seriously, I don't know, as I haven't played with the VT tweaks, which could work differently for different input types. And then there is FreeSync, which should not work at all over DVI but is botched anyway (when using DP or HDMI), so no disadvantage to using DVI here either.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Presenting the ZOWIE XL2540 240Hz

Post by Chief Blur Buster » 05 Mar 2017, 23:15

FYI -- On my BENQ XL2720Z, all inputs behave the same with Vertical Totals -- HDMI, DVI and DP

NOTE: HDMI doesn't "officially" support it, but it works. The XL2720Z works if you're using HDMI 1.3+ and an HDMI output (e.g. on an NVIDIA graphics card) capable of >60Hz at 1080p. When it succeeds, it behaves like DVI. A dual-link-DVI-to-HDMI adaptor also works (they now exist); it must be capable of full-bandwidth 340MHz HDMI.
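As a rough sanity check on that 340MHz ceiling: the pixel clock a mode needs is simply horizontal total x vertical total x refresh rate (totals include blanking). The tweaked-mode timings below are illustrative assumptions, not the XL2720Z's actual Vertical Total values:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock a video mode needs, in MHz (totals include blanking)."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CEA 1080p60 timings (2200 x 1125) -> the familiar 148.5 MHz
print(pixel_clock_mhz(2200, 1125, 60))   # 148.5

# A hypothetical large-Vertical-Total tweak at 120 Hz -- still under the
# 340 MHz limit of dual-link DVI / full-bandwidth HDMI in this example
print(pixel_clock_mhz(2080, 1350, 120))  # 336.96
```

If a tweaked mode's number comes out above 340, that link type simply can't carry it.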
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Image
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

bennzl
Posts: 4
Joined: 05 Mar 2017, 23:19

Re: Presenting the ZOWIE XL2540 240Hz

Post by bennzl » 05 Mar 2017, 23:21

What are the best settings people are using right now?

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: Presenting the ZOWIE XL2540 240Hz

Post by Q83Ia7ta » 06 Mar 2017, 06:49

bennzl wrote:What are the best settings people are using right now?
240Hz

b0t
Posts: 100
Joined: 16 Jan 2017, 00:10

Re: Presenting the ZOWIE XL2540 240Hz

Post by b0t » 06 Mar 2017, 08:23

Chief Blur Buster wrote:FYI -- On my BENQ XL2720Z, all inputs behave the same with Vertical Totals -- HDMI, DVI and DP

NOTE: HDMI doesn't "officially" support it, but it works. The XL2720Z works if you're using HDMI 1.3+ and an HDMI output (e.g. on an NVIDIA graphics card) capable of >60Hz at 1080p. When it succeeds, it behaves like DVI. A dual-link-DVI-to-HDMI adaptor also works (they now exist); it must be capable of full-bandwidth 340MHz HDMI.

I just asked something about the 2720Z and the 2720 Zowie in the other thread; perhaps you would know this, Chief. Is the Zowie XL2720 guaranteed to carry the latest firmware of the 2720Z (or at least one of the latest, say V4 or newer), assuming I buy the Zowie?

I still can't decide between the two and it drives me nuts :p

EDIT : The TWO being the 2540 and the 2720(Z)

Comanglia
Posts: 44
Joined: 13 Oct 2014, 16:06

Re: Presenting the ZOWIE XL2540 240Hz

Post by Comanglia » 07 Mar 2017, 09:11

Paul wrote:While I'd agree with the ""pros"" (double quotation marks intentional) that G-sync isn't a point of interest to them due to somewhat increased input lag (though I call BS that anyone can feel the subtle difference) and the fact that at >100Hz tearing isn't a big deal, I would laugh in their faces if they said that "BBR/ULMB isn't what they need". If I were into super competitive gaming, I'd definitely "sacrifice" <1 frame of additional input lag for the huge benefit of greatly reduced motion blur. Especially when we're entering the 240Hz zone, <1 frame (0 ~ 4ms) of input lag isn't detectable by human senses by any means, and if someone claims they can detect it, they're sitting too high on their MLG-branded horse and need to get down ASAP.

From the pictures posted in this thread I see that motion blur at 240Hz is still vastly worse than at 120Hz strobed, and not much of an improvement over 144Hz unstrobed. Is it technically possible to strobe a 240Hz panel without having loads of unwanted ghosting artifacts? I vaguely remember reading in some topic that for TN panels, strobing anything higher than 120~130Hz will bring unavoidable multi-ghosting. Should I hold my breath, or give up on the idea of 240Hz strobed panels that perform well? If the latter, I'll keep using the XL2720Z because I got addicted to ultra-clean motion in games. I tried playing on my friend's non-strobed monitor and I almost threw up at the blur he was getting, hence a 240Hz non-strobed panel will most likely feel similar.
How in the hell do you think someone can't notice 4ms of input lag?

Can you tell the difference between 60Hz and 75Hz? That's only a 3.3ms difference between refreshes. Hell, I can tell the difference between 120Hz and 144Hz, and that's only a 1.4ms difference per refresh. Even with mice, 125Hz vs 250Hz or higher polling is obvious, and that's only a 4-6ms improvement.

I'm not going to say you're wrong about 240Hz ULMB vs regular 240Hz for overall performance but to say no one could tell the difference between 4ms of input lag is a bit dubious.
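For reference, those per-refresh and per-poll numbers come straight from the interval arithmetic (1000 / Hz); a quick sketch:

```python
def interval_ms(hz):
    """Duration of one refresh (or poll) cycle in milliseconds."""
    return 1000.0 / hz

# 60 Hz vs 75 Hz: ~16.67 ms vs ~13.33 ms -> ~3.33 ms shorter per refresh
print(round(interval_ms(60) - interval_ms(75), 2))    # 3.33
# 120 Hz vs 144 Hz -> ~1.39 ms shorter per refresh
print(round(interval_ms(120) - interval_ms(144), 2))  # 1.39
# 125 Hz vs 250 Hz mouse polling -> 4 ms shorter per poll
print(round(interval_ms(125) - interval_ms(250), 2))  # 4.0
```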

User avatar
lexlazootin
Posts: 1251
Joined: 16 Dec 2014, 02:57

Re: Presenting the ZOWIE XL2540 240Hz

Post by lexlazootin » 08 Mar 2017, 05:55

Telling the difference between 60Hz and 75Hz is one thing; telling the difference a 4ms input latency delay makes is another.

Two very different things here. Although I can't find it right now, there was a small game/test made by flood (I think) where you could test different mouse latencies. It became VERY difficult around the ~10ms mark.

Comanglia
Posts: 44
Joined: 13 Oct 2014, 16:06

Re: Presenting the ZOWIE XL2540 240Hz

Post by Comanglia » 08 Mar 2017, 09:52

lexlazootin wrote:Telling the difference between 60Hz and 75Hz is one thing; telling the difference a 4ms input latency delay makes is another.

Two very different things here. Although I can't find it right now, there was a small game/test made by flood (I think) where you could test different mouse latencies. It became VERY difficult around the ~10ms mark.
I can tell the difference in input delay between Zowie and Logitech mice. Typically that's only a difference of ~7ms; it's kinda hard to set up a test where I can't be biased, though.

b0t
Posts: 100
Joined: 16 Jan 2017, 00:10

Re: Presenting the ZOWIE XL2540 240Hz

Post by b0t » 10 Mar 2017, 07:32

The mouse poll rate difference from 125Hz to 500Hz, and from 500Hz to 1000Hz, is quite huge. 60Hz vs 75Hz screen refresh is also noticeable (or I have a super keen eye). I can tell the difference (not as easily, though) between 120Hz LightBoost and 144Hz on my VG as well.

Can anyone confirm what I asked about the 2720Z & 2720 Zowie firmware?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Presenting the ZOWIE XL2540 240Hz

Post by Chief Blur Buster » 10 Mar 2017, 16:28

Paul wrote:Especially when we're entering the 240Hz zone, <1 frame (0 ~ 4ms) of input lag isn't detectable by human senses by any means and if someone claims they can detect it, they're sitting too high on their MLG-branded horse and need to get down ASAP.
There are subtle effects to consider:

Don't forget the "cross-the-finish-line-first" effect / "race-to-the-trigger" effect in simultaneous-draw situations (e.g. shooting each other at the same time).

In Olympics sprints, beating the finish line 1ms earlier can mean you win the race.

So you don't need to feel the 1ms difference in order to win by 1ms.

In this situation, you've got skilled humans, some with almost the same reaction times (e.g. humans with an average "X" ms reaction time competing against a human with an (X+1) ms reaction time).

If you can save 2 or 3ms of input lag in your hardware, that can give the slightly slower-reacting human (X+1 ms) a statistical leg up on the faster-reacting humans (X ms reaction time). Less input lag in your hardware subtracts from your total reaction time, and reaction times are often well-matched in the rarefied heights of eSports gaming. It will all vary a lot and skill plays a huge role, so there's lots of fuzz factor, but over time, many matches and many competitions later, that few-milliseconds lag advantage can build up statistically.

Now, if we're talking about old engines like CS:GO and Quake, we run extreme framerates at extreme refresh rates, on a very tight button-to-pixels cycle which can total only a few tens of milliseconds in high-speed-camera tests. Shaving 2-3ms off that can mean a more than 10% shorter button-to-pixels cycle in some overkill-system-with-old-engine situations. That's quite a leg up on your competitors in well-matched reaction-time situations, even if you cannot feel the 1ms.

Let's say we have 5 champion players. On a specific reaction-time tester (they vary a lot, due to various added lag), imagine these 5 champion players have average reaction times of 160ms, 161ms, 162ms, 162ms and 163ms. With a spread that tight, 2ms can really give you a leg up on the human reaction times of the top leagues. Just like the top 5 Olympic sprinters can cross the finish line just milliseconds apart.
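That finish-line effect can be sketched with a toy Monte Carlo duel. The 161ms mean and 15ms spread below are illustrative assumptions, not measured data; the point is only that a lag advantage smaller than anything you can "feel" still shifts the win rate above 50%:

```python
import random

def duel_win_rate(lag_advantage_ms, trials=100_000, mean=161.0, sd=15.0):
    """Fraction of simultaneous-draw duels won by the player whose
    hardware shows the frame `lag_advantage_ms` sooner. Both players
    draw reaction times from the same normal distribution."""
    random.seed(42)  # fixed seed so the sketch is reproducible
    wins = 0
    for _ in range(trials):
        a = random.gauss(mean, sd) - lag_advantage_ms  # lower-lag rig
        b = random.gauss(mean, sd)
        if a < b:  # whoever's shot lands first wins the duel
            wins += 1
    return wins / trials

print(duel_win_rate(0.0))  # close to 0.50 -- no hardware edge
print(duel_win_rate(2.0))  # a bit above 0.50 -- the 2ms edge shows up
```

Neither player ever perceives the 2ms, yet over many matches the lower-lag rig wins measurably more often.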

So even if you can't "feel" that 1ms, it can help you "race-to-the-trigger" first in simultaneous-draw situations.

In normal casual FPS gaming, you're competing with all kinds of people with all kinds of reaction times, so that lag advantage does not matter nearly as much.

Scientific studies and random tests on casual gamers have already shown that 2ms of lag usually doesn't matter for casual FPS.

But when it comes to the rarefied heights of the upper leagues -- well-matched reaction times, well-matched elite skills, and very low-lag pipelines (e.g. GTX 1080s on CS:GO) -- that 2ms can begin to make a statistical difference in beat-to-the-finish-line situations. The milliseconds don't matter for a high school gym class race, but matter a lot for a 100 meter Olympic race.

If you're the one seeing that frame sooner, your finger hits the fire button equally sooner in the identical-reaction-time situations of the top professional leagues. Two competitors in an FPS game run into each other around a corner. They suddenly see each other simultaneously. They shoot at the same time. Both have exactly the same reaction time. Who wins? Yup -- input lag begins to statistically matter in these specific situations, even 2ms. Race-to-the-trigger, just like a 100 meter Olympic race. You don't have to feel the 2ms to win by 2ms.

Limitations in games (game tick rates, microstutters, mouse poll rate limitations, pauses from disk access, rendertime hiccups, etc.) fudge the lag statistics quite a lot at the millisecond level. Sure, there are rounding effects due to game tick rates and frame refresh cycles, but your retinas are still seeing refresh cycles 2ms sooner, and your button press transmitted to the server might round off to the previous tick cycle -- potentially fragging your enemy one full tick cycle before them, even if the tick cycle is closer to 10ms (e.g. 64Hz or 128Hz tick rate). And depending on the engine, that may mean sudden rounding off to an identical outcome (0ms difference, unseen) versus an occasional huge difference (1/64sec difference, visible to the human eye). So the 2ms occasionally amplifies into human-visible differences through rounding effects, depending on how a particular game handles it.

Humankind is full of unexpected surprises, including the ability to tell apart 1ms strobing from 2ms strobing (LightBoost 10% vs 100%) and the ability to tell apart a 500Hz vs 1000Hz mouse poll rate -- there are possibly hidden effects that occasionally amplify 2ms of input lag into something human-visible, such as noticing muzzle flashes becoming visible 15ms earlier roughly 20%(ish) more frequently because you had 2ms less lag. But we're not really talking about that here (it might not be happening, but again, there are many 'unexpected effect' factors continually being discovered) -- milliseconds can matter anyway, even if not perceived, due to the finish-line effect.
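A minimal sketch of that tick-rounding effect, assuming a 64Hz server tick and that a press is processed on the next tick boundary after it arrives (a simplification of how real netcode works):

```python
import math

TICK_HZ = 64                   # common server tick rate
TICK_MS = 1000.0 / TICK_HZ     # ~15.625 ms between ticks

def tick_of(press_ms):
    """Tick cycle that processes a press (rounded up to the next tick)."""
    return math.ceil(press_ms / TICK_MS)

# 2 ms apart, straddling a tick boundary: a full tick (~15.6 ms) apart
print(tick_of(15.0), tick_of(17.0))  # 1 2
# 2 ms apart within the same tick window: identical outcome, 0 ms apart
print(tick_of(20.0), tick_of(22.0))  # 2 2
```

The same 2ms either vanishes entirely or gets amplified to a whole tick, depending on where it lands relative to the boundary.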

In modern graphics-heavy games on midrange systems, 2ms is easily lost in the noise of a very jittery 100ms button-to-pixels pipeline -- especially when matched against plainly casual gamers with highly variable reaction times, over packet-jittery Internet connections.

But CS:GO with 4GHz + 240Hz + GTX 1080 + VSYNC OFF + SSD + LAN + >500fps? And playing in the top leagues (little variance in reaction times)? This is when 2ms starts statistically affecting results: well-matched skill/reaction, simultaneous-draw situations, even if you cannot feel the 2ms.

Sure, most of the time 2ms doesn't matter. But that neglects the fact that 2ms is important in the 100 meter Olympic sprint -- and also in eSports race-to-the-trigger, simultaneous-draw situations. Even if you can't feel the 2ms.

That said, many pro/eSports leagues do require identical hardware to prevent input-lag differences from giving anybody an advantage.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Post Reply