Looking for the best Competitive Gaming Monitor

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
User avatar
RealNC
Site Admin
Posts: 3756
Joined: 24 Dec 2013, 18:32
Contact:

Re: Looking for the best Competitive Gaming Monitor

Post by RealNC » 28 Apr 2014, 09:53

For League of Legends, any 60Hz monitor is enough.

Fast monitors are only important for games with very fast animations.

Note that there is some very hefty marketing going on around LoL. Professional players are paid quite large amounts of money to claim that gaming monitors improve their performance in this game.

However, given that 120Hz monitors can be found quite cheap today, you don't need to stay at 60Hz. But you don't need to spend $400 on a monitor to play LoL competitively or even professionally.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Looking for the best Competitive Gaming Monitor

Post by Chief Blur Buster » 28 Apr 2014, 16:05

There are definitely human-visible reasons where 1000Hz can be preferable.
flood wrote:There's no way that 2ms vs 1ms of mousing position jitter is distinguishable when gaming, especially when network latencies, frame rendering times, server timings, etc... fluctuate by the same order of magnitude.
Although there are points of diminishing returns, it's definitely not zero. So I disagree. Some facts:
1. Network latencies have no bearing on local-environment fluidity effects in many games (e.g. texture detail during strafing and turning).
2. Frame rendering times in some older-engine games (e.g. CS:GO) do not fluctuate by an order of magnitude. Frame render times are extremely consistent in certain game engines, with less than 1ms of jitter. Source engine games are able to run at 300-600+ frames per second on my GeForce GTX Titan, and the render-time variability between the majority (>90%) of immediately _adjacent_ frames is far less than 0.5ms. There ARE situations where this is not the weak link.

You can see a _local_ visual microstutter difference between 500Hz and 1000Hz. 4000 pixels/second fast panning (2 screenwidths per second) has a more erratic 8pixel-amplitude microstutter (2%) at 500Hz, and a smoother 4pixel-amplitude microstutter (1%) at 1000Hz; and my eyes can see it when the following assumptions are met:
(A) The game engine is not the limiting factor (e.g. Source Engine on a GTX Titan)
(B) The computer is not the limiting factor (e.g. i7 CPU with SSD)
(C) Motion blur is not the limiting factor in the ability to see microstutters (e.g. using LightBoost, which makes microstutters easier to see)
(D) You are doing actions that don't depend on the network (e.g. panning motion that's not network-synchronized)
(E) The mouse pad is not your limiting factor (e.g. a high-precision mouse pad).
The effect can be seen during VSYNC OFF, VSYNC ON, or GSYNC. (Note that stutter-free motion from GSYNC and fully-capped-out VSYNC ON can also amplify the visibility of external microstutter factors such as 500Hz vs 1000Hz; however, the improvements are still visible during VSYNC OFF, albeit not as strongly.) Now, provided your system and configuration meet the above assumptions, three exercises are provided below to "see for yourself".
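The amplitude figures above follow from simple division of motion speed by polling rate. A minimal sketch (assuming an idealized constant-speed pan where the mouse poll interval is the only jitter source; function name is my own, for illustration):

```python
# Worst-case microstutter amplitude is one polling interval's worth of motion:
# the image can land up to (speed / poll_rate) pixels away from its ideal spot.
def microstutter_amplitude_px(speed_px_per_s: float, poll_hz: float) -> float:
    """Pixels of on-screen motion per mouse polling interval."""
    return speed_px_per_s / poll_hz

speed = 4000  # 2 screenwidths/second on a 1920-wide display
for hz in (500, 1000):
    print(f"{hz}Hz polling @ {speed} px/s -> "
          f"{microstutter_amplitude_px(speed, hz):.0f} px microstutter amplitude")
# 500Hz -> 8 px, 1000Hz -> 4 px, matching the figures above
```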

The clarity difference is also noticeable when dragging a web browser window around quickly while trying to read the text within it. (Make sure you have the Windows 8.1 mouse fluidity fix installed; otherwise window dragging occurs at a mouse rate of only 200Hz.) To be able to read text, you need it to run as smoothly as http://www.testufo.com/framerates-text ... But due to mouse Hz aliasing with refresh Hz, mouse dragging is never that smooth. This is where 1000Hz is half as microstuttery as 500Hz, especially when motion blur is not hiding the microstutters.

EXERCISE #1: First, Witness Reference-Quality Motion Fluidity
Also known as the butter-smooth CRT effect, as witnessed in Nintendo-smooth Super Mario Bros. panning on a CRT, etc.
1. Turn on your strobe backlight (LightBoost), preferably at a lower persistence setting (e.g. LightBoost 10%)
2. Turn VSYNC ON temporarily. Yes, it adds lag, but this isn't what we're testing at the moment.
3. Load an older game such as Source Engine (using at least a Geforce 670 or faster)
4. Verify it is running at full frame rates at all times.
5. Strafe left/right using keyboard. (eliminates mouse as a source of microstutters)
6. Observe that motion is perfect fluidity (zero stutter, zero tearing, zero blur) similar to stutter-free TestUFO.com motion.
7. Define the above as 'reference quality' motion fluidity.
.....(zero microstutter, exact distance stepping between frames; perfect like fast scrolling in an 8-bit game)
8. This confirms that with some popular FPS games (e.g. CS:GO) you can reproduce perfect-fluidity motion under certain conditions (0-pixel microstutter). This provides the necessary environment for detecting mouse microstutters.

EXERCISE #2: Microstutter Comparison
1. Turn on your strobe backlight (LightBoost), preferably at a lower persistence setting (e.g. LightBoost 10%)
2. Load an older game such as Source Engine (using at least a Geforce 670 or faster)
3. Test the following
....a. Keyboard strafe left/right
....b. Mouse 1000Hz
....c. Mouse 500Hz
4. You will observe that, from a microstutter perspective, a is better than b, which is better than c (fluidity a > b > c).
5. This definitively demonstrates that the mouse adds microstutters that the keyboard does not.

EXERCISE #3: Use Case Where 500Hz vs 1000Hz Makes a Visible Difference In Game
This is not applicable to competitive gamers who usually keep their eyes stationary on the crosshairs (sponsored/pro players tend to do this). It is more applicable to gamers who track their eyes away from the crosshairs.
1. Turn on your motion blur eliminating strobe backlight (e.g. LightBoost)
2. Load an older game such as Source Engine (using at least a Geforce 670 or faster)
3. Find high-detail screen.
4. Do a mouse turn, causing the screen to pan approximately one screenwidth per second (2000 pixel/second pan).
5. While the screen is panning, track your eyes on things that pass by from one edge of the screen to the other (e.g. like you would otherwise do with http://www.testufo.com#pps=1920 ...)
6. Repeat the above at 500Hz, and at 1000Hz.
7. Test the above with VSYNC OFF, then with VSYNC ON. If you have GSYNC, test with GSYNC too.
500Hz VSYNC ON
1000Hz VSYNC ON
500Hz VSYNC OFF
1000Hz VSYNC OFF
Compare the microstutters to the reference quality motion (witnessed by keyboard strafing+VSYNC ON).
8. You will observe less microstutter with 1000Hz than with 500Hz during the 2000 pixels/second pan, especially in certain monitor modes (500Hz-vs-1000Hz visibility is amplified in strobed mode or GSYNC mode)
9. It is more noticeable at certain sensitivity settings (e.g. low game sensitivity / high mouse sensitivity)

2000 pixels/second is a motion speed intentionally chosen because it's a very fast screen motion speed common in FPS games, but not too fast for eye-tracking (most gamers can still essentially perfectly track objects moving at ~2000 pixels/second in http://www.testufo.com motion test patterns). Since you can still count the eyes of the single-pixel-eyed TestUFO alien when LightBoost is enabled (motion blur no longer obscures the alien), any microstutter at all (even a 1-2 pixel amplitude) creates a jittery effect that obscures single-pixel details (and in many cases 2-pixel-thick or 3-pixel-thick detail).

Now that you've done these exercises, and seen this for yourself, you can finally stop disagreeing about 500Hz vs 1000Hz. ;) For most gameplay tactics (e.g. staring at crosshairs) it is not going to help competitive scores, but the visual benefits of 1000Hz over 500Hz ARE noticeably there for people who prioritize visual quality during eye-tracking situations (eye-tracking of fast-moving on-screen objects).

TL;DR: You do not see the Hz itself, but you DO see the reduced microstutter beat-frequency effects between display Hz and mouse Hz when you eliminate all the other limiting factors.
flood wrote:well maybe I'm wrong and someone can prove me wrong with a blind test (not some anecdotal impressions which are subject to confirmation bias).
If you know Mathematics 101 and Physics 101, you'll understand the beat-frequency effects as well as I do. ;)

When you're tracking moving objects at 1000 pixels/second, with single-pixel-thick details like at http://www.testufo.com, you need 0-pixel-amplitude microstutter to accurately track moving single-pixel-thick objects at this motionspeed. If you already have a LightBoost display, you're already witnessing your ability to count the number of eyes in the TestUFO alien moving at a motionspeed near 1000 pixels/second. If you don't have LightBoost, you can see the photos at 60Hz vs 120Hz vs LightBoost, which shows that persistence can be lowered to the point where your eyes are able to track single-pixel-width objects travelling at 1000 pixels/second. This is where even a single pixel vibration (single pixel microstutter) actually begins to become noticeable.

If you have mathematical aliasing between mouse Hz and display Hz, you have microstutter that interferes with your ability to track single-pixel-width objects. Aliasing between 500Hz (mouse) and 120Hz (refresh) creates double the microstutter vibration amplitude compared to 1000Hz (mouse) and 120Hz (refresh).

A 500Hz mouse at 1000 pixels/second can have a 2-pixel aliasing error (1000/500 = 2). A 1000Hz mouse at 1000 pixels/second can have a 1-pixel aliasing error (1000/1000 = 1). This assumes the mousepad and sensor are not the limiting factor; sensors and mousepads are getting better, so this isn't usually the issue. The 2-pixel aliasing error at 1000 pixels/second multiplies to a 4-pixel aliasing error during faster 2000 pixels/second motion. This frequently occurs in FPS games during one-screenwidth-per-second panning, often found in medium-speed 180-degree turns. If you attempt to track your eyes on moving objects that pass by the screen while you flick at 2000 pixels/second, this is WHEN 500Hz versus 1000Hz makes a visually noticeable difference, but ONLY if you *track* your eyes on moving objects during a 2000 pixels/second mouse flick (or other speeds that are fast, but not too fast for your eyes to track -- e.g. 3000 pixels/second). Game players who like to shoot while turning fast will find this relevant, but it depends on the gameplay tactics you use; most pro players I observe tend to use different targeting techniques than this, though it really varies with the game and league you play. People who are used to CRT-optimized gameplay tactics (e.g. circle strafing) will see the 500Hz-vs-1000Hz difference more easily than people who grew up with LCD displays using different gameplay tactics (e.g. strafing while shooting faraway enemies).

If you run at a fixed frame cap (e.g. 125fps Quake Live), then 500Hz vs 1000Hz aliasing makes less of a difference since both are divisible by 125. However, if you let your framerate fluctuate/float, then 500Hz vs 1000Hz makes a more noticeable difference due to the reduced aliasing/beat-frequency effects between mouse Hz and display Hz.
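The aliasing/beat-frequency effect described above can be simulated in a few lines. This is a simplified sample-and-hold model of my own (an illustration, not Blur Busters code): the displayed position at each refresh is the motion quantized to the most recent completed mouse poll, and the spread of the per-frame error is the microstutter.

```python
def frame_errors(speed_px_s, poll_hz, frame_hz, n_frames=600):
    """Position error (ideal minus displayed) at each refresh, when motion is
    quantized to the most recent completed mouse poll (sample-and-hold model)."""
    errors = []
    for f in range(n_frames):
        completed_polls = (f * poll_hz) // frame_hz  # integer math, no rounding drift
        ideal = speed_px_s * f / frame_hz
        shown = speed_px_s * completed_polls / poll_hz
        errors.append(ideal - shown)
    return errors

for poll in (500, 1000):
    e = frame_errors(1000, poll, 120)
    print(f"{poll}Hz mouse, 120Hz display: {max(e) - min(e):.2f} px jitter span")
# At 120Hz, the 500Hz jitter span is over twice the 1000Hz span.
e = frame_errors(1000, 500, 125)
print(f"500Hz mouse, 125fps cap: {max(e) - min(e):.2f} px jitter span")
```

With a 125fps cap, both 500Hz and 1000Hz divide evenly into the frame rate, so the error is identical every frame and no beat pattern emerges, consistent with the fixed-frame-cap point above; at 120Hz neither rate divides evenly, and the smaller 1000Hz quantization step shrinks the jitter.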

One needs an understanding of high-school or undergraduate math/physics in order to understand the aliasing effect (and how it actually correlates with visual observations). This is why 500Hz vs 1000Hz can actually begin to make a visually noticeable difference in the modern era of strobe-backlight displays (LightBoost). Sometime this year, I will likely draw some diagrams and write an upcoming Blur Busters article to explain microstutter mathematics better. The math is simple enough to be indisputable.

TL;DR: If you do not understand "aliasing" or "beat-frequency" effects as a general math concept, you won't understand the science of why 500Hz versus 1000Hz mouse actually can make a visible difference on modern gaming displays.

Bottom line -- it may not matter to competitive gamers who use certain "pro" gameplay tactics (e.g. strafing until faraway enemies slowly pass under the crosshairs) that do not depend as much on motion clarity/fluidity. And 1000Hz accuracy *can* degrade compared to 500Hz on some mice. It may be more important to run at 500Hz if your specific game, setup, and gameplay tactics warrant it. Elite competitive gamers *do* recommend 500Hz from time to time, for a reason. However, the recommendation is getting weaker in the 120Hz era, because motion fluidity noticeably suffers with strobed/GSYNC/120Hz+ displays at only 500Hz compared to 1000Hz. Generally, it's best to go with 1000Hz when using the newer display technologies that eliminate themselves as the microstutter weak link (e.g. strobing or GSYNC, where motion blur no longer hides microstutter, making mouse microstutter easier to see). There ARE now certainly situations (when prioritizing tracking motion fluidity) where 1000Hz is very noticeably better than 500Hz.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Looking for the best Competitive Gaming Monitor

Post by Chief Blur Buster » 28 Apr 2014, 17:47

flood wrote:
Q83Ia7ta wrote:Many competitive gamers notice the input lag caused by LightBoost; it's only 4ms but noticeable.
I'd take that with a grain of salt.
It's small, often not felt, but there's a legitimate explanation why it can affect frag scores:

If you're aiming the crosshairs very fast, e.g. 2000 pixels/sec, a 4ms sudden increase in lag (compared to the display you trained on) means you overshoot your target by 8 pixels (2000 × 0.004 = 8) compared to your normal aiming. Which means you have to correct by mousing back and forth (like swerving a steering wheel back and forth when you're skidding) until your crosshairs are lined up. The more lag delta compared to what you trained for, the more back-and-forth correcting you have to do, until you're retrained for that specific latency. Certainly, a 4ms differential lag is easy to retrain for, so you can fix your muscle memory. But that's not the only factor. Crosshair aiming lag is largely unaffected by network latency, and if your frame display times are consistent (e.g. CS:GO), the sudden 4ms change becomes the primary latency 'error' factor.
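That overshoot arithmetic generalizes: extra lag multiplied by flick speed gives the pixels of overshoot to correct. A tiny sketch (function name is my own, for illustration):

```python
def overshoot_px(flick_speed_px_s: float, extra_lag_ms: float) -> float:
    """Pixels the crosshair travels past the target during unexpected extra lag."""
    return flick_speed_px_s * extra_lag_ms / 1000.0

# 2000 px/s flick with a surprise 4ms of added lag:
print(overshoot_px(2000, 4))  # -> 8.0 pixels of overshoot to correct
```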

Highly trained professional game players notice when their lag differential changes, because their fast-aiming ability gets screwed up, and they need to retrain their muscle memory for a specific display latency. In games where frame rendertimes are so consistent, ~5ms changes in latency actually become noticeable in terms of lengthened back-and-forth correcting in your aiming. You don't feel it directly but aiming times becomes longer because your muscle memory is not used to the latency changes.

People who have trained from the very beginning on the brighter models of "LightBoost" displays, and compete in arenas that permit LightBoost displays, will have a better competitive advantage than a competitive player who merely toggles it on, if they are playing games that demand low motion blur (e.g. tracking camouflaged enemies passing by at high speeds). So strobe backlights can, in certain cases, give a competitive advantage that outweighs the latency. However, consider the competitive gamers who keep their eyes stationary on the crosshairs. (You only notice display motion blur while moving your eyes around, which means moving your eyes away from the crosshairs -- see http://www.testufo.com/eyetracking for an example of no observed motion blur when the eyes are stationary.) Players who stare at stationary crosshairs will not notice the benefits of LightBoost, and would instead feel only the input lag degradation and brightness degradation. From this perspective, some elite pro competitive gamers are not advocates of LightBoost at all. This is understandable; blur reduction is a good tool when properly used for the proper game and proper training (and when competing against people who won't give you a disadvantage during blur reduction).

On the other hand, if you walk into a competition that uses strobe-capable monitors, this may be an opportunity to gain an advantage over other players who don't train with strobing. Some competitions use strobing from the start (e.g. Eizo FG2421). As long as you've trained to compensate for its input lag and take advantage of its benefits, it may help, but it is a decision you will have to make as to whether or not it is beneficial.

Now, there's also the "cross the finish line" effect. Both players shoot at each other simultaneously. The guy with 4ms less lag will be the one who wins that simultaneous draw. So you had better be very sure that the motion clarity advantages (faster human reaction time) outweigh the minor lag disadvantage, e.g. being able to react faster without motion blur. Not all games provide enough "win" opportunities from the elimination of motion blur, so choose your motion blur reduction carefully in your competition training, since you're going to be perpetually handicapped by 4ms relative to other users playing on the same equipment. When people have fast <~150ms reaction times that only vary slightly in many situations, a 4ms advantage is a huge chasm when you're competing with the top 10 players -- mere milliseconds of difference in trigger time (during simultaneous draws) separate them!

Be warned that elite players such as Fatal1ty have already said LightBoost doesn't benefit them, due to the color degradation, brightness degradation, and added input lag -- especially when playing in brightly-lit convention centers where LAN parties and competitions are held; you need blazing monitor brightness for those environments. That said, newer strobe backlights don't have as many problems as LightBoost does (e.g. the color or brightness problems), so strobe backlights are becoming more of a viable tool. But they must be used carefully, for the correct games, and trained for carefully (e.g. trained on and used in the competition) -- making sure everything else outweighs the 4ms disadvantage. Strobe backlights (LightBoost and newer tech) make a big difference for a lot of things (especially models that are brighter, have adjustable gamma, and better color than LightBoost). But when you hit the elite leagues (the rarefied field that players like Fatal1ty play in), you may be at a disadvantage if you enter a competition where the monitors don't have strobing capabilities, competing against players that aren't using strobed monitors. It's a nice feature to have around, as long as you use it wisely.

TL;DR: You may not feel the 4ms directly, but the 4ms input lag deltas actually matter a great deal from a "muscle memory" perspective, and to a lesser extent via the "cross the finish line" effect (which matters more in the elite *paid money* leagues).

DaveKap
Posts: 16
Joined: 17 Dec 2013, 17:39

Re: Looking for the best Competitive Gaming Monitor

Post by DaveKap » 28 Apr 2014, 18:13

I'm not as technically minded as the smart folks posting thorough responses, and I may not be a competitive gamer either; I will, however, make one suggestion as a person who recently had to return a TN monitor in favor of an IPS monitor.

MOBAs like LoL are dependent on the player being able to distinguish what's going on during hectic moments. LoL does a very good job at ensuring the players, their spells, and the environment are all of a separate color and theme in order to keep distinguishability high. DotA and HoN did not do this very well at all. It's something Blizzard takes care of in StarCraft and Valve took (until hats) care of in Team Fortress 2. Certain developers understand the importance of "silhouettes" and Riot is one of them.

However, when you play on a TN monitor -- which almost all over-60Hz monitors are -- the colors will likely be washed out from their natural beauty: most TN panels are 6-bit-per-channel and rely on dithering, producing desaturation and banding compared to true 8-bit IPS panels. Some people don't notice, some people don't care, and some people think the benefit of a higher Hz is worth the poorer viewing angles and more pronounced color banding. I'm of the opinion that accurate colors are more important than a 120Hz+ refresh rate, and I would keep that opinion if I were a competitive gamer. Whether you share that opinion or not is something you'll need to discover for yourself, so the only real advice I can give is as follows:
Buy a monitor that you know has a good return policy. Newegg had a no-questions-asked policy on the one I bought, so at the end of the day I paid zero cents to test and return the 144Hz BenQ I wanted so badly. If you can manage to get this kind of policy on the monitor you're looking at, snatch it up immediately. There is no such thing as a "best" competitive monitor when your ability to discern what's on the screen is completely subjective. Once you know which side of the TN/IPS spectrum you fall on, you can narrow down which monitors really are the best. Be careful and enjoy the experience!

Edit: Just to be thorough, I checked Newegg to remind myself what their free returns policy was. Turns out they've converted it over to their "Newegg Premiere" program, which is a membership-based program. However, they do have a free 30-day trial for that right now, so you could still manage to get through a few monitors before that runs out. I'll leave you to research other places with good return policies. Just make sure they don't force a restocking fee. That's where the hidden fees live.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Looking for the best Competitive Gaming Monitor

Post by Chief Blur Buster » 28 Apr 2014, 18:45

DaveKap wrote:MOBAs like LoL are dependent on the player being able to distinguish what's going on during hectic moments. LoL does a very good job at ensuring the players, their spells, and the environment are all of a separate color and theme in order to keep distinguishability high. DotA and HoN did not do this very well at all. It's something Blizzard takes care of in StarCraft and Valve took (until hats) care of in Team Fortress 2. Certain developers understand the importance of "silhouettes" and Riot is one of them.
Excellent point. Some games have super-saturated colors where TN color quality doesn't matter very much in the competitive leagues -- to the point where the high refresh rate really shines. Other games have heavily subdued colors where IPS color can make a significant improvement. Games such as DotA obviously don't benefit as much from high refresh rates and strobing as Team Fortress 2 does (fast movement & highly saturated colors).

Note that many tournaments don't let you bring your own monitor, but provide monitors instead (to level the playing field). If the tournament is running DotA on the ASUS VG248QE, then you want to train for the color handicap anyway, so you're prepared when you enter the sponsored professional competition with the real-money pot.

Obviously, you want to use the right tool for the right job.

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: Looking for the best Competitive Gaming Monitor

Post by Q83Ia7ta » 29 Apr 2014, 21:39

Chief, Quake Live has had a 250fps limit since December 2013.

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: Looking for the best Competitive Gaming Monitor

Post by Q83Ia7ta » 29 Apr 2014, 22:01

Chief Blur Buster wrote: Bottom line -- it may not matter to competitive gamers that use certain "pro" gameplay tactics (e.g. strafing until faraway enemies slowly pass under crosshairs) that do not depend as much on motion clarity/fluidity. And 1000Hz accuracy *can* degrade compared to 500Hz on some mice. It may be more important to run at 500Hz if your specific game, setup, and gameplay tactics warrants it. Elite competitive gamers *do* recommend 500Hz from time to time, for a reason. However the recommendation is getting less strong in the 120Hz era because motion fluidity definitely noticeably suffers with strobed/GSYNC/120Hz+ when using only 500Hz than with 1000Hz. Generally, it's really best to go with 1000Hz when using the newer display technologies that eliminate themselves as the microstutter weak link (e.g. motion blur hiding microstutter, or asymmetry between refresh rate where microstutters are easier strobing/GSYNC because of reduced visibility of microstutter. There ARE now certainly situations (when prioritizing tracking motion fluidity), that 1000Hz is very noticeably better than 500Hz.
Talking about 1000Hz vs 500Hz mouse polling is like talking about a Korean IPS 120Hz vs a BenQ 120Hz. From my experience, I can't notice any difference in my gaming performance between a Zowie FK at 1000Hz (its default and maximum) and a Logitech G100s at 500Hz (its default and maximum), while at the same time the difference between 144Hz and 120Hz LightBoost (via the strobe utility) is huge for me on a BenQ XL2411T. Maybe I'm just not that competitive a player... not sure.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Looking for the best Competitive Gaming Monitor

Post by flood » 01 May 2014, 21:08

Chief Blur Buster wrote:For most gameplay tactics (e.g. staring at crosshairs) it is not going to help competitive scores,
That's what I meant, i.e. no one would experience a change in gameplay performance from switching to 1000Hz.

I agree that it is observable under the particular circumstances you described, but it's hard for me to think of an in-game situation significantly affected by an additional 1ms of microstutter. Unless you're looking really hard for it, it's not going to affect your enjoyment of the game or your performance.
Similarly, I think no one would complain about the 1ms persistence of strobed monitors causing motion blur, but it's probably noticeable if you look hard enough.

Basically what I'm saying is: use 1000Hz if you can, but if your mouse only goes up to 500Hz, don't worry about it.
Chief Blur Buster wrote:but in actuality the 4ms input lag deltas actually matters a huge deal from a "muscle memory perspective"
I don't know where you got "huge" from. 4ms may affect performance (mostly before becoming accustomed to it), but there are surely more significant factors contributing to gamers' subjective comments on the input lag due to LightBoost.

One thing is that humans react more slowly to dimmer images.

mello
Posts: 251
Joined: 31 Jan 2014, 04:24

Re: Looking for the best Competitive Gaming Monitor

Post by mello » 02 May 2014, 06:30

flood wrote:One thing is that humans react more slowly to dimmer images.
Now, that is interesting! Can anyone confirm that? If true, I wasn't aware of it, but subconsciously I was thinking that something was up, and that it might have something to do with brightness. I have been using a BenQ XL2420T (rev 2.0) for 3 months now, and I switch between 120Hz LightBoost and 144Hz from time to time, still trying to determine which one is better for me. 144Hz is really great overall, but I think my reactions are better with laggy/stuttery enemy models when using LightBoost. It looks like motion blur screws up my reaction time; add to that laggy/stuttery models (caused by my connection), and most of the time it is impossible to tell where exactly an enemy model is, so I don't know where the hitboxes are. I think LightBoost helps me with that to a certain degree, though of course it can't fix the bottleneck, which is my connection. Now, with LightBoost 10% I noticed that my overall reaction time is worse in comparison to 144Hz; I feel much more responsive at the 144Hz refresh rate. Switching to LightBoost 50% or 100% definitely helps, which would explain why I thought my reflexes were worse with a darker, dimmer image.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Looking for the best Competitive Gaming Monitor

Post by flood » 02 May 2014, 20:16

Yes, but I have no idea how significant the effect is between 10% lightboost and full illumination
http://en.wikipedia.org/wiki/Pulfrich_e ... xplanation

Post Reply