
Re: Experience & Opinion: 240hz displays are blurry

Posted: 09 Jan 2018, 22:04
by lexlazootin
LOL, that thread is terrible... I hate monitor discussions; people misunderstand so much about displays that it's painful. Even with proof in front of them, they ignore it.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 09 Jan 2018, 22:14
by darzo
On the Overwatch forum, for instance, you'll read things like 'it's basically about input lag, and all you need for that is fps, not a monitor' -- this in a 60hz vs 144hz context. A lot of people appear to focus almost exclusively on input lag, pros included, which drives me nuts. Input lag is just how long it takes to see an action on the monitor after the fact; my impression is that at higher refresh rates you can take an action at a more precise and exact moment (combining finer mouse movement with smoother object motion), which is much more valuable. It's a pain in the ass trying to argue that unless you're really prepared and on top of it, and I lack the patience for it, at the very least.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 01:32
by KindOldRaven
darzo wrote:On the Overwatch forum, for instance, you'll read things like 'it's basically about input lag, and all you need for that is fps, not a monitor' -- this in a 60hz vs 144hz context. A lot of people appear to focus almost exclusively on input lag, pros included, which drives me nuts. Input lag is just how long it takes to see an action on the monitor after the fact; my impression is that at higher refresh rates you can take an action at a more precise and exact moment (combining finer mouse movement with smoother object motion), which is much more valuable. It's a pain in the ass trying to argue that unless you're really prepared and on top of it, and I lack the patience for it, at the very least.
What makes me laugh is all the kids thinking it makes even 1% of difference at their skill level whether a panel has 1 or 2ms more latency :lol: My standard response is always kind of harsh, but true nonetheless -- something to the extent of 'you don't actually believe you're that one-in-a-million special snowflake who can notice that difference to the extent that it'll move you up a rank or two, do you? You think you'd stand a chance against a professional player when you're on the fastest 240hz panel @ 500 fps locked and he's on a simple 60hz panel? Really?' That's triggering, of course. But at times it makes them think.

I mean, we all know it matters, but the herd mentality of try-hard competitive gamers really overblows that stuff. Practice and talent are so much more important than any tech advantage -- except perhaps 20+ ms of input lag in fast shooters or *extreme* motion blur, which really would instantly make people play far worse than they normally could.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 01:49
by Chief Blur Buster
Around here, milliseconds matter -- no matter how silly it sounds, we respect the unexpected -- so I don't name-call those who fuss about the millisecond.

Even when a human cannot feel the millisecond, there's also the "cross-the-finish-line" effect. In a 100-meter sprint, you can finish just a tiny bit ahead even if nobody notices until the high-speed video replay. It's the "see each other, react at the same time, frag each other at the same time" scenario.

With near-identical human reaction times, the input lag of equipment can be the deciding factor of a specific reaction-time win.

The "It seems like my reaction time seems better with this lower-lag setup" is a powerful effect even if one can't always feel the millisecond or few directly. When competing in professional leagues, the reaction time spread is tighter, so tiny lag differences matter more. This affects statisticals wins in their favour.

Whether or not it is placebo, I still respect the millisecond.

P.S. I offer incentives/payment for peer-reviewed science studies on eSports competition timings. I commissioned a researcher (spacediver) to do a little informal research on this -- Foreword | Page1 | Page2 | Page3 ... More intensive, rigorous study is desired & needed.

In other contexts (e.g. persistence), it unexpectedly matters too. People have read about how a single millisecond affects the optical illusions in the Eye Tracking and Persistence Of Vision animations. These are motion-blur optical-illusion animations that use the formula "1ms of persistence = 1 pixel of motion blur per 1000 pixels/second" to create precisely generated motion-blur illusions.
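
For anyone who wants to plug numbers into that formula, a trivial Python helper (the example values are illustrative only):

Code:

def motion_blur_px(persistence_ms, speed_px_per_sec):
    """1 ms of persistence = 1 pixel of motion blur per 1000 pixels/sec of motion."""
    return persistence_ms * speed_px_per_sec / 1000.0

# Full-persistence (sample-and-hold) 144Hz display, each frame held ~6.9 ms:
print(motion_blur_px(1000 / 144, 1000))   # ~6.9 px of blur at 1000 px/s
# 1 ms strobed backlight at the same tracking speed:
print(motion_blur_px(1.0, 1000))          # 1.0 px of blur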

And we're already familiar with the fact that 1000Hz mice feel (generally, by many metrics) smoother than 125Hz and 500Hz mice. There are also Microsoft's tests with 1000Hz touchscreens, which made a visible difference.

What I am saying is that milliseconds matter in more ways than expected -- sometimes even to the average user -- even if we don't consciously "notice or feel" the millisecond. So I have come to respect thy millisecond, even when my gut tells me not to believe it.

Certainly there is a lot of FUD/noise and misinformation, though! But I know enough not to kid around about the millisecond (whether it be lag, display artifacts, absolute lag, differential lag, screen response, persistence, poll rate, stutter, frame pacing, etc.). So, around here, we are nice to people who fuss about milliseconds.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 02:05
by RealNC
Opinions are split. Personally, today I consider 1 or 2 milliseconds completely inconsequential, even at the highest possible level of competition. All the "finish line" instances are so rare as to not matter. I'd still be interested in a scientific analysis of how often the "finish line effect" occurs, though. I suspect the answer is "virtually never", but that's just me.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 02:26
by KindOldRaven
Chief Blur Buster wrote:Around here, milliseconds matter -- no matter how silly it sounds, we respect the unexpected -- so I don't name-call those who fuss about the millisecond.

...
You make fair points as always, Chief :) I was purely referring to gamers -- especially younger ones -- who act as if they simply cannot play competitively on anything but the highest refresh rates and the fastest panels pixel-response-wise, and who trash-talk every other option there is... while they still haven't managed to climb out of the most mediocre ranks in competitive play. That kind of sheepish talk triggers me. I played at 'decent' competitive levels (amateur, of course) on an old 60hz LG Flatron while 144hz had already been the standard for a few years, and I got accused of cheating so many times it stopped being funny at one point. And I'm not even that good. Sure, I could've been better with decent gear back then, absolutely -- especially the jump from an old Flatron to any decent 120hz+ monitor would've improved my play -- but I'm not going to pretend it would have made the difference between me being above average or pro, like many on reddit or game-specific forums do.

I guess what I'm trying to say is: wannabe competitive players (meant both positively and negatively...) should not forget that genetics (or 'talent', whatever you want to call it) and, even more importantly, practice are far bigger factors before you reach the level where this stuff becomes exponentially more important (top-level play). I've seen people simply give up because they believe their gear is holding them back so much... and if you've got a half-decent 120hz+ panel and are running a locked 120+ framerate, that simply shouldn't be the case in my opinion. Some of the younger ones should spend less time reading about the game and the hardware they're using, spend that time actually trying to improve their skill, and be honest with themselves instead of looking for every excuse there is for their lack of improvement.

Having said that, I'd love for more people to frequent this website and use its tests (I refer to BlurBusters daily in discussions), because there's much to learn here for most of us, myself included. Every single day I see people tell others to avoid Gsync/Freesync because of the 'insane amount of input lag it induces, completely unusable for competitive PUBG' and stuff like that... :roll: Ah well, I'm very happy to have found this community :D

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 04:54
by lexlazootin
I also very much agree with RealNC: you can't feel 1-2ms, but that doesn't mean I don't try to remove these variables whenever possible. If I could be using an 8kHz polling-rate mouse with <0.1ms debounce, I would, regardless of whether I could feel it or not. Latency adds up quickly.
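
A rough illustration of how it adds up -- the stage values below are invented ballpark numbers, not measurements of any particular mouse, game, or monitor:

Code:

# Hypothetical end-to-end "motion to photon" budget, illustrative numbers only.
stages_ms = {
    "switch debounce":             0.5,
    "USB polling wait (1000 Hz)":  0.5,   # average half of a 1 ms poll interval
    "input sampling by the game":  3.5,   # average wait within a ~144 fps frame
    "render + present queue":      7.0,
    "display processing/scanout":  5.0,
}
for stage, ms in stages_ms.items():
    print(f"{stage:30s} {ms:5.1f} ms")
print(f"{'total':30s} {sum(stages_ms.values()):5.1f} ms")
# Each stage looks tiny in isolation; the chain still lands around ~16-17 ms.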

I respect the millisecond and at the same time I don't think you can feel the difference :lol: Way to sit on the fence.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 06:54
by mello
KindOldRaven wrote: What makes me laugh is all the kids thinking it makes even 1% of difference at their skill level whether a panel has 1 or 2ms more latency :lol: My standard response is always kind of harsh, but true nonetheless -- something to the extent of 'you don't actually believe you're that one-in-a-million special snowflake who can notice that difference to the extent that it'll move you up a rank or two, do you? You think you'd stand a chance against a professional player when you're on the fastest 240hz panel @ 500 fps locked and he's on a simple 60hz panel? Really?' That's triggering, of course. But at times it makes them think.

I mean, we all know it matters, but the herd mentality of try-hard competitive gamers really overblows that stuff. Practice and talent are so much more important than any tech advantage -- except perhaps 20+ ms of input lag in fast shooters or *extreme* motion blur, which really would instantly make people play far worse than they normally could.
This is so true. But there is another side which is hardly mentioned or discussed anywhere: there is a BIG difference between playing on LAN at the highest/pro level and playing over the internet (at any skill level). People put too much focus on what the pros are doing, what gaming gear they have, and what settings they use, when in reality it doesn't matter, not in the slightest. Skill is the most important factor when the playing field is the same and all things are equal (LAN, same monitors, same hardware, and virtually identical gaming gear). The world the pros operate in and the world where casual (including competitive) players operate are two completely different beasts.

When playing on LAN (pro level), what matters most is the actual level of performance a player has on a given day and hour, and that varies greatly -- which is why you see better and worse matches, and better and worse performances, across a LAN event (a few days of matches). You also need to combine it with what happens during games/matches: timing, decision making, your teammates' actions (which are ultimately beyond your control yet can still affect your performance). All of it matters in the grand scheme of things.

Now, LAN vs the internet -- this is where things are vastly misunderstood and underestimated. Internet performance varies all the time, at least as far as gaming packets (UDP) are concerned. Your gaming can be affected negatively by network congestion and network performance fluctuations even if your ping hasn't changed. I'm sure everyone has heard about hit-registration problems and input-lag-like problems in FPS games; all of that is a direct result of network performance fluctuations. This is why people sometimes experience hit-reg / input-lag problems, and then a few hours later their performance is much better and more consistent because those problems seem to have disappeared.

When playing over the internet, apart from the player's skill, network performance is the single most important factor in FPS gaming -- so important that everything else becomes almost irrelevant. And by everything else I mean mouse, monitor, playing at 60fps@60Hz, having a low-end PC (low fps), tearing, stutter, motion blur. None of it really matters if the player has great skill and great network performance (low ping and perfect or near-perfect hit reg). Examples:

1) A highly skilled player will play at the same level even with 60fps@60Hz, a low-end PC (low fps), and a random mouse, as long as he has low ping and great hit reg. Your eyes can adjust to 60fps@60Hz (even after switching from 144fps@144Hz), and you can get used to a random mouse, tearing, motion blur and stutter; in the end it will not affect your performance to a significant degree as long as you have low ping and great hit reg.

2) A highly skilled player will not be able to play at the same level when he has really bad network performance (bad hit registration / input-lag-like problems), even with the best hardware in the world at his disposal. Bad network performance can skill-cap you so badly that you can sometimes go from being elite/highly skilled to seeming merely average (when gaming over the internet). Bad network performance may render gaming unplayable regardless of your hardware; it can make gaming unenjoyable and very frustrating, and you may feel handicapped.

A few things to understand: network performance and UDP packet handling are not constant; they change all the time. Every case is different -- some people are affected by congestion and performance fluctuations more than others, while some enjoy perfect or near-perfect network performance with no ill effects on their gaming. So in the end, pros and casual/competitive players live in different worlds, and the same goes for playing over the internet: some people may seem better than they actually are, and some may seem less skilled, purely because of inconsistent network performance.
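
A crude sketch of the "same ping, different experience" point -- this models server snapshots arriving at a client that buffers roughly two updates for interpolation, with one-way delay drawn from a clipped Gaussian. All constants (64Hz snapshot rate, buffer size, jitter values) are hypothetical, not taken from any real game:

Code:

import random

SNAPSHOT_HZ = 64                              # hypothetical server update rate
INTERP_BUFFER_MS = 2 * 1000 / SNAPSHOT_HZ     # ~31 ms of interpolation buffer
TRIALS = 200_000

def late_snapshot_rate(avg_ping_ms, jitter_sd_ms):
    """Fraction of snapshots arriving too late for smooth interpolation."""
    late = 0
    for _ in range(TRIALS):
        one_way = max(0.0, random.gauss(avg_ping_ms / 2, jitter_sd_ms))
        # Late if it misses its play-out slot: nominal delay + interp buffer.
        if one_way > avg_ping_ms / 2 + INTERP_BUFFER_MS:
            late += 1
    return late / TRIALS

for jitter in (2, 10, 25):
    print(f"avg ping 30 ms, jitter sd {jitter:2d} ms -> "
          f"{late_snapshot_rate(30, jitter):.2%} of snapshots arrive too late")
# Average ping is identical in all three cases; only the heavy-jitter case
# pushes a meaningful share of updates past the interpolation window.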

All of these factors escape the public eye and public understanding. In the end, gaming gear has its place and its usefulness, but there are other things that are far more important when we talk about optimal performance in FPS games.
Chief Blur Buster wrote:Around here, milliseconds matter -- no matter how silly it sounds, we respect the unexpected -- so I don't name-call those who fuss about the millisecond.

...
It all depends on the context. Everyone seeks perfection, so in reality every improvement (even a marginal one) matters. For example, the increased perceived smoothness from G-SYNC and/or higher fps/Hz may not give you a tangible difference in performance and results in the game (ignoring the lag difference for a second...), but it can definitely make you feel better and make gaming more enjoyable and more 'real', without certain artifacts.

When it comes to gaming, everyone wants the best possible experience and performance, including every advantage they can get over others. So in that context, every 1ms matters. Everyone also wants to know they've done everything they could to optimize things that, in the end, may or may not affect their in-game results. It is always better to have, for example, 20ms of input lag than 25ms, even if you can't feel it (or even A/B it).

Even small or marginal improvements in tech are still progress, and in the end that's what matters.
RealNC wrote:Opinions are split. Personally, today I consider 1 or 2 milliseconds completely inconsequential, even at the highest possible level of competition. All the "finish line" instances are so rare as to not matter. I'd still be interested in a scientific analysis of how often the "finish line effect" occurs, though. I suspect the answer is "virtually never", but that's just me.
You are entirely correct. There are just too many factors and variables in play (human-related, game/engine-related, internet-related) for 1 or 2 milliseconds to matter. But that doesn't mean we shouldn't pay attention, even when talking about a few milliseconds of difference. In the end, faster = better, and as I mentioned above, every little improvement is still progress.
KindOldRaven wrote: Every single day I see people tell others to avoid Gsync/Freesync because of the 'insane amount of input lag it induces, completely unusable for competitive PUBG' and stuff like that... :roll: Ah well, I'm very happy to have found this community :D
Well, without trying to insult anyone: PUBG is the definition of mediocrity... skill-wise. Words like "competitive" and "pro" shouldn't even be used when talking about this game, because that's an insult to other games where actual skill matters most. And even leaving player skill aside, there is so much wrong with this game, including the engine/network code. There's also the gameplay itself, where the playing field is never equal and luck matters too, based on the loot you find (plane drops as well), plus the energy field that might work in your favor or in favor of other teams and players.

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 14:40
by Chief Blur Buster
Fun discussion.

Certainly environmental factors (e.g. network, game tick rates, etc.) will statistically overwhelm the millisecond or two, and skill differences will massively overwhelm it too. It's statistically hard to pull the signal out of the noise without specialized tests -- but the right framing is "we didn't know the millisecond was a factor, even if only occasionally throughout the day" (agree), not "the millisecond rarely ever has an effect" (totally disagree).

With reaction-time spreads of probably under 40ms between the slowest- and fastest-reacting CS:GO players, 4ms is a massive 10% of the reaction-time spread of an elite team (say the fastest player's reaction time is "X ms" and the slowest player's is "X + 40 ms", hypothetically -- the spread might be tighter or wider).

Even a 2ms difference gives you roughly a 25% chance that your reaction rounds off to an earlier tick on a 128-tick CS:GO server (1/128 second = ~8ms, and 2ms is 25% of that). A 4ms reduction in equipment lag makes it pretty much a coin flip (50-50) that your reaction rounds off to an earlier CS:GO tick (8ms granularity), where your action gets treated as the first reaction. And even though 4ms is below the 128-tick interval, it still matters at tick-roundoff granularity, since game engines can't predictively compensate for lower monitor lag (and other equipment) to automatically handicap and level the playing field.
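
A tiny simulation of that roundoff argument, under the simplifying assumption that your action would otherwise land at a uniformly random offset inside an ~8ms tick interval:

Code:

import random

TICK_MS = 1000 / 128        # ~7.8 ms between ticks on a 128-tick server
TRIALS = 200_000

def earlier_tick_probability(lag_saving_ms):
    """Chance that shaving lag_saving_ms off total latency lands the same
    action in an earlier server tick."""
    earlier = 0
    for _ in range(TRIALS):
        offset = random.uniform(0, TICK_MS)   # arrival offset within a tick
        if offset < lag_saving_ms:            # crosses the tick boundary
            earlier += 1
    return earlier / TRIALS

for saving in (2, 4):
    print(f"{saving} ms less lag -> earlier tick in "
          f"{earlier_tick_probability(saving):.0%} of cases")
# Roughly 26% for 2 ms and 51% for 4 ms -- matching the ~25% and coin-flip
# figures above.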

Now, it is logical to surmise that environment and tactics determine how tight the reaction spread needs to be before milliseconds matter. In a game of alleyways, corners, and doors, simultaneous-see events are quite common, creating many separate tight reaction races within a single FPS match, so "sprints to the finish" (see-simultaneously-and-react events) happen far more often in certain games. It doesn't look like both players reacted simultaneously -- only one is instantly fragged, and nobody thinks twice about whether it was ever a simultaneous attempt to shoot each other. But when a door opens or you turn a corner, it literally is a simultaneous see-each-other: both of your reaction times are triggered at the same instant in those particular situations, in the FPS games that force more of those simultaneous-see events.

It happens less often in arena games (wide open spaces) than in corridors-corners-doors-alleyways games (more sudden simultaneous-see events forcing simultaneous reaction-time attempts), so it's also hugely game-dependent, and you can't just pull amateurs off the street for a reaction-time study (their big spread of reaction times is not representative of high-end eSports venues, LAN environments, and highly similar equipment). Sure, it's probably far below "1%" of situations where milliseconds start to matter much more frequently in a specific venue. And yes, as a researcher you have to work hard to recruit the right players for studies, especially if they're chasing professional venues rather than doing free volunteer research (unless incentivized), and you have to study the specific gaming tactics where milliseconds matter more than in others. Previous studies have been weak at targeting the specific convergences that laser-focus the tight spreads (trained pros + close-quarter-surprise maps). And you've got to punch through false claims and FUD noise to get to the real signal. Not easy. But regardless, you get the picture of an open research mind. Many surprises still lurk...

There are certainly times where it makes no bleep of a difference, far below the noise floor. No disagreement. And too much unnecessary end-user focus on the millisecond? Sure, no disagreement there either. And a great many of us willingly trade a few milliseconds for other advantages like strobing, image quality, improved VRR, and other things, and sometimes that indeed improves our game far more than the small loss.

What I take issue is assumptions of "milliseconds rarely matter EVER" to all kinds of gaming tactics, for each one, for every single one of them, universally. Sometimes it's hiding for decades in plain sight (like display persistence) but was not more widely well understood until more recently (and only barely, thanks to Blur Busters and TestUFO!). One needs to recognize that there appears to be some specific gaming tactics where it seems far more critical; it require a research understanding of which type of situations where the tight-reaction-spreads occurs, in which kinds of games, for which kinds of gaming tactics...

So, on the 'outside factors' and crappy netcode:

Battle(non)sense (who has written a guest article at BlurBusters.com) runs a YouTube channel that analyzes netcode-related latencies and such. Subscribe to him to help lift all the boats above the crappy netcode! Shovelfuls and shovelfuls of hundred-millisecond-latency manure need to go. Free fertilizer for your garden.

Either way, great discussion (but please still keep mucho respect for the millisecond!).

Re: Experience & Opinion: 240hz displays are blurry

Posted: 16 Apr 2018, 23:35
by GammaLyrae
I wasn't happy with the first 240hz monitor I bought. I felt the overdrive was poorly tuned and that I was giving up too much in image quality in order to get the extra frames. I am firmly in the camp that is prepared to give up a few ms in response times for higher quality images that you can observe with the naked eye. I do think there's a balance to be struck, though. I already know I don't like the sluggish rise times of VA panels, even if they offer great contrast and color reproduction at a lower price than IPS panels.

I do wish that sites like tftcentral and rtings were able to review more monitors. Their reviews shed light on just how much the electronics driving a monitor matter. Take previous-generation models like the PG279Q and MG279Q: at a glance the only appreciable difference appears to be Gsync vs Freesync, but the panels themselves are different, as are the surrounding electronics. The MG279Q has measurably slower response times with more overshoot artifacts compared to the PG279Q. Their tests break past the marketing barrier and offer a peek behind the engineering curtain, which is incredibly valuable information to have -- especially when it's time to drop hundreds of dollars on something you want to be happy looking at and using for years to come.